At iPASS, we leverage transit and e-payment data to drive our business. As one of the few companies powering Taiwan's main transportation networks, iPASS is uniquely positioned in the market as a cornerstone of the e-payment ecosystem.
Your role is crucial in helping the team investigate data and deliver analytical reports that support business decisions. You will also be instrumental in developing flexible dashboards and APIs for third parties. Join us as we write the next chapter of the transit and e-payment ecosystem.
Responsibilities
• Build and scale data orchestration services that support complex analysis across iPASS.
• Design and implement tooling for access management, monitoring, data controls, and self-service ETL creation.
• Review the data modeling practices of other teams.
Requirements
• 4+ years of experience in the data warehouse space.
• Strong experience with big data tools: Spark, Kafka, Hadoop, Druid, etc.
• Able to write clean, maintainable code in a production environment.
• Strong skills in Python and SQL.
• Experience with container technologies such as Docker.
• Experience with distributed systems and designing APIs.
• Knowledge of standard engineering practices, such as unit testing and code reviews.
Nice To Have
• Experience with modern development practices such as CI/CD.
• Experience with cloud services.