1. Designing, building, and maintaining scalable data pipelines and data processing systems.
2. Collecting, aggregating, and transforming large datasets from various sources into usable formats.
3. Developing and implementing data integration processes to consolidate data from multiple systems.
4. Implementing data quality and validation procedures to ensure accurate and reliable data.
5. Building and managing data marts or data lakes for storage and retrieval of data.
6. Developing and optimizing data models and schemas for efficient data storage and retrieval.
7. Collaborating with cross-functional teams (e.g., data analysts and IT engineers) to understand data needs and requirements.
8. Staying up-to-date with emerging data engineering technologies and trends.
9. Working with data to solve business problems, and building and maintaining the infrastructure needed to answer questions and improve processes.
10. Working closely with the data analyst and business teams to develop data models and pipelines that add value to our product offerings, customer experience, and retention.
At Allianz, we rely on insightful data to power our services and business. We are seeking an experienced pipeline-centric data engineer to put it to good use. The ideal candidate will have strong mathematical and statistical expertise, combined with rare curiosity and creativity. This person will wear many hats in the role, but much of the focus will be on building out our ETL processes, managing data pipelines, and deploying and monitoring data models. Beyond technical prowess, the data engineer will need the soft skills to clearly communicate highly complex data trends to business leaders. We're looking for someone willing to jump right in and help the company get the most from its data.