This vacancy is open for talent-pool collection. We will contact you if a suitable vacancy matches your profile.
Job Mission
Represent manufacturing and act as the gatekeeper between manufacturing and the D&E function
Add value in overall manufacturing processes such as forming, machining, joining, and assembling
Job Description
Contribute to fault resolution and take the necessary initiatives and practical decisions to ensure zero repeats
Identify gaps, drive assigned process-improvement projects, and ensure their successful delivery
Initiate and drive new procedure changes and projects
Develop and maintain networks across several functional stakeholders
Prioritize work and projects based on the business situation
Transfer knowledge and train colleagues on existing and newly introduced products
Education
Master's degree in a technical domain (e.g. electrical engineering, mechanical engineering, mechatronics)
Experience
3-5 years of working experience in design engineering
Personal skills
Take responsibility for the results of your work
Show a proactive attitude and willingness to take initiative
Drive for continuous improvement
Able to think outside of standard processes
Able to work independently
Able to collaborate with different functional stakeholders
Able to demonstrate leadership skills
Able to work in a multi-disciplinary team within a high-tech (proto) environment
Able to think and act within general policies across department levels
Diversity and inclusion
ASML is an Equal Opportunity Employer that values and respects the importance of a diverse and inclusive workforce. It is the policy of the company to recruit, hire, train and promote persons in all job titles without regard to race, color, religion, sex, age, national origin, veteran status, disability, sexual orientation, or gender identity. We recognize that diversity and inclusion is a driving force in the success of our company.
Need to know more about applying for a job at ASML? Read our frequently asked questions.
[Job Description]
• Develop and maintain a web-based system for financial data processing.
• Implement P&L calculations, exposure tracking, and performance reporting over user-defined periods.
• Integrate market data and indicators or signals from the quant team to support trading strategies.
• Process and analyze external financial data sources/APIs (e.g., Bloomberg).
• Ensure data accuracy, system stability, and performance optimization.
• Maintain and implement machine learning models.
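To give candidates a concrete feel for the P&L calculations mentioned above, here is a minimal, illustrative sketch using the average-cost method. All names and figures are invented for the example and do not describe the actual system:

```python
from dataclasses import dataclass

@dataclass
class Trade:
    side: str    # "buy" or "sell"
    qty: float
    price: float

def pnl(trades: list[Trade], mark_price: float) -> tuple[float, float]:
    """Return (realized, unrealized) P&L using the average-cost method."""
    position = 0.0   # open quantity
    avg_cost = 0.0   # average entry price of the open position
    realized = 0.0
    for t in trades:
        if t.side == "buy":
            # blend the new lot into the average cost of the open position
            avg_cost = (avg_cost * position + t.price * t.qty) / (position + t.qty)
            position += t.qty
        else:
            # a sell realizes P&L against the average cost
            realized += (t.price - avg_cost) * t.qty
            position -= t.qty
    unrealized = (mark_price - avg_cost) * position
    return realized, unrealized
```

For instance, buying 10 units at 100 and 10 at 110 gives an average cost of 105; selling 5 at 120 then realizes 75, and a mark price of 115 leaves 150 unrealized on the remaining 15 units.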
[Job Requirements]
• Strong experience in backend development (Python preferred; Node.js or other relevant languages also considered).
• Proficiency in working with databases (SQL preferred; NoSQL also considered).
• Proficiency in creating and maintaining databases is a plus.
• Basic knowledge of frontend technologies (HTML, CSS, JavaScript) to support system integration.
• Familiarity with financial markets, large datasets, and machine learning.
• Experience in market data processing and familiarity with financial instruments is a plus.
• Proficiency in written and spoken English and Chinese.
• Able to work under pressure on ad hoc projects.
*This position requires an on-site test and is not remote. Please ensure you are able to attend the test in person before submitting your resume.
*Please provide an English resume.*
Job Description:
• Designing and maintaining a cloud-based data warehouse, including data collection, modeling, and storage.
• Maintaining batch and streaming pipelines, ensuring data quality.
• Developing data APIs based on product requirements and deploying them to Kubernetes using GitLab CI/CD.
• Understanding user needs and handling data retrieval and dashboard support tasks.
• Continuously learning, optimizing data architectures, and introducing new technologies.
- Set up and execute extract, transform, and load (ETL) functions to build data pipelines.
- Extract and analyze large data sets from MySQL.
- Tune the performance of current and newly added database queries, ensuring database resources are used efficiently.
- Deliver clear analysis and reporting of core business metrics to stakeholders.
- Create and manage reports and dashboards.
- Manage data.
- Enhance and optimize existing reporting processes.
- Provide ad hoc analysis and reporting to clients and stakeholders.
- Aid in reconfiguring the existing architecture and database structure to address stakeholders' evolving needs.
- Perform daily maintenance and monitoring of all BI-related databases and dashboards, including proficient handling of emergencies.
- Provide actionable insights to drive the growth of core products.
Our Stack
MySQL (Must)
Python (Good to have)
Airflow (Good to have)
AWS (Good to have)
Metabase (Good to have)
Redshift (Good to have)
Linux (Good to have)
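As a rough illustration of the ETL and reporting work described above, here is a minimal, hypothetical pipeline step. It uses Python's built-in sqlite3 as a stand-in for MySQL so the sketch is self-contained, and the `orders` and `daily_revenue` table names are invented for the example (note that `INSERT OR REPLACE` is SQLite syntax; MySQL would use `REPLACE INTO` or `ON DUPLICATE KEY UPDATE`):

```python
import sqlite3  # stand-in for MySQL; a production job would use a MySQL driver

def load_daily_revenue(conn: sqlite3.Connection) -> None:
    """Extract raw orders, aggregate revenue per day, load into a reporting table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS daily_revenue (day TEXT PRIMARY KEY, revenue REAL)"
    )
    conn.execute(
        """
        INSERT OR REPLACE INTO daily_revenue (day, revenue)
        SELECT date(created_at) AS day, SUM(amount)
        FROM orders
        GROUP BY date(created_at)
        """
    )
    conn.commit()
```

The same extract-aggregate-load shape underlies most of the reporting tasks listed above; dashboards in a tool like Metabase would then read from the `daily_revenue` table rather than scanning raw orders.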
【Job Responsibilities】
・Own the design, scalability, and reliability of the company’s data platform and pipelines
・Define and implement best practices for data modeling, orchestration, and data warehouse architecture.
・Work closely with the Product, Data Science, and Infra teams to define data strategy.
・Initiate the adoption of MLOps practices: design and set up the framework for deploying, monitoring, and scaling ML/DL models in production.
【Skills】
■ Must-have
・Proficient in SQL and Python.
・Proven experience in designing and maintaining large-scale data pipelines and data warehouses (GCP preferred).
・Strong knowledge of schema design, structured/unstructured data handling, and performance optimization.
・Solid understanding of CI/CD and version control (Git).
・Familiarity with Unix/Linux environments.
・Experience with workflow orchestration tools (e.g., Airflow, dbt, Prefect).
・Experience in defining architecture and technical direction for data platforms.
■ Nice-to-have
・Hands-on experience with MLOps practices and tools (e.g., MLflow, Kubeflow).
・Exposure to real-time/streaming data pipelines (e.g., Kafka, Pub/Sub).
・Experience with Kubernetes for scalable data and ML workloads.
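The orchestration tools named above (Airflow, dbt, Prefect) all reduce to one core idea: running tasks in dependency order over a DAG. A toy sketch of that idea using only the standard library; the task graph here is made up for illustration and is not any real pipeline:

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

def run_pipeline(tasks, deps):
    """tasks: name -> callable(results); deps: name -> set of upstream task names.

    Executes each task only after all of its dependencies, like a
    minimal DAG scheduler, and collects every task's return value.
    """
    results = {}
    for name in TopologicalSorter(deps).static_order():
        results[name] = tasks[name](results)
    return results

# A hypothetical three-step pipeline: extract -> transform -> load.
tasks = {
    "extract":   lambda r: [1, 2, 3],                     # pull raw records
    "transform": lambda r: [x * 10 for x in r["extract"]],
    "load":      lambda r: sum(r["transform"]),           # pretend "load" returns a row count
}
deps = {"transform": {"extract"}, "load": {"transform"}}
```

Calling `run_pipeline(tasks, deps)["load"]` yields 60. Real orchestrators add scheduling, retries, and backfills on top of exactly this ordering.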
About the Opportunity:
We're seeking a highly motivated and experienced Data Engineer to join our growing team at BTSE. You will play a critical role in designing, implementing, and maintaining our data infrastructure, with a focus on cloud-based solutions. You will work closely with the Senior Data Engineer to ensure the efficient and reliable delivery of data to support various business needs.
Key Roles & Responsibilities:
●Design, implement, and maintain data infrastructure, including cloud-based solutions (AWS preferred).
●Develop and optimize ETL pipelines for performance, scalability, and reliability.
●Collaborate with the Senior Data Engineer to ensure data quality and integrity.
●Explore options to upgrade and replace existing systems to improve efficiency and reduce manual effort.
●Proactively identify and resolve data-related issues.
●Stay up-to-date with the latest data engineering technologies and best practices.
Qualifications:
●Bachelor's degree in Computer Science, Information Technology, or a related field.
●3+ years of hands-on experience in cloud technology and data engineering.
●Working experience with Git and GitLab.
●Strong knowledge of database design principles, backup and recovery strategies, security best practices, and performance tuning techniques.
●Proficiency in database management systems such as MySQL, PostgreSQL, Oracle, or SQL Server.
●Solid experience with cloud-based database services (e.g., AWS RDS, Azure SQL Database) is preferred.
●Excellent analytical and problem-solving skills.
●Strong communication and interpersonal skills.
●(Mandatory) Able to hit the ground running independently on development tasks.