Role Description:
We are seeking a motivated and detail-oriented Data Engineer to join our growing data team. In this role, you will be instrumental in building and maintaining our data infrastructure, ensuring efficient and reliable data flow for analysis and reporting. You will collaborate closely with data scientists, analysts, and other engineers to develop and implement data pipelines, optimize data storage, and contribute to the overall data strategy. This is an excellent opportunity for an early-career to mid-level professional to gain hands-on experience in a dynamic, data-driven environment.

Responsibilities:

Data Pipeline Development and Maintenance:
- Design, develop, and maintain robust and scalable data pipelines using Python and Airflow.
- Extract, transform, and load (ETL/ELT) data from various sources into our data warehouse.
- Monitor and troubleshoot data pipeline issues, ensuring data integrity and reliability.

Data Storage and Management:
- Assist in the design and implementation of efficient data storage solutions on AWS.
- Optimize database performance and contribute to data governance practices.
- Maintain data schemas, data dictionaries, and documentation.

Cloud Infrastructure (AWS):
- Utilize AWS services such as S3, EMR, Redshift, and EC2 for data storage and processing.
- Participate in the deployment and management of cloud-based data infrastructure.
- Monitor and optimize cloud resource usage for cost-effectiveness and performance.

Collaboration and Communication:
- Work closely with data scientists, analysts, and engineers to understand data requirements.
- Collaborate cross-functionally to deliver reliable data solutions for analytics and product needs.
- Document processes and communicate progress clearly to stakeholders.

Required Skills:

Programming & Data Engineering:
- Proficiency in Python, with a strong command of data manipulation libraries such as Pandas and NumPy.
- Solid understanding of Hadoop and PySpark, with experience building distributed data processing workflows.
- Experience with Apache Airflow for orchestrating and managing data workflows.

Cloud & Data Systems:
- Familiarity with Amazon Web Services (AWS), especially S3, EMR, Redshift, and EC2.
- Understanding of ETL processes, relational databases, and data warehousing concepts.
- Experience managing large-scale datasets and optimizing performance in distributed environments.

Soft Skills:
- Strong problem-solving and analytical thinking.
- Excellent communication and collaboration skills.
- Ability to learn quickly and adapt to evolving technologies.
- High attention to detail and a structured, process-oriented mindset.

Preferred Qualifications (Nice to Have):
- Bachelor's degree in Computer Science, Data Science, or a related technical field.
- Hands-on experience with dbt for building and maintaining data transformation models.
- Experience working on LLM (Large Language Model) or AI-related projects, such as model integration, fine-tuning, or deploying AI-powered features in production.
- Familiarity with data modeling best practices and version control (e.g., Git).
- Knowledge of data visualization tools such as Power BI or similar platforms.
Salary: Negotiable
(Regular monthly salary of NT$40,000 or above)
ViewSonic designs learning programs that support both organizational and individual development, enabling employees to fully realize their potential and professional interests. An open organizational culture and a free, friendly working atmosphere help every employee pursue their passions and find the stage that suits them best.

■ Competitive compensation
The company offers a market-competitive salary structure and readily shares the rewards of strong business performance with employees.
Full-time employees enjoy:
- 14 months of salary per year
- Profit-linked employee performance bonuses

■ Flexible attendance policy:
- Flexible start time between 8:00 a.m. and 10:00 a.m., so you can arrange your working hours around your own routine.
- 7 days of paid annual leave per year
- 5 days of fully paid sick leave

■ Employee Welfare Committee activities:
- Annual company-trip subsidies, wedding and funeral allowances, holiday bonuses, partner-store discounts, craft workshops, charity blood drives, and other activities that strengthen bonds among colleagues.
- NT$10 vending machines: cookies, snacks, bread, and drinks, all for just NT$10.

■ Diverse club activities
Yoga, hiking, basketball, badminton, and table tennis clubs, plus many more waiting for you to discover or start yourself.

■ Comfortable work environment
A spacious, bright, thoughtfully designed workspace where you can use more than one monitor and enjoy true multi-screen freedom.

■ Employee gym:
Skip the gym membership fee: there is a gym right downstairs, fully equipped for weight training, cardio, and recovery.

■ Education and training:
Language-learning subsidies, certification subsidies, and leadership development courses, along with occasional learning seminars to keep you up to date with industry trends.

■ A distinctive arts-enriched environment:
Beyond offices adorned with works by renowned artists, the company periodically hosts themed exhibitions by well-known painters, cultivating employees' artistic sensibility and aesthetic awareness and encouraging creativity inspired by the artistic atmosphere.

■ Employee purchase discounts:
Employee pricing is available on all company products; please inquire for details.