Lead impactful data engineering projects in a hybrid work environment. Shape next-generation analytics platforms using Microsoft Fabric and Delta Lakehouse. Enjoy professional growth opportunities in a dynamic role.
Lead Data Engineer
Information Technology | Permanent
Job Description
Overview
- Contribute to the development of a cutting-edge data analytics platform leveraging Microsoft Fabric and Delta Lakehouse technologies.
- Utilize expertise in PySpark and Medallion Architecture to optimize data pipelines and processing workflows.
- Collaborate with stakeholders to deliver scalable solutions for business intelligence and machine learning initiatives.
- Provide leadership and mentorship to a team of data engineers, fostering best practices.
- Ensure reliability, efficiency, and high performance of data ingestion and transformation processes.
- Engage in continuous improvement of data engineering practices, including governance and observability.
- Work in a hybrid environment with opportunities for professional growth and development.
Key Responsibilities & Duties
- Design and develop scalable ETL pipelines using PySpark and Microsoft Fabric technologies (a brief sketch of this kind of work follows this list).
- Architect and optimize Medallion Architecture layers within a Delta Lakehouse environment.
- Implement data quality, governance, and lineage tracking best practices.
- Collaborate with cross-functional teams to support analytics and machine learning projects.
- Lead and mentor junior engineers, promoting high standards in code quality and deployment.
- Integrate batch and streaming data workflows for enhanced operational efficiency.
- Utilize automation and CI/CD practices to streamline data pipeline deployments.
- Monitor and troubleshoot data pipelines to ensure optimal performance and availability.
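To make the pipeline work above concrete, here is a minimal, hedged PySpark sketch of a Medallion bronze-to-silver step on Delta Lake. The table paths and column names are hypothetical, and Fabric-specific configuration is omitted; this is an illustration of the kind of work the role involves, not a prescribed implementation.

```python
# Minimal sketch of a Medallion bronze -> silver step with PySpark and Delta Lake.
# Table paths and column names are hypothetical; Fabric-specific setup is omitted.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder.appName("medallion-bronze-to-silver")
    # Delta Lake extensions; in Microsoft Fabric these come preconfigured.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Bronze layer: raw events landed as-is.
bronze = spark.read.format("delta").load("Tables/bronze_events")

# Silver layer: deduplicated, typed, and filtered records.
silver = (
    bronze.dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("event_id").isNotNull())
)

# Full overwrite for simplicity; an incremental MERGE would be typical in production.
silver.write.format("delta").mode("overwrite").save("Tables/silver_events")
```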
Job Requirements
- A Bachelor of Science degree in a relevant field is required.
- A minimum of 5 years of experience in data engineering; 10 years preferred.
- Proficiency in PySpark, Delta Lake, and Medallion Architecture.
- Experience with Microsoft Fabric and Azure cloud technologies.
- Strong SQL skills and familiarity with database performance tuning.
- Knowledge of data testing frameworks such as Great Expectations or dbt (a minimal quality-check sketch follows this list).
- Expertise in CI/CD practices for data pipeline deployments.
- Preferred experience with streaming technologies and event-driven architectures.
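As a hedged illustration of the data-testing requirement, the sketch below hand-rolls the kind of quality gate that frameworks such as Great Expectations or dbt express declaratively. The table path, columns, and thresholds are assumptions made for illustration only.

```python
# Minimal sketch of a data-quality gate in PySpark; frameworks like Great
# Expectations or dbt formalize the same checks declaratively.
# Table path, columns, and thresholds are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("silver-quality-checks").getOrCreate()
silver = spark.read.format("delta").load("Tables/silver_events")

total = silver.count()
null_ids = silver.filter(F.col("event_id").isNull()).count()
duplicates = total - silver.select("event_id").distinct().count()

# Fail the pipeline run if any expectation is violated.
assert total > 0, "silver_events is empty"
assert null_ids == 0, f"{null_ids} rows have a null event_id"
assert duplicates == 0, f"{duplicates} duplicate event_id values"
```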