Drive innovation in data engineering by building scalable, reliable infrastructure. Collaborate with cross-functional teams to deliver impactful data solutions. Enhance your expertise in cloud platforms and modern data tools.

DATA ENGINEER
in Information Technology · Permanent
Job Description
Overview
- Contribute to building scalable and reliable data infrastructure supporting analytical and machine learning initiatives.
- Develop robust ETL/ELT pipelines for diverse data sources ensuring data integrity and availability.
- Apply software engineering principles to data practices, including version control and CI/CD pipelines.
- Architect optimized data models and solutions for efficient data analysis and reporting.
- Enhance cloud data platforms using AWS or Azure services for improved performance and scalability.
- Implement observability measures such as logging, monitoring, and alerting for proactive issue resolution.
- Mentor team members on best practices in data engineering and software development.
- Stay updated with emerging trends in data management and technologies.
Key Responsibilities & Duties
- Design, develop, and maintain scalable data pipelines for diverse data sources.
- Implement software engineering practices in data workflows, including testing and infrastructure automation.
- Architect and optimize data models for performance and usability.
- Manage and improve cloud data platforms, ensuring operational excellence.
- Instrument data pipelines with monitoring tools for proactive issue identification.
- Collaborate with cross-functional teams to deliver reliable data solutions.
- Mentor team members on data engineering and software development practices.
- Contribute to continuous learning and improvement within the team.
Job Requirements
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- Minimum of 3 years of experience in data engineering; 5 years preferred.
- Proficiency in SQL and Python for data processing and pipeline development.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Hands-on experience with data orchestration tools like Airflow or Prefect.
- Knowledge of Infrastructure as Code tools like Terraform or CloudFormation.
- Strong communication skills with both technical and non-technical stakeholders.
- Self-starter with a collaborative mindset and continuous learning attitude.