Lead impactful data engineering projects in a dynamic, fast-paced environment. Develop scalable solutions using cutting-edge technologies like Snowflake and Airflow. Collaborate with talented professionals to drive innovation and efficiency.
Data Engineer
in Information Technology · Permanent
Job Description
Overview
- Develop and maintain scalable data pipelines and workflows supporting investment, research, and operational teams.
- Design and optimize data models and analytical datasets in Snowflake for business intelligence.
- Implement monitoring and alerting systems to ensure data quality and reliability.
- Collaborate with cross-functional teams to integrate data solutions across the organization.
- Contribute to modernizing legacy systems with current data technologies.
- Participate in code reviews and uphold engineering standards within the team.
- Support the adoption of containerization and cloud-native solutions for scalability.
- Engage in on-call rotations to maintain system reliability and performance.
- Actively identify and automate manual processes to enhance operational efficiency.
Key Responsibilities & Duties
- Develop and maintain data pipelines for ingesting, transforming, and delivering critical data.
- Create and manage data models in dbt, ensuring documentation and lineage tracking.
- Optimize analytical datasets in Snowflake to support reporting needs.
- Write and maintain Airflow DAGs and scripts using Python and SQL.
- Collaborate with engineering and analytics teams to deliver integrated solutions.
- Implement monitoring with tools such as Datadog to ensure data pipeline reliability.
- Translate business logic from legacy systems into modern data models.
- Contribute to the adoption of containerization and orchestration technologies.
- Participate in code reviews to maintain high-quality engineering practices.
Job Requirements
- Bachelor of Science (BS) degree in a relevant field.
- 3-5+ years of experience in data engineering roles.
- Proficiency in Python and SQL for production-level coding.
- Hands-on experience with Airflow, Snowflake, and SQL Server.
- Knowledge of dimensional modeling and data warehouse design patterns.
- Experience with dbt, Docker, Terraform, Azure, GitHub Actions, and Datadog preferred.
- Ability to work autonomously and proactively solve problems.
- Strong communication skills to explain technical concepts to non-technical stakeholders.
- Prior experience in wealth management or investment operations is advantageous.