Data Engineer

in Information Technology
  • New York, NY
  • Salary: $125,000.00 - $130,000.00
Permanent

Job Detail

  • Experience Level: Mid Level
  • Degree Type: Bachelor of Science (BS)
  • Employment: Full Time
  • Working Type: On Site
  • Job Reference: 0000015482
  • Salary Type: Annually
  • Industry: Financial Services; Private Equity
  • Selling Points

    Lead impactful data engineering projects in a dynamic, fast-paced environment. Develop scalable solutions using cutting-edge technologies like Snowflake and Airflow. Collaborate with talented professionals to drive innovation and efficiency.

Job Description

Overview

  • Develop and maintain scalable data pipelines and workflows supporting investment, research, and operational teams.
  • Design and optimize data models and analytical datasets in Snowflake for business intelligence.
  • Implement monitoring and alerting systems to ensure data quality and reliability.
  • Collaborate with cross-functional teams to integrate data solutions across the organization.
  • Contribute to the modernization of legacy systems using advanced technologies.
  • Participate in code reviews and uphold engineering standards within the team.
  • Support the adoption of containerization and cloud-native solutions for scalability.
  • Engage in on-call rotations to maintain system reliability and performance.
  • Actively identify and automate manual processes to enhance operational efficiency.

Key Responsibilities & Duties

  • Develop and maintain data pipelines for ingesting, transforming, and delivering critical data.
  • Create and manage data models in dbt, ensuring documentation and lineage tracking.
  • Optimize analytical datasets in Snowflake to support reporting needs.
  • Write and maintain Airflow DAGs and scripts using Python and SQL.
  • Collaborate with engineering and analytics teams to deliver integrated solutions.
  • Implement monitoring tools like Datadog to ensure data pipeline reliability.
  • Translate business logic from legacy systems into modern data models.
  • Contribute to the adoption of containerization and orchestration technologies.
  • Participate in code reviews to maintain high-quality engineering practices.

Job Requirements

  • Bachelor of Science (BS) degree in a relevant field.
  • 3–5+ years of experience in data engineering roles.
  • Proficiency in Python and SQL for production-level coding.
  • Hands-on experience with Airflow, Snowflake, and SQL Server.
  • Knowledge of dimensional modeling and data warehouse design patterns.
  • Experience with dbt, Docker, Terraform, Azure, GitHub Actions, and Datadog preferred.
  • Ability to work autonomously and proactively solve problems.
  • Strong communication skills to explain technical concepts to non-technical stakeholders.
  • Prior experience in wealth management or investment operations is advantageous.