Data Engineer
Information Technology · Permanent

Drive impactful data projects, ensuring seamless integration and transformation. Collaborate with dynamic teams to deliver reliable data solutions. Enhance operational efficiency through innovative data engineering practices.
Job Description
Overview
- Design, build, and maintain scalable data pipelines for efficient data integration and transformation.
- Collaborate with cross-functional teams to deliver data projects aligned with business goals.
- Ensure data quality, integrity, and security across all processes and platforms.
- Develop and optimize data solutions using Python, SQL, and Windows batch scripting.
- Utilize Snowflake and SQL Server for data storage and processing workflows.
- Create insightful data visualizations and reports using Power BI.
- Automate and document repetitive data tasks to improve operational efficiency.
- Monitor and troubleshoot data pipelines to address performance and reliability issues.
- Support business decision-making through effective data management and analysis.
Key Responsibilities & Duties
- Design, implement, and maintain data pipelines using Airflow and other ETL/ELT frameworks.
- Optimize data storage and processing workflows in Snowflake and SQL Server.
- Develop SQL queries, views, and stored procedures for accurate data transformations.
- Create data visualizations and reports using Power BI to support business decision-making.
- Collaborate with stakeholders to gather requirements and translate them into technical specifications.
- Monitor and troubleshoot data pipelines, resolving performance and reliability issues.
- Automate and document repetitive tasks to enhance workflow efficiency.
- Lead and manage data projects to ensure timely delivery and alignment with objectives.
- Provide regular updates and insights to team members and stakeholders.
Job Requirements
- Bachelor of Science (BS) degree in a relevant field.
- Minimum of 1 year of experience in data engineering; 3 years preferred.
- Proficiency in Snowflake, SQL Server, Python, Airflow, and Windows batch scripting.
- Experience with ETL/ELT frameworks for designing and maintaining pipelines.
- Familiarity with data modeling, database design, and optimization for scalability.
- Ability to integrate and deliver data from multiple endpoints using APIs and SFTP.
- Strong analytical and problem-solving skills for technical issue resolution.
- Knowledge of credit union products, services, and regulations is a plus.
- Excellent communication skills for effective collaboration with teams and stakeholders.