Data Architect

Contract position in Information Technology

Job Detail

  • Experience Level: Senior
  • Degree Type: Bachelor of Science (BS)
  • Employment: Full Time
  • Working Type: Hybrid
  • Job Reference: 0000011904
  • Salary Type: Hourly
  • Industry: Financial Services; Investment Banking; Investment Management
  • Selling Points:

    Join as a Lead Azure Databricks Engineer to architect innovative Lakehouse solutions. Collaborate with teams to unify data platforms and ensure regulatory compliance. Enhance your expertise in Azure and Databricks technologies.

Job Description

Overview

  • Lead the design and optimization of Azure Databricks Lakehouse architectures in a regulated financial services environment.
  • Collaborate with cross-functional teams to unify siloed systems into cohesive data platforms.
  • Implement medallion architecture to ensure scalability, governance, and analytics agility (an illustrative sketch follows this list).
  • Configure and manage Unity Catalog for enhanced security and compliance.
  • Optimize Spark workloads for performance and cost efficiency in large-scale environments.
  • Support BI and analytics teams by delivering performant semantic layers.
  • Champion best practices for SDLC, CI/CD, testing, and DevOps in Azure ecosystems.
  • Ensure solutions meet regulatory requirements and deliver secure data environments.
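
The medallion architecture called out above is commonly realized as bronze (raw), silver (cleansed), and gold (curated) Delta tables. The PySpark sketch below is purely illustrative of that pattern; the catalog, schema, table, and path names (finance_lake, bronze.trades, and so on) are hypothetical placeholders, not part of this role's actual environment.

```python
# Illustrative medallion-style batch ingestion: bronze (raw) -> silver (cleansed).
# All catalog, schema, table, and path names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # Databricks notebooks provide `spark` already

# Bronze: land raw source files as-is to preserve source fidelity.
raw = spark.read.format("json").load("/Volumes/finance_lake/landing/trades/")
raw.write.format("delta").mode("append").saveAsTable("finance_lake.bronze.trades")

# Silver: deduplicate, enforce types, and apply basic data-quality filters.
silver = (
    spark.table("finance_lake.bronze.trades")
    .dropDuplicates(["trade_id"])
    .withColumn("trade_ts", F.to_timestamp("trade_ts"))
    .filter(F.col("notional") > 0)
)
silver.write.format("delta").mode("overwrite").saveAsTable("finance_lake.silver.trades")
```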

Key Responsibilities & Duties

  • Architect and maintain Azure-based Lakehouse solutions leveraging Databricks and Delta Lake.
  • Develop data ingestion frameworks for batch and streaming pipelines using PySpark and Spark SQL.
  • Implement row- and column-level security via Unity Catalog for compliance (see the sketch after this list).
  • Collaborate with teams to integrate systems into unified data platforms.
  • Optimize Spark workloads for performance and cost efficiency.
  • Support analytics teams by delivering governed semantic layers.
  • Ensure solutions meet regulatory requirements in financial services environments.
  • Promote best practices for development, testing, and DevOps in Azure Databricks.
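
For the Unity Catalog item above, Databricks exposes row filters and column masks as SQL functions attached to governed tables. The snippet below is a minimal sketch of that mechanism under assumed table, function, and group names; it is not a prescribed policy for this role.

```python
# Minimal sketch of Unity Catalog row-level security and column masking.
# Catalog, table, function, and group names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # Databricks notebooks provide `spark` already

# Row filter: admins see every row; other users only see desks whose
# account group they belong to.
spark.sql("""
    CREATE OR REPLACE FUNCTION finance_lake.governance.desk_filter(desk STRING)
    RETURNS BOOLEAN
    RETURN is_account_group_member('trading_admins') OR is_account_group_member(desk)
""")
spark.sql("""
    ALTER TABLE finance_lake.silver.trades
    SET ROW FILTER finance_lake.governance.desk_filter ON (desk)
""")

# Column mask: redact counterparty identifiers for non-privileged users.
spark.sql("""
    CREATE OR REPLACE FUNCTION finance_lake.governance.mask_counterparty(cp STRING)
    RETURNS STRING
    RETURN CASE WHEN is_account_group_member('trading_admins') THEN cp ELSE '***' END
""")
spark.sql("""
    ALTER TABLE finance_lake.silver.trades
    ALTER COLUMN counterparty SET MASK finance_lake.governance.mask_counterparty
""")
```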

Job Requirements

  • Bachelor of Science degree in a relevant field.
  • 7+ years in data engineering, with 3+ years focused on Azure Databricks.
  • Expertise in Azure data services such as Data Factory, SQL, and Fabric.
  • Hands-on experience with Unity Catalog for security and compliance.
  • Proficiency in Python, PySpark, Spark SQL, and T-SQL.
  • Experience in performance tuning and cost optimization of Spark workloads.
  • Strong knowledge of data governance and security protocols.
  • Comfortable working in regulated, high-stakes environments.