Title: Senior Data & AI Engineer
Location: Phoenix, AZ (hybrid remote)
Type: 6-month contract-to-hire
Pay: $50-60/hr
We’re looking for a Senior Data & AI Engineer to lead the design and delivery of secure, scalable data and AI solutions within complex healthcare environments. The role focuses on building modern data platforms, integrating diverse clinical and claims datasets, and operationalizing machine learning models that reduce cost and improve quality and patient outcomes.
Your role
· Design, implement, and optimize data platforms using Snowflake and Microsoft Fabric, including Lakehouses, Warehouses, OneLake, and data engineering pipelines.
· Build and maintain scalable ingestion frameworks for batch and streaming data sources such as APIs, ADLS, SFTP, and event streams, with full lineage and governance.
· Develop secure data environments that meet HIPAA requirements for handling PHI, using role-based access controls, masking, tokenization, and de-identification.
· Create conceptual, logical, and physical data models using dimensional, normalized, and data vault approaches.
· Transform and normalize structured and unstructured healthcare data including claims, eligibility, enrollment, provider, and clinical documentation.
· Integrate and harmonize data using FHIR, HL7, X12/EDI 837/835, NCPDP, and CMS standards across payer, provider, EHR, and HIE systems.
· Build and deploy machine learning pipelines for risk modeling, utilization forecasting, fraud detection, quality measurement, and care gap analysis.
· Operationalize models with strong MLOps practices including versioning, CI/CD, monitoring, and drift detection.
· Implement data cataloging, metadata management, lineage tracking, and quality validation using tools such as Microsoft Purview or equivalent.
· Monitor and optimize pipeline performance, cost, and reliability across Snowflake and Fabric environments.
· Collaborate with clinicians, actuaries, product teams, and analysts to translate business needs into scalable technical solutions.
· Document architecture, data mappings, and design standards while mentoring engineers and contributing to enterprise best practices.
What you’ve got
· 8+ years of experience in data engineering or analytics with at least 5 years of hands-on Snowflake expertise including virtual warehouses, tasks, streams, Snowpipe, RBAC, masking, and data sharing.
· 2+ years of experience with Microsoft Fabric including OneLake, Lakehouses, Warehouses, Dataflows Gen2, Notebooks, and Pipelines.
· Advanced SQL skills with strong experience in ETL/ELT development using Python, dbt, Dataflows, or Fabric/ADF pipelines.
· Deep knowledge of healthcare data standards including CMS datasets, FHIR, HL7, X12/EDI, provider data, eligibility, and claims processing.
· Strong data modeling experience including dimensional modeling, slowly changing dimension (SCD) patterns, surrogate keys, 3NF, and data vault methodologies.
· Experience building and deploying machine learning solutions using tools such as scikit-learn, PyTorch, TensorFlow, Azure ML, or Fabric ML.
· Practical experience managing HIPAA compliance, PHI handling, auditing, and secure access controls within cloud data environments.
· Experience working with structured data formats such as Parquet and CSV as well as unstructured data such as clinical notes and PDFs.
· Strong communication skills, including the ability to produce mapping specifications and lineage documentation and to present technical trade-offs clearly.
· Preferred: Experience with Epic or Cerner integrations, HEDIS or risk adjustment programs, MLOps tools such as MLflow or GitHub Actions, Power BI semantic modeling, and relevant Snowflake or Microsoft certifications.
To find more great tech-centric jobs, please visit www.phoenixstaff.com.