Overview
JOB SUMMARY
The Central Health System's Data Engineer is responsible for designing, developing, and maintaining scalable data systems that support enterprise reporting, analytics, data science, AI, and operational workflows. This role requires deep technical expertise in cloud-native data engineering, including pipeline development, data modeling, and platform optimization within Central Health's Snowflake and Azure environment. The Data Engineer is accountable for automating the import and export of enterprise data, ensuring data integrity across all internal and external systems, and building reliable, well-documented data infrastructure that cross-functional teams can depend on.
The Data Engineer collaborates closely with Data Architects, Engineering leadership, and all internal and external enterprise partners to establish and uphold best practices in data engineering and governance. This role works in close coordination with the Data Integration team, maintaining awareness of integration workflows and serving as a technical liaison where data engineering and integration intersect. The Data Engineer also contributes to the broader evolution of Central Health's data capabilities, including emerging work in analytics engineering and applied AI, and serves as a key technical contributor across the Data Engineering and Integration Services team.
Responsibilities
Essential Functions
Data Pipeline & Engineering:
• Design, develop, test, and maintain scalable ELT/ETL pipelines using Azure Data Factory, dbt, Python, and Snowflake.
• Build and optimize data models (star/snowflake schemas, dimensional models) within Snowflake to support analytics and operational reporting.
• Develop, test, and maintain Snowflake objects.
• Write advanced T-SQL and Python scripts for data transformation, automation, and data quality enforcement.
• Automate data extraction, transformation, and loading processes to reduce manual intervention and improve reliability.
• Investigate and troubleshoot coding errors and data flow failures.
Cloud Modernization & Platform Migration:
• Participate in the assessment and migration of SQL Server objects (stored procedures, SSIS packages) to Snowflake and Azure-native equivalents, as applicable.
• Validate migrated data for completeness, accuracy, and consistency against source systems.
• Document migration decisions, data lineage, and transformation logic throughout the modernization lifecycle.
Data Quality & Governance:
• Ensure data accuracy, completeness, and consistency across pipelines and data products; investigate and resolve data flow failures and anomalies.
• Analyze data quality issues and develop solutions to maintain integrity and efficiency across the data platform.
• Coordinate with the Data Integration team on data transport, interface issues, and cross-system data flows as needed.
• Adhere to data governance standards and contribute to data cataloging and documentation (e.g., OvalEdge).
• Coordinate with QA Specialists to develop and execute test plans verifying that pipelines and systems satisfy design requirements.
Collaboration & Leadership:
• Collaborate with Data Architects, database administrators, and other members of the Joint Technology team.
• Partner with Data Engineering & Integration Services teammates and cross-functional stakeholders on project delivery.
• Serve as a subject matter expert (SME) during project planning in collaboration with the Project Management Office (PMO).
• Answer provider, partner, and vendor questions; assist in troubleshooting and resolving data-related problems.
• Create and maintain schema documentation, data dictionaries, and other technical artifacts.
• Adhere to coding standards and participate in peer code reviews; utilize standardized tools to document work and track progress.
• Perform vendor management and oversight as needed.
• Work outside of standard business hours when required.
Qualifications
REQUIRED EDUCATION:
- Bachelor's Degree in computer science or a related field, OR
- Associate's Degree in computer science or a related field AND one (1) year of additional experience in a related field, OR
- High School Diploma or GED AND two (2) years of additional experience in a related field
MINIMUM EXPERIENCE:
- 1 year of experience in Healthcare IT
- 2 years of experience in data engineering, database development, or a related role
- 2 years of experience writing advanced T-SQL and/or Python for data processing, automation, or pipeline development
- 2 years of experience with cloud data platforms (Snowflake, Azure SQL, Azure Data Factory, or equivalent)
- 1 year of experience with data pipeline or ETL/ELT development and orchestration
- Familiarity with data integration concepts and interoperability standards in a healthcare or enterprise environment
REQUIRED CERTIFICATIONS/LICENSURE:
- Epic certification (within 6 months)
- Boomi Professional Developer (within 3 months)
- Snowflake SnowPro Core Certification (within 3 months)
- Microsoft Certified: Azure Data Engineer Associate (DP-203) (within 6 months)