Senior Data Engineer

New York Technology Partners
Charlotte, NC

Location: Charlotte or Raleigh, NC
Core Technical & Functional Requirements

  • 10+ years of experience in data engineering, data platform development, or related technical fields.
  • 3–5 years of hands‑on experience building solutions with Databricks and Apache Spark.
  • Strong command of SQL, including complex transformations, optimization, and analytical querying.
  • Proven experience designing and maintaining ETL/ELT pipelines, data models, and performance‑tuned workflows.
  • Proficiency in Python or Scala for data engineering and automation tasks.
  • Ability to troubleshoot complex data issues and communicate clearly with both technical and non‑technical stakeholders.

Role Overview

We are looking for a highly skilled Databricks Data Engineer who can design, build, and optimize data pipelines and analytics workflows in a cloud environment. This role requires strong engineering fundamentals, hands‑on development experience, and the ability to operate in a fast‑moving environment. Experience with insurance data or Power BI is helpful but not required.
Key Responsibilities

Data Pipeline & Workflow Development

  • Build and maintain scalable ETL/ELT pipelines using Spark, PySpark, and Databricks.
  • Develop efficient SQL transformations to support analytics, reporting, and downstream applications.
  • Partner with product owners, architects, and analysts to design data models, Delta Lake structures, and end‑to‑end workflows.
  • Implement orchestration, scheduling, and monitoring for data jobs and pipelines.
  • Ensure data quality, reliability, and performance across all stages of the pipeline.
  • Develop and maintain CI/CD processes for Databricks notebooks, jobs, and workflows.
  • Work within cloud‑based data platforms (Azure preferred; AWS/GCP acceptable).

Collaboration & Delivery

  • Work closely with business teams and data analysts to translate requirements into scalable engineering solutions.
  • Support production environments, troubleshoot issues, and ensure high availability of data pipelines.
  • Contribute to best practices around coding standards, documentation, and operational excellence.

Nice to Have

  • Familiarity with insurance industry data structures or workflows.
  • Exposure to Power BI or similar BI tools.
