Data Engineer

TALENT Software Services
Redmond, WA

Are you an experienced Data Engineer with a desire to excel? If so, Talent Software Services may have the job for you! Our client is seeking an experienced Data Engineer for a fully remote contract position.


Position Summary: The Data Engineer is responsible for designing, building, and maintaining scalable data pipelines and systems that deliver trusted data for analysis and product use cases. The role partners with cross-functional teams to understand data needs and implement solutions that support both near-term and long-term objectives, and requires the ability to contribute to technical design, ensure data quality, and operate with increasing independence and accountability.


Primary Responsibilities/Accountabilities:

  • Develop and maintain batch and streaming data pipelines using modern tools and frameworks. Design transformations, optimize performance, and ensure reliable data delivery.
  • Design and implement scalable and maintainable data models and storage solutions that align with business needs and support efficient querying, analysis, and data integration efforts.
  • Engage in agile best practices, help refine stories, identify dependencies, and proactively raise risks or concerns to ensure work is completed on time or escalated when needed.
  • Implement and enforce data quality controls, validation, and compliance standards across pipelines.
  • Support the deployment, scheduling, and monitoring of data pipelines and workflows to ensure consistent, reliable execution.
  • Maintain comprehensive documentation and advocate for coding standards, best practices, and reusable components.
  • Collaborate regularly with cross-functional teams to clarify data requirements, document assumptions, and deliver high-quality solutions. Communicate clearly during stand-ups, design discussions, and retrospectives. Actively contribute to team code reviews and share learnings with peers.


Qualifications:

  • 2-5 years of experience in data engineering, data modelling, and ETL pipelines
  • Proficient in SQL and Python for creating, improving, and fixing data pipelines
  • Experience with cloud and data platforms, especially Azure and Databricks (Delta Live Tables and Unity Catalog)
  • Strong understanding of tools like SnapLogic, Azure Data Factory, and Jenkins for data integration and orchestration
  • Practical experience with Terraform for infrastructure as code and managing deployment pipelines
  • Experience integrating with APIs
  • Knowledge of data quality and monitoring tools, particularly Soda or similar
  • Proficient in version control and CI/CD workflows, using tools like GitHub
  • Solid understanding of data modelling principles (e.g., dimensional modelling, normalization)
  • Comfortable working in agile teams, with a proactive approach to planning, organizing tasks, and collaborating