Data Engineer

AccelOne
Buenos Aires, AR

AI & Data Center of Excellence – Abu Dhabi, UAE

Role Overview

As a Data Engineer, you will build and maintain scalable, reliable, and secure data platforms that power analytics and AI use cases across the organization.

This role is critical in enabling data-driven decision-making in financial services environments. You will work closely with data scientists, AI engineers, and business stakeholders to ensure data systems are robust, performant, and aligned with regulatory and operational requirements.

Experience Bands

Senior Data Engineer: 8–10 years of experience
Data Engineer: 5–7 years of experience

Key Responsibilities

• Design and implement robust ETL/ELT pipelines for structured and unstructured data
• Build and manage scalable data lakes, data warehouses, and real-time data pipelines
• Ensure data quality, lineage, governance, and compliance across data platforms
• Enable reliable data availability for analytics, reporting, and AI systems
• Optimize data infrastructure for performance, scalability, and cost efficiency
• Collaborate with Data Science and AI teams to productionize machine learning pipelines
• Monitor and troubleshoot data workflows and system performance
• Implement best practices for data security and reliability

Financial Services Use Cases (Preferred)

Candidates with experience in financial data environments are highly valued, particularly in:

• Transaction data pipeline development and management
• Regulatory reporting and compliance data systems
• Risk and finance data marts
• Customer 360 and customer analytics platforms

Technical Skills

Data Platforms
• Snowflake
• BigQuery
• Amazon Redshift
• Databricks

Data Processing Technologies
• Apache Spark
• Apache Kafka
• Apache Flink

Databases
• SQL databases
• NoSQL databases

DevOps & Engineering Practices
• CI/CD pipelines
• Version control systems (e.g., Git)

Containers & Infrastructure
• Docker
• Kubernetes

Cloud Platforms
• AWS
• Azure
• Google Cloud Platform (GCP)

Evaluation Criteria

Candidates will be evaluated on:

• Complexity and scale of data systems built and maintained
• Reliability and performance of data pipelines in production environments
• Experience implementing data governance and compliance standards
• Exposure to AI and machine learning data pipelines
• Ability to design scalable and resilient data architectures

Key Performance Indicators (KPIs)

• Reliability of data pipelines (uptime, failure rate)
• Data latency and freshness
• Data quality and integrity metrics
• Cost optimization and efficiency of data infrastructure
• Stability and scalability of data platforms

Preferred Profile

• Experience working within financial data ecosystems
• Understanding of regulatory data requirements and compliance standards
• Exposure to MLOps and machine learning data pipelines
• Experience working in distributed or cross-functional teams
• Strong problem-solving and ownership mindset
