Overview
BlueWater Federal is looking for a Jr. Data Engineer to support CENTCOM at MacDill AFB. As a Data Engineer, you will design, implement, and maintain data pipelines for ingesting, transforming, and storing structured and unstructured data from diverse sources. US Citizenship and an active Top Secret clearance are required.
Responsibilities
- Design, implement, and maintain data pipelines for ingesting, transforming, and storing structured and unstructured data from diverse sources.
- Connect to authoritative data sources via APIs (e.g., REST APIs) to ingest data into a data lakehouse architecture for further processing and storage.
- Automate Extract, Transform, and Load/Extract, Load, and Transform (ETL/ELT) workflows to enable real-time or near-real-time data processing.
- Curate ingested data into structured datasets optimized for analytics and reporting.
- Implement and enforce DoD-compliant data governance frameworks, including metadata tagging and data classification.
- Monitor and improve compliance with data policies and access control protocols.
- Optimize the storage and retrieval of data to ensure seamless integration with analytics platforms and dashboards.
- Support analytics initiatives using tools like Jupyter Notebooks, Databricks, Qlik, Palantir Foundry, Tableau, and other platforms.
- Partner with mission teams to understand operational data requirements and translate them into technical solutions.
- Explore and implement emerging technologies such as machine learning (ML) tools and AI frameworks to improve data processing and insights.
Qualifications
- BA/BS in Data Management/Science/Analytics, Computer Science, Information Technology, or a related field is desired
- Must be a US Citizen. An active TS clearance and SCI eligibility are required.
- 1+ years of Data Analysis experience
- Experience in building data pipelines, ETL/ELT workflows, and data storage solutions.
- Experience connecting to authoritative data sources via APIs (e.g., REST APIs) and integrating that data into data lakehouses or similar architectures.
- Proficiency in Python, SQL, and/or other programming languages for data manipulation.
- Hands-on experience with data storage and processing tools such as Apache Spark, Databricks, Hadoop, or similar.
- Experience with data normalization, mining/aggregation, and visualization
- Experience with operational dashboard design, development, and use
- Familiarity with one or more of the following: Jupyter Notebooks, Databricks, Qlik, Palantir Foundry, Tableau, and/or similar data analytics platforms.
Desired
- Experience with cloud platforms (e.g., Azure, AWS, Google Cloud) and related services.
- Experience with implementing data lakehouse architectures and integrating multiple analytics platforms.
- Familiarity with AI/ML concepts and tools like TensorFlow, PyTorch, or Scikit-learn.
- Experience with or knowledge of software engineering and programming languages (Python, Ruby, R, Java, Scala, Rust, etc.)
BlueWater Federal is proud to be an Equal Opportunity Employer. All qualified candidates will be considered without regard to race, color, religion, national origin, age, disability, sexual orientation, gender identity, status as a protected veteran, or any other characteristic protected by law. BlueWater Federal is a VEVRAA federal contractor, and we request priority referral of veterans.