Position Summary
Prometheus Federal Services (PFS) is a trusted partner to federal health agencies, delivering mission-driven data, analytics, and technology solutions. We are seeking a skilled Cloud/Data Engineer to design, build, and modernize data platforms that support analytics, automation, and AI/ML initiatives across federal health programs. In this role, you will develop scalable cloud-based data pipelines, implement data quality and governance frameworks, and contribute to modernization efforts that drive insight, performance, and operational excellence for our federal clients.
Essential Duties and Responsibilities
- Design, develop, and deploy end-to-end pipelines for data acquisition, preparation, cleaning, and transformation
- Design, develop, and deliver data quality monitoring and observability tools to track data lineage and manage integrity across data pipelines
- Design, develop, and manage cloud solutions to deliver data, analytics, and AI/ML products, including migration and modernization efforts
- Manage cloud environments to support data and analytics development, including provisioning, configuration, and cost management
- Apply DevOps tools to deliver enhanced automation, workflow orchestration, and monitoring
Minimum Qualifications
- Bachelor's degree in Computer Science, Data Science, or a related field
- Five (5)+ years of experience with data engineering and data architecture development, including developing scalable ETL/ELT pipelines for reporting and analytics
- Three (3)+ years of experience working with cloud infrastructure (e.g., AWS, Azure, GCP), including environment provisioning, configuration, and management
- Three (3)+ years of experience utilizing languages such as SQL, Python, and PySpark for data ingestion, cleaning, and transformation
- Experience working with DevOps/DevSecOps tools and frameworks, including Continuous Integration/Continuous Delivery (CI/CD), build automation, Infrastructure as Code (e.g., Terraform, CloudFormation), and containerization (e.g., Docker, Kubernetes)
- Experience working across a variety of data warehousing tools, including SQL Server and Oracle
- Experience with GitHub, GitLab, and GitHub Actions
- Excellent written and verbal communication skills with both technical and non-technical stakeholders
- Authorized to work in the U.S. indefinitely without sponsorship
- Ability to obtain a Public Trust clearance
Preferred Qualifications
- Experience building data pipelines and architecture to manage ingestion, integration, and data product development for large-scale unstructured datasets (e.g., PDFs, documents) across multiple source systems
- Experience working with data platforms and processing tools, including Databricks and Spark
- Experience migrating legacy and on-premises data and systems to cloud environments
- Experience developing data products through a medallion architecture
- Knowledge and experience applying data governance and data quality management tools, frameworks, and best practices
- AWS or Azure cloud certifications