Position Summary
The CloudOps & DevOps – Hybrid Engineer supports the Juno Labs team by maintaining and enhancing secure data flows between Kafka, PostgreSQL, and S3. This role focuses on strengthening observability tooling, ensuring secure access for all relevant roles, validating OpenTelemetry data collection, and coordinating user setup for stakeholders. The engineer collaborates closely with developers, security teams, and delivery leaders to ensure reliable, compliant infrastructure that meets the needs of sensitive client projects.
Key Responsibilities
- Design and maintain secure Kafka-PostgreSQL-S3 data flow architectures to support project initiatives
- Configure and manage secure access controls for all observability roles across tools and platforms
- Test and validate OpenTelemetry flows to ensure accurate and secure data collection
- Coordinate access provisioning and user setup for stakeholders, ensuring adherence to security and compliance requirements
- Collaborate with developers, data engineers, and security teams to streamline CI/CD pipelines and improve operational efficiency
- Monitor system health, troubleshoot issues, and maintain high availability and performance of data pipelines
- Integrate observability solutions (logs, metrics, and traces) to ensure full visibility into hybrid cloud infrastructure
- Implement security best practices for data in transit and at rest across all integrated systems
- Document processes, configurations, and security controls to support knowledge sharing and compliance standards
- Enhance infrastructure and tooling to improve scalability, performance, and security for ongoing and future initiatives
- Other duties as assigned
Qualifications
- Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent experience
- U.S. Citizenship (required for work on designated client projects)
- Demonstrated experience in CloudOps and/or DevOps roles supporting hybrid cloud environments
- Strong hands-on experience with Kafka, PostgreSQL, and S3 data flows
- Awareness of FedRAMP, HIPAA, and SOC 2 compliance frameworks
- Proficiency with OpenTelemetry and observability frameworks
- Proven ability to secure cloud data pipelines and manage role-based access control
- Experience with containerized deployment and promotion pipelines (Docker/ECR/EKS/Fargate/AKS/GKE)
- Experience coordinating with cross-functional teams for user setup and secure access
- Solid understanding of CI/CD tools, infrastructure-as-code (e.g., Terraform, Ansible), and container orchestration (Kubernetes preferred)
- Excellent troubleshooting, collaboration, and documentation skills
- Alignment with RTS Core Values