About Skyflow:
We are Skyflow, a Silicon Valley startup that has built the world’s first data privacy vault delivered as an API. Our mission is to transform how businesses handle and protect their users’ financial, healthcare, and personal information — the data that powers our digital economy. Inspired by the zero trust data vaults that Apple and Netflix built to handle customer data, we've built a cloud-based vault that is available through a simple and elegant API. With Skyflow, developers can easily build best-of-breed data privacy, security and compliance directly into their applications, the same way they use Stripe, Twilio, or Okta.
Skyflow is based in Palo Alto, California, with offices in Bangalore, India, and team members working from locations all around the world. Our team includes former executives and leaders from companies like Salesforce, Google, Twilio, and Oracle. Come join us!
About the role:
As a Senior DevOps Engineer, you will drive the effort to identify, design, and develop the best technical and field solutions to automate our production systems. You will collaborate frequently with internal and external business and engineering teams. Over time, you will also have the opportunity to lead efforts to champion and instill a culture of DevOps at Skyflow.
We know great DevOps Engineers come from diverse backgrounds, so no single individual may have all the desired skills on day one. But if you are the kind of software engineer who would have loved to engineer solutions for the Stripe or Twilio APIs, the Slack or Zendesk apps, or the Snowflake or MongoDB platforms, we want to talk to you.
Desired Qualifications:
- 5+ years of overall hands-on experience, including 2+ years in infrastructure automation and software delivery using DevOps practices
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP)
- Coding experience with Go (preferred) or Python
- Experience with DevOps tools such as CloudFormation/Terraform, Jenkins, Ansible, and others
- Hands-on experience with Linux systems engineering, Docker and Kubernetes container orchestration, RDBMS, and scripting for automation
- Ability to work with distributed teams to provide technical guidance and leadership
- Solid understanding of the common challenges with migrations and modernizations, and the ability to choose the right path based on prior experience
- Expertise with application observability patterns and site reliability practices
- Extensive experience working with large distributed infrastructures
Responsibilities:
- Work daily with programming languages such as Python and Go, container orchestration with Docker and Kubernetes, infrastructure-as-code and deployment tools including Terraform and Helm, and a variety of AWS tools and services
- Develop and maintain CI/CD pipelines to enable automated testing, building, and deployment of applications
- Collaborate with cross-functional teams and clients to deliver robust cloud-based solutions that drive best-in-class experiences to Skyflow customers
- Automate and maintain tools and systems for software builds, continuous testing, automated deployments, software health monitoring, and software releases
- Evaluate reliability, performance, scalability, and other engineering aspects to ensure smooth production rollouts and delivery
- Be a thought leader and key contributor within our DevOps team and help build a DevOps culture
Benefits:
- Excellent Insurance Options
- Very Generous PTO
- Flexible Hours
- Generous Equity
At Skyflow, we believe that diverse teams are the strongest teams. We invite applicants of all genders, races, ethnicities, nationalities, ages, religions, sexual orientations, disability statuses, educational experiences, family situations, and socio-economic backgrounds.