***Local W2 Only. No C2C. 3 days onsite/week***
Job Description:
As a DevOps/Data Scientist, you will play a critical role in bridging the gap between data science and infrastructure. You will be responsible for designing, implementing, and maintaining scalable and efficient cloud-based solutions on Google Cloud Platform (GCP) using Terraform. Your work will enable our data science team to deploy models and analytics solutions seamlessly, ensuring high availability and performance.
Key Responsibilities:
- Design, build, and maintain scalable cloud infrastructure on GCP using Terraform.
- Collaborate with data scientists to deploy machine learning models and analytics solutions in a production environment.
- Automate infrastructure provisioning, configuration management, and application deployment processes.
- Monitor and optimize system performance, ensuring high availability and reliability.
- Implement security best practices and ensure compliance with industry standards.
- Develop and maintain CI/CD pipelines to streamline the deployment process.
- Troubleshoot and resolve infrastructure and application issues.
- Stay up-to-date with the latest industry trends and technologies to drive continuous improvement.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, Data Science, or a related field (or equivalent experience).
- Proven experience in DevOps and Data Science roles, with a strong focus on cloud infrastructure.
- Expertise in Terraform for infrastructure as code (IaC) on GCP.
- Solid understanding of GCP services, including Compute Engine, Cloud Storage, BigQuery, and Google Kubernetes Engine (GKE).
- Experience with CI/CD tools such as Jenkins, GitLab CI, or CircleCI.
- Proficiency in programming languages such as Python, Go, or Bash.
- Strong problem-solving skills and the ability to work collaboratively in a team environment.
- Excellent communication skills and the ability to convey complex technical concepts to non-technical stakeholders.
Preferred Qualifications:
- Experience with containerization technologies such as Docker and orchestration tools like Kubernetes.
- Familiarity with data science tools and frameworks such as TensorFlow, PyTorch, or Scikit-learn.
- Knowledge of monitoring and logging tools such as Prometheus, Grafana, or Cloud Monitoring (formerly Stackdriver).
- Experience with version control systems, particularly Git.
Job ID: 112024-91756
Skills, experience, and other compensable factors will be taken into account when determining pay rate. The pay range provided in this posting is a reflection of a W2 hourly rate; other employment options may be available that may result in pay outside of the provided range.
W2 employees of Eliassen Group who are regularly scheduled to work 30 or more hours per week are eligible for the following benefits: medical (choice of 3 plans), dental, vision, pre-tax accounts, other voluntary benefits including life and disability insurance, 401(k) with match, and sick time if required by law in the worked-in state/locality.