DevOps Lead / Architect – GCP, AWS, Docker, Kubernetes

6-11 Years

Job Description

Experience - 6+ Years

Location - Pune

  • Design, implement, and manage DevOps capabilities in cloud offerings using CI/CD toolsets and automation
  • Strong, tool-agnostic automation skills and the ability to drive initiatives to automate processes
  • Ensure DevOps builds are robust: able to scale, handle rapid growth, and limit exposure to single points of failure and security vulnerabilities
  • This is a hands-on role that architects and supports build and release automation pipelines. You will be part of one team deploying a full software stack in public/private clouds.
  • Plan, install, and deploy highly available solutions in the public cloud
  • Manage entire pipelines, working with tools such as Jenkins, Ansible, Chef, Puppet, SaltStack, and Terraform
  • Review deployment and operational environments: execute initiatives to reduce failures, troubleshoot issues across the entire infrastructure stack, expand monitoring capabilities, and manage technical operations
  • Expert skill in any of the following tools and systems (will vary depending on the job): Docker, Jenkins, Chef, Puppet, Salt, Git, MongoDB (or another NoSQL DB)
  • Strong operational experience in Linux/Unix environments and scripting languages: Shell, Perl, Python
  • Support the automation requirements of continuous integration and continuous deployment
  • Integrate test data provisioning with automated environment provisioning
  • Identify and develop metrics and dashboards to monitor adoption and maturity of DevOps within the AppDev teams
  • 6+ recent years as a DevOps engineer in a role responsible for planning, designing, and leading the implementation of high-volume software development infrastructure
  • Strong understanding and familiarity with the fundamentals of UNIX systems administration
  • Able to troubleshoot issues quickly and effectively
  • Hands-on with shells and shell-scripting basics
  • Experience with microservices architectures and deploying Docker containers
  • Experience with Git, Maven, Artifactory, and similar tools
  • Experience working alongside and supporting multiple Agile development teams
  • Experience deploying to leading cloud providers: AWS, GCP, and Azure
  • Above-average expertise in at least one programming language: C++, C#, Ruby, or Python
  • Expert at Python
  • Expert at Kubernetes and/or Docker and/or OpenShift
  • Bachelor's degree in Computer Science or a related field


Desired Candidate Profile

Please refer to the job description above.


UG: B.Tech/B.E. - Any Specialization

Company Profile

Datametica Solutions Pvt Ltd

Datametica is a global leader in Cloud deployments and DevOps for Data Management, Big Data, and Advanced Analytics.

Being a preferred solutions provider, Datametica helps medium and large enterprises consolidate their data into a modern data platform on the Cloud, move their legacy data warehouses, Hadoop, mainframe, and ETL workloads to the Cloud, and modernize their analytics capabilities.

Our services and technologies bring experience and clarity to organizations migrating their data services and analytics to the cloud, with world-leading automation that removes complexity, reduces risk, and speeds migration.

Our Datametica Big Data specialists bring the scale, expertise, experience, flexibility, and cultural alignment to understand the business, analytics, and data management imperatives of your organization.

Datametica aims to take the fear, uncertainty and risk out of a migration to the Cloud.

We are honored to be a preferred migration, modernization, big data, and analytics solution provider for Google Cloud Platform, Amazon Web Services, and Microsoft Azure.

Datametica: Knowledge, Focus and Integrity.

Contact Company: Datametica Solutions Pvt Ltd


Salary: Not Disclosed by Recruiter

Role Category

Programming & Design


Role

Project Lead

Employment Type

Full Time, Permanent