Chennai
Lead II - DevOps Engineering

Role Proficiency:

Acts under DevOps guidance; leads more than one Agile team.

Outcomes:

- Interprets the DevOps tool/feature/component design to develop and support it in accordance with specifications
- Adapts existing DevOps solutions and creates relevant DevOps solutions for new contexts
- Codes, debugs, tests, and documents DevOps components, and communicates development stages and the status of development/support issues
- Selects appropriate technical options for development, such as reusing, improving, or reconfiguring existing components
- Optimises efficiency, cost, and quality of DevOps processes, tools, and technology development
- Validates results with user representatives; integrates and commissions the overall solution
- Helps Engineers troubleshoot issues that are novel or complex and not covered by SOPs
- Designs, installs, and troubleshoots CI/CD pipelines and software
- Automates infrastructure provisioning on cloud and on-premises with the guidance of architects
- Provides guidance to DevOps Engineers so that they can support existing components
- Good understanding of Agile methodologies; able to work with diverse teams
- Knowledge of more than one DevOps tool stack (AWS, Azure, GCP, open source)

Measures of Outcomes:

- Quality of deliverables
- Error rate/completion rate at various stages of SDLC/PDLC
- Number of components reused
- Number of domain/technology/product certifications obtained
- SLA/KPI for onboarding projects or applications
- Stakeholder management
- Percentage achievement of specification/completeness/on-time delivery

Outputs Expected:

Automated components:

Deliver components that automate parts of the installation and configuration of software/tools, both on-premises and on cloud. Deliver components that automate parts of the build/deploy process for applications.


Configured components:

Configure tools and automation frameworks into the overall DevOps design.


Scripts:

Develop and support scripts (e.g., PowerShell/Shell/Python) that automate installation, configuration, build, and deployment tasks.
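
For illustration, a minimal Python sketch of the kind of installation/configuration automation script described above. The package name, configuration path, and the Debian/apt assumptions are hypothetical; a real script would follow the project's tooling and SOPs.

    import subprocess
    from pathlib import Path

    PACKAGE = "example-agent"                        # hypothetical package name
    CONFIG = Path("/etc/example-agent/agent.conf")   # hypothetical config path

    def is_installed(package: str) -> bool:
        """Return True if dpkg reports the package as installed."""
        result = subprocess.run(["dpkg", "-s", package], capture_output=True, text=True)
        return result.returncode == 0

    def install(package: str) -> None:
        """Install the package non-interactively via apt-get."""
        subprocess.run(["apt-get", "install", "-y", package], check=True)

    def write_config(path: Path, contents: str) -> None:
        """Write the configuration file, creating parent directories if needed."""
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_text(contents)

    if __name__ == "__main__":
        if not is_installed(PACKAGE):
            install(PACKAGE)
        write_config(CONFIG, "log_level=info\n")
        # Restart the service so the new configuration takes effect.
        subprocess.run(["systemctl", "restart", PACKAGE], check=True)

In practice, a configuration management tool such as Ansible (named in the skill examples below) would usually own steps like these; the script only illustrates the shape of the task.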


Training/SOPs:

Create training plans/SOPs to help DevOps Engineers with DevOps activities and to onboard users.


Measure Process Efficiency/Effectiveness:

Deployment frequency; innovation and technology changes.


Operations:

Change lead time/volume; failed deployments; defect volume and escape rate; mean time to detection and recovery.
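
As a rough illustration, a small Python sketch of how delivery metrics like these might be computed from deployment records; the record structure and sample data are hypothetical.

    from datetime import datetime, timedelta

    # Each record: (commit_time, deploy_time, deployment_succeeded) -- illustrative only.
    deployments = [
        (datetime(2024, 1, 1, 9, 0), datetime(2024, 1, 2, 10, 0), True),
        (datetime(2024, 1, 3, 11, 0), datetime(2024, 1, 3, 18, 0), False),
        (datetime(2024, 1, 5, 8, 0), datetime(2024, 1, 6, 9, 0), True),
    ]
    period_days = 7  # length of the reporting window

    # Deployment frequency: deployments per day over the window.
    deployment_frequency = len(deployments) / period_days

    # Change lead time: average time from commit to deployment.
    lead_times = [deploy - commit for commit, deploy, _ in deployments]
    mean_lead_time = sum(lead_times, timedelta()) / len(lead_times)

    # Failed deployments: share of deployments that did not succeed.
    failure_rate = sum(1 for _, _, ok in deployments if not ok) / len(deployments)

    print(f"Deployment frequency: {deployment_frequency:.2f}/day")
    print(f"Mean change lead time: {mean_lead_time}")
    print(f"Failed deployment rate: {failure_rate:.0%}")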

Skill Examples:

- Experience designing, installing, configuring, and troubleshooting CI/CD pipelines and software using Jenkins/Bamboo/Ansible/Puppet/Chef/PowerShell/Docker/Kubernetes
- Experience integrating with code quality/test analysis tools like SonarQube/Cobertura/Clover
- Experience integrating build/deploy pipelines with test automation tools like Selenium/JUnit/NUnit
- Scripting skills (Python, Linux/Shell, Perl, Groovy, PowerShell)
- Infrastructure automation skills (Ansible/Puppet/Chef/PowerShell)
- Experience in repository management/migration automation (Git, Bitbucket, GitHub, ClearCase)
- Experience with build automation scripts (Maven, Ant)
- Experience in artefact repository management (Nexus/Artifactory)
- Experience in dashboard management and automation (ELK/Splunk)
- Experience configuring cloud infrastructure (AWS, Azure, Google)
- Experience migrating applications from on-premises to cloud infrastructure
- Experience working with Azure DevOps, ARM (Azure Resource Manager), and DSC (Desired State Configuration); strong debugging skills in C# and .NET
- Setting up and managing Jira projects and Git/Bitbucket repositories
- Skilled in containerization tools like Docker and Kubernetes

Knowledge Examples:

- Knowledge of installation/config/build/deploy processes and tools
- Knowledge of IaaS cloud providers (AWS, Azure, Google, etc.) and their tool sets
- Knowledge of the application development lifecycle
- Knowledge of Quality Assurance processes
- Knowledge of quality automation processes and tools
- Knowledge of multiple tool stacks, not just one
- Knowledge of build and release branching/merging
- Knowledge of containerization
- Knowledge of Agile methodologies
- Knowledge of software security compliance (GDPR/OWASP) and tools (Black Duck/Veracode/Checkmarx)

Additional Comments:

Job Summary:

We are seeking a skilled and motivated DevOps Engineer with a strong background in AWS big-data solutions and expertise in implementing CI/CD pipelines using Harness.io. As a key member of our dynamic team, you will play a crucial role in designing, developing, and maintaining robust, scalable, and secure data pipelines and infrastructure for our big-data applications.

Accountabilities:

AWS Big-Data Solutions: Collaborate with cross-functional teams to design, deploy, and manage AWS-based big-data solutions, including data storage, processing, and analytics services. Leverage AWS services such as Amazon S3, Amazon EMR, Amazon Redshift, and AWS Glue to architect efficient and scalable data workflows.

Harness.io Implementation: Lead the adoption and utilization of Harness.io for continuous integration and continuous deployment (CI/CD) pipelines. Design, configure, and automate CI/CD workflows to streamline the development, testing, and deployment processes of big-data applications.

Security Validation: Integrate robust security practices into the CI/CD pipelines and build/release processes. Implement security checks, vulnerability scanning, and compliance validation to ensure data privacy, integrity, and protection at every stage of the pipeline.

Infrastructure as Code (IaC): Champion the IaC approach for managing infrastructure resources. Use tools like AWS CloudFormation or Terraform to provision and manage AWS resources, ensuring consistency and reproducibility.

Automation and Orchestration: Drive automation initiatives to increase operational efficiency. Automate repetitive tasks, infrastructure provisioning, and configuration management using scripting languages and tools like Ansible.

Collaboration and Knowledge Sharing: Foster a culture of collaboration, knowledge sharing, and continuous improvement within the DevOps and broader engineering teams. Mentor junior team members and participate in peer code reviews.

Best Practices and Innovation: Stay up to date with the latest trends, tools, and technologies in the AWS and big-data domain. Introduce innovative solutions and best practices to enhance the performance, reliability, and security of our data infrastructure.

Basic Qualifications:

- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
- Proven experience in designing and deploying AWS big-data solutions, leveraging services like S3 and Glue.
- Hands-on expertise in implementing CI/CD pipelines using Harness.io or similar tools for big-data applications.
- Strong knowledge of security principles and experience integrating security checks into CI/CD pipelines (e.g., SonarQube, Checkmarx).
- Proficiency in infrastructure automation using tools like AWS CloudFormation, Terraform, or similar.
- Solid scripting skills in languages such as Python, Bash, or PowerShell.
- Experience with IaC, configuration management tools (e.g., Ansible), and version control systems (e.g., Bitbucket).
- Strong problem-solving skills, ability to troubleshoot complex issues, and an eye for detail.
- Excellent communication and teamwork abilities, with a focus on fostering a collaborative and inclusive work environment.
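
By way of illustration, a minimal Python sketch of one slice of the AWS automation described above: starting an AWS Glue job run and polling for its final state with boto3. The job name is hypothetical, and the sketch assumes boto3 is installed and AWS credentials are configured.

    import time

    import boto3

    JOB_NAME = "example-etl-job"  # hypothetical Glue job name

    glue = boto3.client("glue")

    # Kick off a run of the Glue ETL job.
    run_id = glue.start_job_run(JobName=JOB_NAME)["JobRunId"]

    # Poll until the run reaches a terminal state, then report it.
    while True:
        run = glue.get_job_run(JobName=JOB_NAME, RunId=run_id)["JobRun"]
        if run["JobRunState"] in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT", "ERROR"):
            break
        time.sleep(30)

    print(f"Glue job {JOB_NAME} finished with state {run['JobRunState']}")

In a Harness.io or similar pipeline, a step like this would typically run as a scripted stage after build and security checks, rather than as a standalone script.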
