Position: Lead Software Engineer
Location: Pune, Maharashtra, India
About the job
As a senior Hadoop and DevOps Engineer at Circana, you will leverage your proven Hadoop and DevOps expertise to develop, maintain, and automate code and infrastructure deployment. We are looking for a talented and motivated DevOps Engineer who has worked on the Hadoop stack and knows how to manage, monitor, and automate routine tasks. The ideal candidate will be a highly motivated individual with a focus on Azure/AWS/Google Cloud CI/CD, scripting, and automation, and a solid understanding of Hadoop ecosystem components. You will be responsible for building CI/CD pipelines for serverless and containerized applications, improving development cycles, and ensuring scalability and reliability. Experience with cloud services and Python, along with good communication and documentation skills, is ideal. A deep understanding of Unix/Linux OS and scripting is essential for this role. The Sr. Hadoop DevOps Engineer is expected to apply a software engineering mindset to system administration, elevating the availability, reliability, and security of solutions while reducing operational costs.
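To make the scripting-and-automation emphasis above concrete, here is a minimal sketch (an illustration, not a Circana tool) of the kind of routine task the role automates: a Python check that polls the Hadoop NameNode's JMX endpoint and fails when HDFS reports unhealthy blocks. The hostname, port, and chosen metrics are assumptions; the bean and metric names follow common Hadoop defaults and may vary by version.

```python
"""Minimal sketch of routine Hadoop health-check automation (illustrative only).
Assumes the NameNode JMX servlet is reachable; host/port below are placeholders."""
import sys
import requests

NAMENODE_JMX = "http://namenode.example.com:9870/jmx"  # Hadoop 3.x default web port (assumed)

def fetch_bean(query: str) -> dict:
    # The NameNode exposes JMX beans as JSON; ?qry= filters the response to a single bean.
    resp = requests.get(NAMENODE_JMX, params={"qry": query}, timeout=10)
    resp.raise_for_status()
    beans = resp.json().get("beans", [])
    return beans[0] if beans else {}

def main() -> int:
    fs = fetch_bean("Hadoop:service=NameNode,name=FSNamesystem")
    problems = []
    if fs.get("MissingBlocks", 0) > 0:
        problems.append(f"missing blocks: {fs['MissingBlocks']}")
    if fs.get("UnderReplicatedBlocks", 0) > 0:
        problems.append(f"under-replicated blocks: {fs['UnderReplicatedBlocks']}")
    if problems:
        print("HDFS health check FAILED: " + "; ".join(problems))
        return 1  # non-zero exit so cron, a scheduler, or an alerting wrapper can react
    print("HDFS health check OK")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

A script like this is typically wired into cron, Oozie, or a monitoring agent so failures surface as alerts rather than manual checks.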
Essential Job Responsibilities:
- Install, maintain, monitor, and evaluate Hadoop Platform health and alerts.
- Apply patches and/or assist in upgrade and build out of cluster(s) as needed.
- Design, implement, and manage CI/CD pipelines using tools like Azure DevOps.
- Design, manage, and administer repositories in Git, GitHub, SVN, etc.
- Working experience with Oracle or MySQL.
- Experience with programming and scripting in languages such as Python, Bash, and SQL.
- In-depth understanding of monitoring tools like Grafana and Azure Monitor.
- Administer and scale applications in Azure and AWS environments, ensuring high availability and performance.
- Deploy, configure, and manage Kubernetes clusters (a brief scripting sketch of this kind of task follows this list).
- Provide technical support for deployment-related issues and outages.
- Implement monitoring, logging, and alerting solutions to maintain system health and performance.
- Work with development teams to streamline the software delivery process through automation and best practices.
- Ensure infrastructure security by implementing industry best practices and regular audits.
- Optimize infrastructure costs and resource usage.
- Stay up to date with emerging technologies and propose solutions to enhance the existing infrastructure.
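As a hedged illustration of the Kubernetes deployment and scaling responsibilities above (a sketch with placeholder names, not an internal Circana procedure), the following Python snippet uses the official Kubernetes client to scale a Deployment:

```python
"""Minimal sketch: scale a Kubernetes Deployment with the official Python client
(pip install kubernetes). The Deployment name and namespace below are placeholders."""
from kubernetes import client, config

def scale_deployment(name: str, namespace: str, replicas: int) -> None:
    # Load credentials from ~/.kube/config; use config.load_incluster_config() inside a pod.
    config.load_kube_config()
    apps = client.AppsV1Api()
    # Patch the Deployment's scale subresource with the desired replica count.
    apps.patch_namespaced_deployment_scale(name, namespace, {"spec": {"replicas": replicas}})
    print(f"Requested {replicas} replicas for {namespace}/{name}")

if __name__ == "__main__":
    scale_deployment("demo-app", "default", replicas=3)
```

In practice the same operation is usually expressed declaratively (Helm charts or manifests in Git, applied from a CI/CD stage) so that cluster state stays under version control.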
Minimum Qualifications:
- Bachelor’s degree in computer science or a related field.
- 10+ years of experience in a DevOps or similar role.
- Experience in maintaining Hadoop infrastructure.
- Proficiency in Azure DevOps.
- Familiarity with infrastructure as code tools like Terraform or CloudFormation.
- Strong expertise in Azure, AWS, or Google Cloud services and infrastructure.
- Hands-on experience with Kubernetes and its ecosystem.
- Excellent problem-solving skills, proactive approach, and the ability to work independently.
- Excellent verbal and written communication skills.
Knowledge/Skills/Abilities Required:
- Demonstrated success in communication, collaboration, and motivation of cross-functional departments to achieve exceptional service.
- Strategic thinking, complex problem solving and analytical capabilities.
- Excellent communication, diagnostic, and issue-resolution skills.
- Ability to see the big picture and take a holistic approach to problem-solving.
- Ability to take complex information and communicate it in a manner that is understandable to all audiences.
- Demonstrated understanding of the evolving landscape of technology.
- Ability to effectively network with colleagues to share knowledge and gain new perspectives.
Technical Skills:
- Expert knowledge of Hadoop administration and maintenance.
- Comfortable with Unix/Linux OS and scripting.
- Expert-level cloud DevOps skills.
- Expert knowledge of CI/CD techniques.
- Experience with core cloud infrastructure, especially AWS, GCP, and Azure.
- Experience with programming and scripting in languages such as Python and Unix shell.
- Experience with deployment/configuration management tools such as Atlassian Bitbucket Pipelines, Jenkins, Maven, Puppet, HashiCorp Terraform, or Ansible is required.
- In-depth understanding of version control tools like Git and Azure DevOps.
- Working knowledge of relational databases like Oracle/MySQL.