Location : Mumbai
Reports to : Director, Enterprise Information Mgmt & Analytics
Workplace type : Hybrid
Salary : 22 – 28 LPA
Our Digital Technology Solutions organization has followed our digital strategy for the last several years and built a solid base for digital transformation by deploying several digital platforms that add value to the organization. We are now prepared to move forward, align with Ingredion’s Play to Win (P2W) strategy, and organize our Digital Technology Solutions organization to achieve it. To accomplish the next stage of our strategy, we have designed an organization model and structure that closely aligns with our global functions and establishes improved governance models, ensuring that functional technology investments align with business priorities and deliver realized value.
The Data Engineer is responsible for designing, developing, and maintaining the infrastructure and systems required for data storage, processing, and analysis. They play a crucial role in building and managing the data pipelines that enable efficient and reliable data integration, transformation, and delivery for all data users across the enterprise.
Core Responsibilities:
Design and implement ETL processes that move data from various sources into the organization’s data systems.
Build data pipelines that enable efficient and reliable data integration, transformation, and delivery for all data users across the enterprise.
Ensure timely releases of data pipelines and data products into production.
Collaborate with and review the work of contract data engineers and DataOps engineers to successfully accomplish project goals.
Work with Agile and DevOps techniques and implementation approaches in delivery.
Actively participate in all ongoing initiatives related to the data and analytics ecosystem, including capital projects, operational projects, and proof-of-concept work.
Contribute to Ingredion internal groups and forums.
Requirements:
Bachelor’s degree in a related field.
Experience in all aspects of data engineering, especially building pipelines and populating data stores.
Must have experience with databases such as Oracle and SQL Server, and with Big Data/cloud platforms such as Microsoft Azure, Google BigQuery / Google Cloud Platform (GCP), and Databricks.
Hands-on experience with ETL and coding for unstructured, semi-structured, and structured data.
Must have experience with REST API development.
Experience with Kafka, Event Hubs, Python, and Spark is preferred.
Ability to work independently and as part of a team to successfully execute projects.
Ability to multitask and meet aggressive deadlines efficiently and effectively.
Excellent communication and collaboration skills; experience working in collaborative environments.
Proven success contributing in a multi-location, team-oriented environment.
Certification in data engineering or related fields is a plus.
Relocation Available:
No