Bangalore
6 days ago
Lead II - Data Engineering

Role Proficiency:

This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept with ETL tools such as Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
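
As a purely illustrative sketch of the kind of pipeline work described above, the snippet below uses PySpark to ingest two hypothetical source datasets, clean and join them, and write a curated output; the paths, table names, and columns are assumptions, not part of this role description.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Ingest: read two hypothetical raw sources (paths and schemas are illustrative only).
orders = spark.read.json("s3://example-bucket/raw/orders/")
customers = spark.read.parquet("s3://example-bucket/raw/customers/")

# Wrangle: drop malformed rows and normalise column types.
orders_clean = (
    orders
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
)

# Transform and join: enrich orders with customer attributes and aggregate.
daily_revenue = (
    orders_clean
    .join(customers, on="customer_id", how="left")
    .groupBy(F.to_date("order_ts").alias("order_date"), "customer_segment")
    .agg(F.sum("amount").alias("revenue"))
)

# Load: write the curated output, partitioned for downstream query performance.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_revenue/"
)
```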

Outcomes:

- Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reuse of proven solutions.
- Support the Project Manager in day-to-day project execution and account for the developmental activities of others.
- Interpret requirements and create optimal architecture and design solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code using best standards; debug and test solutions to ensure best-in-class quality.
- Tune performance of code and align it with the appropriate infrastructure, understanding the cost implications of licenses and infrastructure.
- Create data schemas and models effectively.
- Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes, and data lakes.
- Validate results with user representatives, integrating the overall solution.
- Influence and enhance customer satisfaction and employee engagement within project teams.

Measures of Outcomes:

- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction in recurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected:

Code:

Develop data processing code with guidance, ensuring performance and scalability requirements are met. Define coding standards, templates, and checklists. Review code for the team and peers.


Documentation:

Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents, including design documents, architecture documents, infra costing, business requirements, source-target mappings, test cases, and results.


Configure:

Define and govern the configuration management plan. Ensure compliance from the team.


Test:

Review/create unit test cases, scenarios, and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team.


Domain Relevance:

Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs. Learn more about the customer domain and identify opportunities to add value. Complete relevant domain certifications.


Manage Project:

Support the Project Manager with project inputs. Provide inputs on project plans or sprints as needed. Manage the delivery of modules.


Manage Defects:

Perform defect root cause analysis (RCA) and mitigation. Identify defect trends and implement proactive measures to improve quality.


Estimate:

Create and provide input for effort and size estimation, and plan resources for projects.


Manage Knowledge:

Consume and contribute to project-related documents, SharePoint, libraries, and client universities. Review reusable documents created by the team.


Release:

Execute and monitor the release process.


Design:

Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models.


Interface with Customer:

Clarify requirements and provide guidance to the Development Team. Present design options to customers. Conduct product demos. Collaborate closely with customer architects to finalize designs.


Manage Team:

Set FAST goals and provide feedback. Understand team members' aspirations and provide guidance and opportunities. Ensure team members are upskilled. Engage the team in projects. Proactively identify attrition risks and collaborate with BSE on retention measures.


Certifications:

Obtain relevant domain and technology certifications.

Skill Examples:

- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF (a minimal orchestration sketch follows this list).
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
- Ability to conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning.
- Experience in data warehouse design and cost improvements.
- Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
- Ability to communicate and explain design/development aspects to customers.
- Ability to estimate time and resource requirements for developing/debugging features/components.
- Participation in RFP responses and solutioning.
- Mentoring of team members and guidance on relevant upskilling and certification.
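
As an illustration of the orchestration experience listed above, here is a minimal sketch of an Apache Airflow 2.x DAG wiring a hypothetical extract step to a load step; the DAG id, schedule, and task functions are assumptions for illustration only.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder extract step: a real pipeline would pull from a source system here.
    print("extracting orders...")


def load_orders(**context):
    # Placeholder load step: a real pipeline would write to the warehouse here.
    print("loading orders...")


with DAG(
    dag_id="orders_daily",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",              # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)

    extract >> load                 # run the load only after the extract succeeds
```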

Knowledge Examples:

- Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF.
- Proficiency in SQL for analytics and windowing functions (see the sketch after this list).
- Understanding of data schemas and models.
- Familiarity with domain-related data.
- Knowledge of data warehouse optimization techniques.
- Understanding of data security concepts.
- Awareness of patterns, frameworks, and automation practices.
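
As a purely illustrative example of the windowing-function proficiency mentioned above, the following PySpark sketch ranks each customer's orders by recency and keeps only the latest one; the DataFrame and column names are assumptions.

```python
from pyspark.sql import SparkSession, Window, functions as F

spark = SparkSession.builder.appName("window_example").getOrCreate()

# Hypothetical orders data: one row per order per customer.
orders = spark.createDataFrame(
    [("c1", "o1", "2024-01-05", 120.0),
     ("c1", "o2", "2024-02-10", 80.0),
     ("c2", "o3", "2024-01-20", 200.0)],
    ["customer_id", "order_id", "order_date", "amount"],
)

# Rank each customer's orders from newest to oldest, then keep the latest.
latest_per_customer = (
    orders
    .withColumn(
        "rn",
        F.row_number().over(
            Window.partitionBy("customer_id").orderBy(F.col("order_date").desc())
        ),
    )
    .filter(F.col("rn") == 1)
    .drop("rn")
)

latest_per_customer.show()
```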

Additional Comments:

12813 - Azure Architect – 8–12 yrs

Mandatory Skills: Cloud Computing, MS Azure, Azure DevOps
Skills to Evaluate: Cloud Computing, MS Azure, Azure DevOps
Experience: 8 to 12 Years
Location: Bengaluru

Job Description:

Job Title: Azure Architect

As an Azure Architect, you will be responsible for designing, implementing, and maintaining scalable and secure cloud solutions on the Microsoft Azure platform. You will work closely with cross-functional teams to understand business requirements and translate them into effective architectural designs. Your expertise in Azure services and deep understanding of cloud infrastructure will be critical in driving the success of our projects.

Key Responsibilities:
1. Design and implement Azure cloud solutions to meet business needs and requirements.
2. Develop architecture blueprints and technical design documentation for Azure-based projects.
3. Collaborate with internal teams to ensure seamless integration and deployment of cloud solutions.
4. Provide technical leadership and guidance on best practices for cloud architecture and design.
5. Keep abreast of the latest Azure technologies and recommend enhancements to existing systems.

Requirements:
1. Proven experience as an Azure Architect or in a similar role, with a solid understanding of cloud computing and infrastructure.
2. Deep knowledge of Microsoft Azure services, including compute, storage, networking, and security.
3. Strong proficiency in designing and implementing enterprise-grade solutions on the Azure platform.
4. Excellent communication and interpersonal skills, with the ability to work effectively in a collaborative team environment.
5. Relevant certifications such as Microsoft Certified: Azure Solutions Architect Expert or comparable credentials.

Education Qualification: Degree in CS or equivalent

Roles & Responsibilities:

Your responsibilities as an Azure Architect at Sony India Software Centre will include:

1. Designing Azure-based solutions: You will be responsible for designing scalable, reliable, and secure solutions on the Azure platform. This will involve understanding the company's requirements and translating them into architectural blueprints that use a variety of Azure services such as Virtual Machines, Azure SQL Database, Azure Cosmos DB, Azure Functions, Azure Storage, and more.
2. Architecting applications for the cloud: You will work closely with development teams to ensure that applications are designed and built to run efficiently on the Azure platform. This may involve re-architecting existing applications or designing new ones from the ground up to take advantage of Azure's capabilities.
3. Implementing and configuring Azure services: Once the architectural blueprints are in place, you will be responsible for implementing and configuring Azure services to bring the designs to life. This will involve setting up virtual networks, configuring security and access controls, and fine-tuning the environment for optimal performance.
4. Providing technical guidance and mentorship: As an Azure Architect, you will be a subject matter expert on Azure technologies and will be expected to provide technical guidance and mentorship to other teams within the organization. This could involve conducting training sessions, workshops, and one-on-one coaching to help others understand and leverage Azure effectively.
5. Continuous learning and staying updated with Azure advancements: The world of cloud computing is constantly evolving, and as an Azure Architect, you will be expected to stay abreast of the latest advancements in Azure services and technologies. This will involve continuous learning, staying updated with new features and best practices, and evaluating how these advancements can be leveraged to benefit the company.
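
As a purely illustrative sketch of the provisioning work described in point 3 above, the snippet below uses the Azure Python SDK to authenticate and create a resource group; the subscription id, resource group name, and region are placeholders and not part of this role description.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

# Authenticate with whatever credential the environment provides
# (Azure CLI login, managed identity, service principal, etc.).
credential = DefaultAzureCredential()

subscription_id = "00000000-0000-0000-0000-000000000000"  # placeholder subscription id
client = ResourceManagementClient(credential, subscription_id)

# Create (or update) a resource group to host the solution's Azure services.
resource_group = client.resource_groups.create_or_update(
    "rg-example-dataplatform",      # hypothetical resource group name
    {"location": "southindia"},     # placeholder region
)

print(resource_group.name, resource_group.location)
```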
