Educational
Experience being part of high-performance agile teams in a fast-paced environment
Must understand the system scope and project objectives to achieve project needs through matrix management and collaboration with other enterprise teams
Proven ability to produce results in the analysis, design, testing and deployment of applications
Strong team emphasis and relationship building skills; partners well with business and other IT/Data areas.
Responsibilities
The candidate will be responsible for designing and implementing new solutions for complex data ingestion from multiple sources into enterprise data products, with a focus on automation, performance, resilience and scalability.
Partner with Lead Architect, Data Product Manager (Product Owner) and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
Work with stakeholders including Management, Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
Strong development & programming experience in Informatica (IICS), Python, ADF, Azure Synapse, Snowflake, Cosmos and Databricks
Solid understanding of databases, real-time integration patterns and ETL/ELT best practices.
Define data retention policies, monitor performance and advise on any infrastructure changes needed to meet functional and non-functional requirements.
Responsible for ensuring enterprise data policies, best practices, standards and processes are followed.
Write and maintain technical specifications, design documents and process flows.
Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
Elaborate user stories for the technical team and ensure that the team understands the deliverables.
Effectively communicate, coordinate & collaborate with business, IT architecture and data teams across multi-functional areas to complete deliverables.
Provide direction to the Agile development team and stakeholders throughout the project.
Assist in Data Architecture design, tool selection and data flows analysis.
Work with large amounts of data, interpret data, analyze results, perform gap analysis and provide ongoing reports.
Handle ad-hoc analysis & report generation requests from the business.
Respond to data-related inquiries to support business and technical teams.
Functional Competency
Technical expertise in data architecture, data models and database design and development
Strong knowledge of and experience with Java, SQL, XML, Python, ETL frameworks and Databricks
Working knowledge/familiarity with Git version control.
Strong knowledge of analyzing datasets using Excel
Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
Proficient in learning new technologies with the ability to quickly understand capabilities and work with others to guide these into development
Knowledge of and working experience with Duck Creek is an added plus
Knowledge of and working experience with Insurity Policy Decisions and/or IEV is an added plus
Experience with JIRA
7+ years of proven working experience in ETL methodologies, data integration and data migration. Hands-on development skills in Informatica IICS, Databricks/Spark and Python are a must.
Clear hands-on experience with database systems (SQL Server, Oracle, Azure Synapse, Snowflake, Cosmos), cloud platforms (e.g., AWS, Azure, Google Cloud) and NoSQL databases (e.g., Cosmos DB, MongoDB, DynamoDB)
Extensive experience developing complex solutions across the data ecosystem.
Extensive knowledge of data and analytics frameworks supporting data lakes, warehouses, marts and reporting.
In-depth knowledge of data engineering and architecture disciplines
Extensive experience working with Big Data tools and building data solutions for advanced analytics.