Charlotte, NC, USA
Data Governance Tech Lead
Sr Data Engineer - GE07BE
Data Engineer - GE08AE

We’re determined to make a difference and are proud to be an insurance company that goes well beyond coverages and policies. Working here means having every opportunity to achieve your goals – and to help others accomplish theirs, too. Join our team as we help shape the future.          

The Hartford is on a mission to enable our employees and customers to make good business decisions by providing consistent and accurate business intelligence and insights. The Data Governance program is at the core of this transformation, creating the processes that make data a highly valuable, discoverable, reusable asset.

As a Data Governance Tech Lead, you will be an expert in our procedures and tools for Data Catalog, Data Marketplace, Data Lineage, and Data Quality.

Responsibilities:

Serve as the technical expert for Informatica IDMC Catalog, Data Quality, and Data Marketplace, as well as custom add-on solutions; enhance and maintain those add-ons.

Design and implement highly scalable, secure, and performant data architectures and data pipelines leveraging cloud technologies and modern data frameworks to meet evolving customer needs.

Formulate logical statements of business problems and devise, test, and implement efficient, cost-effective application solutions (e.g., write new code, reuse existing code, or integrate purchased solutions).

Troubleshoot IDMC and add-on tool results, differentiate among root causes such as user error, tool configuration, technical limitations, or bugs, and provide viable, realistic solutions.

Collaborate with cross-functional teams, including Data Architects, Data Engineers, Platform Engineers (Data), Testers, and Data Engineering leadership.

Partner with Data Engineering and Data Platform leadership to design and implement processes and observability tooling that monitor data quality and availability, ensuring data is accurate, available to stakeholders, and supported by proactive issue resolution.

Build functional prototypes and software component primitives that serve as quick-starts or solution scaffolding, which data engineering teams can extend as accelerators for rapid time to value.

Collaborate with the Platform Engineering team to co-develop and maintain automation that reduces toil for Data Engineers (i.e., make it easy to manage and move code through environments to production).

Proactively assess technical issues and risks that could impact speed, functionality, flexibility, or clarity.

Serve as lead engineer in disaster recovery drills to ensure data environments are stable and resilient, maintaining high availability and operational continuity.

Partner with platform engineering teams for capacity planning and workload optimizations to ensure data systems can handle data volumes (current and future) and workloads efficiently.

Assist in creating and continuously improving Data Governance processes and reference materials.

Proactively manage priorities by working with Product Managers.

Participate in or lead a Community of Practice that advances the discipline of data management and data engineering by driving alignment across teams to embrace standardized patterns & practices.

Stay abreast of the latest developments in data management, data architecture, cloud (AWS, Informatica IDMC), and contemporary engineering practices.

Promote a culture of ownership to transform data into a strategic asset.

Partner with data architecture and engineering teams to implement security best practices and compliance standards, protecting sensitive data and ensuring regulatory compliance (e.g., HIPAA, CCPA).

Mentor junior data engineers, fostering a culture of continuous learning and rapid experimentation while evangelizing best practices and Enterprise standards.

Perform routine design and code reviews to assess design and code quality, validate adherence to Enterprise standards, and coach junior engineers on best practices.

Develop and evangelize strategies and DataOps practices for configuration management, source code management, environment management, and CI/CD tooling (e.g., GitHub, Jenkins).

Participate in the evaluation of new technologies and tools that will enhance the organization's data infrastructure and capabilities.

Foster a positive work environment by exemplifying cultural values and encouraging others to promote collaboration, accountability, diversity, and continuous improvement.

Required Qualifications:

Bachelor’s degree in Computer Science, Data Analytics, Engineering, or a related discipline and 7 or more years’ experience in IT systems analysis and application program development.

7+ years of experience in data engineering/ETL, data warehousing, or data science.

5+ years of technical leadership experience (e.g., Tech Lead, Senior Staff Engineer).

7+ years of experience with Enterprise database systems (e.g., Oracle) with very strong SQL skills.

4+ years of experience with cloud (AWS) technologies.

5+ years of hands-on experience in Python development.

Strong working experience with AWS services (Lambda, EC2, RDS, Secrets Manager, API Gateway, ALB).

API development and integration expertise, including handling authentication, authorization, pagination, and error handling.

Infrastructure as code experience with Terraform/CloudFormation.

Proficiency in frontend technologies such as Angular, React, and Bootstrap.

Knowledge of Unix/Linux shell scripting and Autosys.

Experience with continuous integration and DevOps methodologies using GitHub and Jenkins.

Ability to create system and architecture designs based on business requirements.

Ability to work effectively in an Agile environment.

Strong critical thinking, analytical ability, problem analysis, and attention to detail.

Strong verbal and written communication skills, with the ability to explain complex concepts clearly to both technical and non-technical audiences.

Ability to work collaboratively with diverse teams.

Ability to work closely with Data Governance teammates in two-way knowledge transfer.

Team player with a transformation mindset.

Proven ability to work independently and to organize and manage multiple priorities in a timeline-driven environment.

Preferred Qualifications:

Experience with Informatica IDMC, Snowflake, PySpark, DevOps, Docker, serverless architectures, decision engineering (security, storage type, timeliness, desired outcome, etc.), and change management.

Knowledge of Informatica Data Integration and/or Talend.

AWS certification.

Experience with observability and monitoring tools such as Dynatrace and Splunk.

This role will have a Hybrid work arrangement, with the expectation of working in an office 3 days a week (Tuesday through Thursday).

Candidates must be authorized to work in the US without company sponsorship. The company will not support the STEM OPT I-983 Training Plan endorsement for this position.

Compensation

The listed annualized base pay range is primarily based on analysis of similar positions in the external market. Actual base pay could vary and may be above or below the listed range based on factors including but not limited to performance, proficiency and demonstration of competencies required for the role. The base pay is just one component of The Hartford’s total compensation package for employees. Other rewards may include short-term or annual bonuses, long-term incentives, and on-the-spot recognition. The annualized base pay range for this role is:

$97,600 - $170,040

The posted salary range reflects our ability to hire at different position titles and levels depending on background and experience.

Equal Opportunity Employer/Females/Minorities/Veterans/Disability/Sexual Orientation/Gender Identity or Expression/Religion/Age

