Valhalla, NY
Data Engineer
General Description: Participate in the design and development of data pipelines. Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver data-driven solutions. Support the data needs of multiple teams, systems, and products.
Responsibilities:
• Participate in the design, development, and maintenance of Azure Data Factory and dbt data pipelines. Utilize extract, load, and transform (ELT) methodologies to move data from various systems into a Snowflake enterprise data warehouse.
• Participate in the design, development, and maintenance of SSIS/T-SQL processes. Utilize extract, transform, and load (ETL) methodologies to move data from disparate systems into SQL databases.
• Contribute to the development of best practices for data ingestion, dataset creation, storage and updates, naming conventions, and retention.
• Collaborate with stakeholders to define tables, views, and schemas to support data products.
• Participate in the design and implementation of data models in Snowflake. Ensure they are efficient, scalable, and optimized for performance.
• Monitor data movement processes and pipelines for overall health and performance.
• Investigate and troubleshoot issues to ensure data availability and reliability.
• Fine-tune and optimize SQL queries and pipelines for improved efficiency.
Cyber Security Job Responsibilities:
• Operate in accordance with USI Policies for the Information Security Program (PISP) and USI Standards for the Information Security Program (SISP).
• Keep up to date with security updates and improvements. Implement improvements as appropriate.
• Protect systems against unauthorized access by appropriately defining access.
• Upgrade systems by implementing and maintaining security and technical controls.
Knowledge, Skills and Abilities:
• 2+ years of SQL programming, SQL database design, and data architecture concepts.
• Bachelor’s degree in computer science, information systems, statistics, or a related field.
• Experience with data models and data warehousing.
• Experience working with Scrum/Agile processes.
• Experience with Microsoft Azure components such as DevOps and Azure Data Factory.
• Good understanding of Snowflake or a similar cloud data warehouse.
• Good understanding of modern data transformations using dbt or a similar tool.
• Hands-on experience working with Git code repositories and CI/CD tools.
• Working knowledge of insurance brokerage or sales workflows and systems preferred.


#LI-Remote 

