Oakland, CA, US
Data Engineer, Expert

Requisition ID # 160693 

Job Category: Information Technology 

Job Level: Individual Contributor

Business Unit: Information Technology

Work Type: Hybrid

Job Location: Oakland

 

 

Department Overview

Information Systems Technology Services is a unified organization composed of various departments that collaborate to deliver high-quality technology solutions.

 

Position Summary

The Data Analytics and Insights team is seeking an experienced and talented Expert Data Engineer to join our growing team of analytics experts. As a key member of our team, you will play an essential role in the design, development, and maintenance of data pipelines and analytic products, including data applications, reports, and dashboards. We are looking for a proactive, detail-oriented, and motivated individual who can thrive in a fast-paced environment and help us scale our analytic product development to meet our clients' ever-evolving needs. The data engineer will collaborate with our cross-functional team, including solution architects, data pipeline engineers, data analysts, and data scientists, on mission-critical initiatives and will ensure optimal delivery of analytic products.

 

You will have a unique opportunity to be at the forefront of the utility industry and gain a comprehensive view of the nation’s most advanced smart grid. It is the perfect role for someone who would like to continue to build upon their professional experience and help advance PG&E’s sustainability goals.

 

This position is hybrid, working from your remote office and the Oakland General Office (OGO) once a month and/or based on business needs.

 

PG&E is providing the salary range that the company in good faith believes it might pay for this position at the time of the job posting. This compensation range is specific to the locality of the job. The actual salary paid to an individual will be based on multiple factors, including, but not limited to, specific skills, education, licenses or certifications, experience, market value, geographic location, and internal equity. Although we estimate the successful candidate hired into this role will be placed between the entry point and the middle of the range, the decision will be made on a case-by-case basis related to these factors. This job is also eligible to participate in PG&E's discretionary incentive compensation programs.

 

A reasonable salary range is:

 

Bay Area Minimum: $132,000

Bay Area Maximum: $226,000

 

Responsibilities

Work closely with Subject Matter Experts (SMEs) to design and develop data models, data pipelines, and front-end applications.

Implement data transformations to derive new datasets or create ontology objects necessary for business applications.

Monitor and debug critical issues such as data staleness or data quality.

Improve the performance of data pipelines (latency, resource usage).

Implement operational applications using Foundry tools (Workshop, Quiver, and Slate).

Implement data visualizations using Foundry tools (Quiver and Contour).

Maintain applications as usage grows and requirements change.

Be available for 24x7 operational support.

Minimum Qualifications

Bachelor's Degree in Computer Science, Engineering, or a related field.

7+ years of experience as a data engineer or in a similar role.

Hands-on experience building data pipelines from ingestion to final object delivery.

Experience developing analytic applications, reports, or dashboards.

Experience using no-code and low-code tools to develop analytic applications.

Strong SQL skills and experience working with large datasets and complex data structures.

Complete proficiency in Python and/or PySpark.

Experience in TypeScript (preferred) or JavaScript.

Excellent problem-solving and analytical skills with strong attention to detail.

Experience with the Palantir Foundry platform.

 

 

Preferred Qualifications

Visualization – knowledge of commercial visualization tools such as Tableau or Power BI.

Databases – familiarity with common relational database models and proprietary instantiations, such as SAP and Salesforce.

Git – knowledge of version control / collaboration workflows and best practices.

Agile – familiarity with agile and iterative working methodology and rapid user-feedback-gathering concepts.

UX design – knowledge of best practices and applications.

Data literacy – data analysis and statistical basics to ensure correctness in data aggregation and visualization.
