San Francisco, CA, 94103, USA
(INTL) Sr. Technical Curriculum Developer
Job Description
- Design and implement data models tailored to specific business needs within the Databricks Lakehouse environment
- Differentiate between different types of modeling techniques and understand their respective use cases
- Analyze business needs to determine data modeling decisions
- Design logical and physical data models for specific use cases
- Explore the stages of the data product lifecycle
- Understand the definition and use cases of data products
- Understand the data product lifecycle
- Organize data in domains and in Unity Catalog
- Utilize Delta Lake and Unity Catalog to define data architectures
- Explore data integration and secure data sharing techniques
- Use your data engineering expertise to lead the design, development, and maintenance of multi-modal training content for Databricks customers, partners, and internal staff using the latest versions of Databricks features for data engineering
- Work with subject-matter experts and learning architects to scope needs for new courses and updates to existing ones
- Support instructors in adopting new materials and updates by creating instructor guides and running train-the-trainer sessions
- Function as a company-wide thought leader and subject-matter expert on data engineering concepts at Databricks

Pay Rate: $25-30/hr

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com.

To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.

Skills and Requirements
- Experience with topics related to big data engineering, such as data streaming, pipeline optimization, deployment, and data security
- Experience with the Python software development lifecycle (Git, testing, modularization, etc.)
- Strong proficiency in Python and SQL
- Able to learn new technologies quickly
- Strong analytical skills and data-driven decision-making
- Familiarity with at least one common cloud provider (AWS, Azure, or Google Cloud)
- Experience designing and developing highly technical training based on instructional design best practices