San Mateo, California, USA
Data Product Operations Manager

Build the future of the AI Data Cloud. Join the Snowflake team.

Snowflake’s Data Cloud is revolutionizing how businesses discover, integrate, and use data products. Our Marketplace allows organizations to seamlessly access valuable datasets, applications, and AI models.

At the core of this ecosystem is Secure Data Sharing, which enables data providers and consumers to share and monetize data efficiently—without relying on traditional file transfers or APIs. Marketplace and Data Sharing are key to Snowflake’s vision, driving the network effect of our Data Cloud.

About the Role

As part of the Collaboration & Marketplace team, you’ll play a crucial role in Snowflake’s Data Cloud strategy. You’ll help build Snowflake’s internal content arm, transforming public-domain data (offered under the Snowflake Data Provider name) into scalable, high-quality data products that enhance our customers’ analytics capabilities.

We’re looking for a highly motivated Data Product Operations Manager to help develop and scale our suite of data products. In this role, you will act as the “Data Owner” for Snowflake’s data products, blending technical expertise with customer-facing and operational responsibilities. You’ll work at the intersection of data engineering, product development, and stakeholder collaboration, ensuring our data products are technically strong and aligned with real-world customer needs.

This is a unique opportunity to shape Snowflake’s Data Business while working with cutting-edge technology.

What You’ll Do / Responsibilities

Data Pipeline Development – Design, build, and maintain efficient, scalable, and reliable data pipelines to transform and integrate public-domain financial data sources.

Customer Support & Communication – Serve as the technical point of contact for customer inquiries via email and calls, helping users get the most value from our data products.

Operational Excellence – Ensure pipelines run smoothly and customers can access the insights they need without interruptions.

Continuous Improvement – Identify ways to improve product quality, usability, and performance based on feedback from customers and internal teams (engineering, product, and customer success).

Documentation & Standards – Write clear, comprehensive data documentation, including modeling decisions and definitions, to support both internal teams and customers.

Analytics Engineering Best Practices – Maintain high standards in data modeling, testing, version control, and code management, creating a scalable and disciplined engineering environment.

Data Source Evaluation – Assess and scope new public-domain datasets for ingestion, ensuring they meet quality and technical requirements.

Impact & Growth – Help shape the future of Snowflake’s data ecosystem, driving innovation and expansion of our internal content arm.

What You’ll Need to Succeed / Requirements

Bachelor’s degree and 10+ years of professional experience, including 2-4 years in an analytics-focused role (e.g., Data Product Analyst, Analytics Engineer, Data Engineer, or Analyst/Consultant with a strong engineering background).

Strong SQL skills (preferably Snowflake) with the ability to write efficient, optimized queries.

Proven track record of operational rigor, delivering high-quality, scalable data products.

Excellent communication skills, with experience working directly with customers to understand their needs and provide support.

Self-starter mentality – ability to learn quickly, problem-solve, and work autonomously in an evolving environment.

Business-focused mindset – prioritize delivering usable solutions quickly over building a perfect but unused model.

Strong collaboration skills, comfortable working with both internal teams and external customers.

Familiarity with modern data stack tools (e.g., Snowflake, dbt, AWS S3, Dagster or similar).

Understanding of analytics engineering best practices, including data modeling, version control, documentation, testing, and code management.

Experience in financial services or a related industry is a plus.

Bonus Skills

Experience writing high-quality data documentation for internal and external users.

Familiarity with APIs and integrating them into data pipelines.

Experience conducting analyses with public-domain financial data.

Additional Details

Hybrid Role: Based in Menlo Park starting in April (currently working from San Mateo). Some travel to NYC may be required.

This is your chance to join a fast-growing team at Snowflake and make a real impact. If you’re passionate about data, analytics, and operational excellence, we’d love to hear from you!

Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake.

How do you want to make your impact?

The following represents the expected range of compensation for this role: the estimated base salary range is $139,000 - $205,800. Additionally, this role is eligible to participate in Snowflake’s bonus and equity plan.

The successful candidate’s starting salary will be determined based on permissible, non-discriminatory factors such as skills, experience, and geographic location. This role is also eligible for a competitive benefits package that includes: medical, dental, vision, life, and disability insurance; 401(k) retirement plan; flexible spending & health savings account; at least 12 paid holidays; paid time off; parental leave; employee assistance program; and other company benefits.
