Hyderabad, India
6 days ago
Data Engineering - SMTS - Hyderabad

To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts.

Job Category

Software Engineering

Job Details

About Salesforce

We’re Salesforce, the Customer Company, inspiring the future of business with AI + Data + CRM. Leading with our core values, we help companies across every industry blaze new trails and connect with customers in a whole new way. And, we empower you to be a Trailblazer, too — driving your performance and career growth, charting new paths, and improving the state of the world. If you believe in business as the greatest platform for change and in companies doing well and doing good – you’ve come to the right place.

The Salesforce Cloud Economics and Capacity Management (CECM) team is looking for a Lead Engineer with experience across distributed systems to join us!

You will work cross-functionally with engineers, architects, and product managers to build the breakthrough features that our internal customers will love, adopt, and use, while ensuring stable and scalable applications. You'll be part of a modern, lean, self-governing product engineering team where you can switch hats among coding, requirements gathering, and testing for quality and performance.

CECM develops intelligent, data-driven tools that enable strategic decision-making about Salesforce infrastructure expenditure and capacity management. We are building a platform that provides near real-time monitoring of infrastructure cost and capacity utilization, helping to optimize resource allocation and minimize costs. We apply advanced machine learning techniques to turn the petabytes of data generated by our global infrastructure into actionable predictions and business insights used daily by capacity planners, internal service owners, and technical leaders. As an internal tooling team, our engineers interact directly with customers to develop requirements and to design, release, and maintain distributed systems with visibility throughout Salesforce.

This is a fantastic opportunity for someone who is passionate about building scalable, resilient, distributed systems that collect, process, and analyze massive volumes of operational data. The ideal skill set includes strong data architecture, ETL, and SQL expertise; a proven track record of building automated data pipelines around enterprise metrics; and deep proficiency in a big data tech stack such as Spark, Trino, Hive, and Airflow.
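
For illustration only, and not part of the role requirements: a minimal sketch of the kind of automated pipeline described above, assuming a hypothetical hourly Airflow DAG that runs a PySpark aggregation over infrastructure cost events. All names, schemas, and paths below are assumptions made for the example, not actual CECM pipelines.

# Illustrative sketch only: hypothetical hourly roll-up of infrastructure
# cost events by service, orchestrated with Airflow and computed with PySpark.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def aggregate_costs():
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("cost_rollup_hourly").getOrCreate()
    # Hypothetical input: raw cost events with columns service, event_time, cost_usd.
    events = spark.read.parquet("s3://example-bucket/raw_cost_events/")
    hourly = (
        events
        .groupBy("service", F.window("event_time", "1 hour").alias("hour"))
        .agg(F.sum("cost_usd").alias("cost_usd"))
    )
    # Hypothetical output consumed by capacity-planning dashboards.
    hourly.write.mode("overwrite").parquet("s3://example-bucket/hourly_cost_by_service/")
    spark.stop()


with DAG(
    dag_id="cost_rollup_hourly",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    PythonOperator(task_id="aggregate_costs", python_callable=aggregate_costs)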

Responsibilities

* Develop, automate, enhance, and maintain scalable ETL pipelines.

* Independently design and develop resilient, reusable data automation frameworks.

* Own end-to-end delivery, including performance tuning, application monitoring, log analysis, and system operations.

* Evaluate production issues, determine their root cause, and resolve them.

* Work with internal team members and external partners to support data collection and analysis and to understand reporting needs.

* Collaborate with global teams across AMER and APAC.

Required Skills/Experience

* Bachelor's degree in Computer Science.

* 5+ years of experience in data engineering, data modelling, automation, and analytics.

* Deep understanding of data engineering tech stack, database designs, associated tools, system components, internal processes and architecture.

* Experience working as a technical lead/solution architect in a customer-focused team.

* Must be able to communicate status strategically and identify risks.

* Self-starter, highly motivated, able to shift directions quickly when priorities change, think through problems to come up with innovative solutions and deliver against tight deadlines.

* Must be results-oriented and able to move forward with minimal direction.

* Hands-on expertise in building scalable data pipelines using best practices in data modelling and ETL processes.

* Experience with distributed SQL analytics engines such as Spark and Trino.

* Hands-on experience with big data technologies such as Spark, Trino, Hive, and Airflow.

* Experience working with Agile and Scrum methodologies, incremental delivery, and CI/CD.

* Experience with cloud provider platforms such as AWS, Azure, and GCP.

* Experience working with globally distributed engineering teams.

* Experience with data visualization tools like Tableau is a plus.

* Experience with full stack development is a plus.

Accommodations

If you require assistance due to a disability when applying for open positions, please submit a request via this Accommodations Request Form.

Posting Statement

At Salesforce we believe that the business of business is to improve the state of our world. Each of us has a responsibility to drive Equality in our communities and workplaces. We are committed to creating a workforce that reflects society through inclusive programs and initiatives such as equal pay, employee resource groups, inclusive benefits, and more. Learn more about Equality at www.equality.com and explore our company benefits at www.salesforcebenefits.com.

Salesforce is an Equal Employment Opportunity and Affirmative Action Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender perception or identity, national origin, age, marital status, protected veteran status, or disability status. Salesforce does not accept unsolicited headhunter and agency resumes. Salesforce will not pay any third-party agency or company that does not have a signed agreement with Salesforce.

Salesforce welcomes all.