Tokyo, Japan
Lead, Data Platforms, DD&T, Japan Corporate Function

By clicking the “Apply” button, I understand that my employment application process with Takeda will commence and that the information I provide in my application will be processed in line with Takeda’s Privacy Notice and Terms of Use. I further attest that all information I submit in my employment application is true to the best of my knowledge.

Job Description

Lead, Data Platforms

About Takeda

“Better Health for People, Brighter Future for the World” is the purpose of our company. We aim to create a diverse and inclusive organization where people can thrive, grow, and realize their own potential while enabling our purpose. We continue to innovate and drive changes that will transform the lives of patients. We’re looking for like-minded professionals to join us.

Takeda is a global, values-based, R&D-driven biopharmaceutical leader. We are guided by our values of Takeda-ism, which have been passed down since the company’s founding. Takeda-ism incorporates Integrity, Fairness, Honesty, and Perseverance, with Integrity at the core. These values are brought to life through actions based on Patient-Trust-Reputation-Business, in that order.

The Opportunity

As a Data Platforms leader, you will play a pivotal role in shaping Takeda’s Data Platforms and Architecture vision and contributing to the company’s strategy of becoming a Data-Driven Enterprise. You will work closely with business units and functions within Takeda’s Global Business and its data teams to strategically design data, processes, and technology to accelerate the delivery of life-saving products to the market. This ultimately enables Takeda to make better decisions that enhance the quality and efficiency of patient care.

Your responsibilities will include developing data-driven solutions using current and next-generation technologies to meet evolving business needs. You will be expected to identify opportunities and propose potential technical solutions swiftly. Furthermore, you will design application systems that adhere to standard system development methodology and concepts for design, programming, backup, and recovery to deliver high-performance, reliable, and secure solutions.

As part of our transformational journey on Data & AI in Operations, we are taking steps to advance to a Data Mesh architecture.
The current Datalake gives all Operations units access to critical data and analytic tools at pace, accelerating their work on life-saving medicines. The vision of EDS also accelerates Operations’ data strategy of making our data Findable, Accessible, Interoperable, and Reusable. This is achieved by creating a distributed data architecture and managing our data and data products, which will serve as a centerpiece of this strategy and the future evolution of Data Science.

Responsibilities

- Create best practices and thought-leadership content for the federated delivery teams building data solutions and products on Enterprise Data platforms that cater to batch, streaming, and real-time data.
- Influence stakeholders at all levels through complex engagement models across the broader cloud ecosystem, including but not limited to AWS foundations for infrastructure and data technologies, Databricks, Informatica, Kafka, Managed File Transfer, and third-party applications, ensuring they are excited by the Enterprise Data Services vision and solution strategy.
- Act as a liaison to business stakeholders and, in the first 30–90 days, study the current platform architecture and its dependencies on daily operations, uncovering pain points that can be prioritized with operations support. Be an advocate for continuous improvement.
- Work closely with enterprise architects and business engagement leads to develop a holistic understanding of the evolving landscape of data platforms and how they align with business units and business functions within Japan.
- Be a ‘champion’ for customers and colleagues by operating as an expert engineer and trusted advisor for significant data analytics architecture and design, and for adoption and scaling of the Datalake platform.
- Provide a roadmap for modernizing legacy capabilities inherent to the current platform.
- Support all data platform initiatives: Data Lake strategy, data engineering and platform development, data governance, security models, and master data management.
- Establish a collaborative engineering culture based on trust, innovation, and a continuous-improvement mindset. Utilize industry best practices and agile methodologies to deliver solutions, and extract efficiency through continuous integration and deployment automation.
- Manage efforts to solve engineering challenges and coordinate with project consultants and delivery/engagement managers.
- As a leading technical contributor, consistently take a poorly defined business or technical problem, refine it into a well-defined data problem or specification, and execute it at a high level, with a strong focus on metrics for both the impact of the work and its engineering and operations.
- Understand the data platform investments, create data tools for consuming services, and uncover opportunities for cost optimization, helping the team build and optimize our platforms into an innovative unit within the company.

Skills and Qualifications

- Bachelor’s degree or higher in Computer Science/Information Technology, or relevant work experience.
- Knowledge of Enterprise Architecture.
- Business-level Japanese proficiency. (Please note that this job requires business-level Japanese language command not only in speaking, but also in business writing and reading.)
- Identify and highlight risks and issues within the project and escalate them appropriately for resolution.
- Devise effective mitigation and escalation strategies for projects to address risks and issues.
- Assist in developing one or multiple product strategies and drive product priority-setting in response to business needs, aligned with IT architecture, deployment, and release management.
- Evaluate existing business processes and find opportunities to seamlessly integrate and align a broad set of stakeholders’ perspectives.
- Support teams in transforming their ways of working, mindsets, and behaviors toward product-centricity and digital dexterity.
- Monitor portfolio progress and related milestones, identify gaps, and make strategic recommendations.
- 8+ years of relevant work experience in data platforms, solutions, and delivery methodologies.
- Deep specialty expertise in at least one of the following areas:
  - Experience scaling big data workloads that are performant and cost-effective.
  - Experience designing data solutions on cloud infrastructure and services, such as AWS, Azure, or GCP, using best practices in cloud security and networking.
- 5+ years’ experience in a customer-facing technical role with expertise in at least one of the following:
  - Data ingestion; streaming technologies like Spark Streaming and Kafka; performance tuning, troubleshooting, and debugging big data solutions.
  - Experience with ETL/orchestration tools (e.g., Informatica, Airflow).
- Advanced working SQL knowledge and experience with relational databases, authoring SQL, and working familiarity with a variety of databases.
- Experience building and optimizing big data pipelines, architectures, and data sets.
- Experience performing root-cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Working knowledge of message queuing, pub/sub stream processing, and highly scalable ‘big data’ data stores.
- Outstanding communication and relationship skills, the ability to engage a broad range of partners, and the ability to lead by influence.

Nice to Have

- Technology skills: Java, Python, Spark, Hadoop, Kafka, SQL, NoSQL, Postgres, and/or other modern programming languages, and tools such as JIRA, Git, Jenkins, Bitbucket, and Confluence.
- Familiarity with the core technology stack, including Databricks Lakehouse (Delta Lake) or an equivalent such as BigQuery/Snowflake, SQL/Python/Spark, AWS, and Prefect/Airflow.
- Experience with development tools for CI/CD (e.g., Jenkins), unit and integration testing, automation and orchestration, REST APIs, BI tools, and SQL interfaces.
- Industry experience working with public cloud environments (AWS, GCP, or Azure) and a deep understanding of failover, high availability, and high scalability.
- Data ingestion using one or more modern ETL computing and orchestration frameworks (e.g., Apache Airflow, Luigi, Spark, Apache NiFi, Flink, and Apache Beam).
- 3+ years of experience with SQL or NoSQL databases: PostgreSQL, SQL Server, Oracle, MySQL, Redis, MongoDB, Elasticsearch, Hive, HBase, Teradata, Cassandra, Amazon Redshift, Snowflake.

Takeda Compensation and Benefits Summary:

Allowances: Commutation, Housing, Overtime Work etc.

Salary Increase: Annually, Bonus Payment: Twice a year

Working Hours: Headquarters (Osaka/ Tokyo) 9:00-17:30, Production Sites (Osaka/ Yamaguchi) 8:00-16:45, (Narita) 8:30-17:15, Research Site (Kanagawa) 9:00-17:45

Holidays: Saturdays, Sundays, National Holidays, May Day, Year-End Holidays etc. (approx. 123 days in a year)

Paid Leaves: Annual Paid Leave, Special Paid Leave, Sick Leave, Family Support Leave, Maternity Leave, Childcare Leave, Family Nursing Leave.

Flexible Work Styles: Flextime, Telework

Benefits: Social Insurance, Retirement and Corporate Pension, Employee Stock Ownership Program, etc.

Important Notice concerning working conditions:

It is possible the job scope may change at the company’s discretion.

It is possible the department and workplace may change at the company’s discretion.

Locations

Tokyo, Japan

Worker Type

Employee

Worker Sub-Type

Regular

Time Type

Full time