EY is the only major professional services firm with a dedicated financial services practice (EY FSO) integrated in the EMEIA region. An international team of over 16,000 professionals works across borders for our clients in the financial sector: Banking, Insurance, Payment Institutions and Wealth & Asset Management, across all service lines: Consulting, Tax, Transactions and Assurance. As we consider our people the heart of EY, we hire and develop the most passionate people in their field to build a better working world. This starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. So that, whenever you join, however long you stay, the exceptional EY experience lasts a lifetime.
The Opportunity
The EY Belgium Modern Data Platform team supports our clients in defining and rolling out the right data architecture, data platform and infrastructure to support their needs, implementing and maintaining automated data pipelines, and infusing data into the business through business intelligence and analytics. We are currently looking for capable and motivated professionals to join our team as Data Engineers.
Key Responsibilities
As a Data Engineer, you will:
• Support the design and roll-out of the data architecture and infrastructure required for optimal extraction, transformation and loading of data from a wide variety of data sources
• Identify data sources, design and implement data schemas/models, and integrate data to meet the requirements of business stakeholders
• Play an active role in the end-to-end delivery of AI solutions, from ideation and feasibility assessment to data preparation and industrialization.
• Work with business, IT and data stakeholders to resolve data-related technical issues, address their data infrastructure needs and build a flexible, scalable data platform.
• With a strong focus on DataOps, design, develop and deploy scalable batch and/or real-time data pipelines.
• Design, document, test and deploy ETL/ELT processes
• Find the right trade-offs between performance, reliability, scalability and cost for the data pipelines you implement
• Monitor data processing efficiency and propose solutions for improvements.
• Identify and investigate potential data quality issues and explore ways to improve data quality and reliability
• Prepare ad-hoc queries for different business stakeholders
• Integrate new data management technologies and software engineering tools into existing structures.
• Understand and use a wide variety of scripting languages and tools to integrate systems.
• Have the discipline to create and maintain comprehensive project documentation.
• Build and share knowledge with colleagues and coach junior profiles.
Skills and Attributes for Success
• Master’s degree in computer science, engineering, mathematics or another relevant subject
• 2+ years of experience in big data-related software development
• Experience with data modeling, design patterns and building highly scalable and secure data platforms and infrastructures.
• Practical experience with Python data exploration, analysis and engineering libraries.
• Familiarity with big data platforms such as appliances (e.g. Teradata), Hadoop and Spark
• Advanced knowledge of data structures and algorithms, networking, operating systems and DataOps tools.
• Ability to work with a wide variety of data types (i.e. structured and unstructured), extracting and integrating data from modern and legacy data platforms
• Meaningful experience in extracting and loading data from multiple database technologies like RDBMS (MS SQL Server, Oracle, PostgreSQL), MPP (Snowflake) and NoSQL (MongoDB, Neo4J)
• Experience with and interest in the Microsoft Azure cloud platform.
• Experience implementing pipelines using a combination of technologies, including SQL, Airflow, Python, Alteryx, Kafka and cloud data stacks
• Commercial client-facing project experience is helpful
• Ability to work collaboratively in a team environment and effectively with people at all levels in an organization
• You love digging into large datasets and tackle each new problem with a can-do, pragmatic mindset
• You have strong communication and presentation skills, and you love to take full ownership of your projects and help clients overcome their challenges.
• You are fluent in Dutch and/or French and proficient in business English
Can we dream of more?
• You have a strong business understanding of the financial sector (Banking, Insurance or Capital markets)
• You can proudly share relevant experience in coaching a team of junior Data Engineers
• You have already found your way around cloud computing services such as Azure, AWS or GCP
• You have a good overview of (big) data ecosystems and frameworks (Hadoop or others)
• You have experience with lean/agile software development
• You are familiar with NoSQL tools for exploiting unstructured information
• You are not afraid to support business development and pre-sales activities
What Working at EY Offers
• You will be part of a leading global professional services firm.
• You will be part of the EY family where everyone is willing to offer support and senior management is very accessible.
• You will join a dynamic and growing team with a great mix of young and experienced professionals focusing on financial services.
• You will get extensive training on technical matters, as well as soft skills and project management, and you will have access to new technologies and innovative equipment.
• We are proud of our flexible working arrangements, and we will support you to build a successful career and deliver excellent client service, without sacrificing your personal priorities.
• While our client-facing profession might require part-time working at client site and business traveling at times, we are committed to helping you achieve a lifestyle balance.