Bucharest - Dacia One, Romania
ETL Developer | Wholesale Banking Data Lake @ING Hubs Romania

Discover ING Hubs Romania

ING Hubs Romania offers 130 services in software development, data management, non-financial risk & compliance, audit, and retail operations to 24 ING units worldwide, with the help of over 1700 high-performing engineers, risk, and operations professionals.

We started out in 2015 as ING’s software development hub – a distinct entity from ING Bank Romania – then steadily expanded our range to include more services and competencies.

Now we provide borderless services with bank-wide capabilities and operate from two locations: Bucharest and Cluj-Napoca.

Our tech capabilities remain the core of our business, with more than 1500 colleagues active in Data Management, Touchpoint Channels & Integration, Core Banking, and Global Products.

We enjoy a flexible way of working and a highly collaborative environment, where fair and constructive feedback is encouraged.

For us, impact isn't a perk. It's the driver of our work.

We are guided and rewarded by a shared desire to make the world a better place, one innovative solution at a time. Our colleagues make it their job to do impactful things and they love doing it in good company. Do you?

The Mission

Wholesale Banking Data Lake is currently transitioning from a traditional data warehousing landscape to a big data platform that processes millions of files containing terabytes of data, with all the necessary (security) controls, monitoring, and automation in place to make the solution scalable and in line with the bank's minimum standards. Within our environment, key critical processes run daily in support of strategic, bank-wide ING initiatives used worldwide, from Russia to the United States to Australia and across Europe.

To assist us in taking the next step in our data platform journey, we are looking for colleagues who have the following key characteristics:

An extensive, proven technical background in data warehousing, with a good understanding of its core elements (data ingestion, data unification/modeling, and data consumption), as well as an interest in data governance topics such as data quality, data lineage, and metadata management.

The ability to identify structural, and preferably automated, solutions to common technical challenges by taking the lead in drafting designs, refining the workload, and guiding the implementation towards production, together with an agile squad of DevOps engineers with a balance of junior, mid-level, and senior profiles.

The ability and the ambition to learn the new (open-source) tools and technologies that will be part of our new platform, such as Hive, Spark, HDFS, and NiFi, in order to assist with the gradual but steady migration of flows from the current data warehousing stack to the new big data platform, as well as with new functionality such as streaming data ingestion, Spark job execution, and open metadata (Egeria) standards.

Your day to day

Whether you start your day from the comfort of your home or over a morning coffee in our office’s garden, your tasks will be much the same. Here are your daily responsibilities:

- Develop ETL flows in IBM InfoSphere DataStage based on functional requirements coming from the business teams
- Analyze data, then design and build physical models at the various stages of the data pipeline: ingesting data sources, mapping them into the data warehouse, and creating data marts for consumption
- Manage data using Hadoop/Hive and Oracle Exadata as the main databases, following performance standards
- Handle end-to-end deployment, using Microsoft Azure DevOps to build the packages and migrate the code to the appropriate environments
- Perform unit tests, end-to-end tests, and regression tests to ensure quality code delivery
- Use Linux as the operating and file transfer system for the incoming/outgoing data files
- Maintain the applications running in our Data Lake and monitor the team’s deliverables through incident management and bug fixing
- Uphold the operational controls that are part of the bank’s risk appetite
- Participate in Agile Scrum ceremonies
- Collaborate actively in a team of 6-8 members, consisting of 3-4 other DevOps engineers and 2-3 Business Analysts
- Keep yourself up to date with the latest industry standards and trends in the area of Data Management

What you’ll bring to the team

Experience:
- Experience with ETL processes and tools, preferably IBM InfoSphere DataStage
- Experience with shell scripting
- Good knowledge of Oracle Database (database design, database optimization, and data analysis on large amounts of data)
- Ability to translate functional and non-functional requirements (e.g. security, high availability, scalability, integration)
- Experience with pair programming

Tech stack/knowledge:
- Mandatory: Oracle Database, shell scripting, ETL tooling
- Nice to have: Python, Kafka, Hadoop/Hive

Soft skills:
- Excellent written and spoken English (advanced)

Education:
- Nice to have: a Bachelor’s degree (or higher) in an IT-related field

If you want to take a deep dive into how ING Hubs Romania processes personal data during the recruitment process, and into your rights related to it, read the privacy notices on our website (make sure to scroll until you reach the Data Protection section / Candidates tab).
