Let's Write Africa's Story Together!
Old Mutual is a firm believer in the African opportunity and our diverse talent reflects this.
Job Description
ROLE OVERVIEW
Build, enhance, and maintain our real-time data pipelines, and work with various infrastructure and operations teams to maintain our data infrastructure. As a senior engineer on the Data Streaming Platform team, you will be responsible for developing and maintaining data ingestion pipelines vital to the continued growth of the bank. You will collaborate with many teams across the bank to understand data needs and turn them into platforms and services, monitor and maintain the health of the data streams, and shape the data ecosystem within the bank.
Develop data products and data warehouse solutions in on-premises and cloud environments using cloud-based services, platforms, and technologies. We are looking for a dynamic and results-driven Lead Stream & Data Engineer with extensive experience in designing, developing, and deploying high-performance, reusable streaming applications using Apache Kafka, Apache Flink, and Java.
The ideal candidate has proven expertise in building scalable data pipelines and real-time processing systems that enhance operational efficiency and drive business insights, with a strong background in microservices architecture, cloud technologies, and agile methodologies.
KEY RESULT AREAS
Operational Delivery
· Build, enhance, and maintain our real-time data pipeline
· Work with various infrastructure and operations teams to maintain our data infrastructure
· Be self-driven in identifying and documenting feature gaps, and designing and implementing solutions to them
· Help build, modernize, and maintain services and tooling to ensure resiliency, fix data discrepancies, and enhance the customer experience
· Monitor daily execution, diagnose and log issues, and fix pipelines to ensure SLAs are met with internal stakeholders
· Mentor other engineers and help them grow through code reviews and guidance on best practices, leveraging your experience in the field
Technical Leadership
· Participate in the engineering and other disciplines' communities of practice
· Share AWS knowledge and practical experience with the community
· Challenge and contribute to the development of architectural principles and patterns
Compliance
· Ensure solutions adhere to Olympus patterns, guidelines and standards
· Operate within project environments and participate in continuous improvement efforts
Delivery Management
· Follow and participate in defined ways of work including, but not limited to, sprint planning, backlog grooming, retrospectives, demos and PI planning
EXPERIENCE
· Experience of developing solutions in the cloud
· At least 2-3 years' experience designing and developing streaming data pipelines for data ingestion or transformation using AWS technologies
· Experience with distributed log systems, such as Kafka and AWS Kinesis
· Redpanda or Confluent experience advantageous
· Programming Languages: Proficient in Java (Java SE 8/11) and Python 3, with a solid understanding of object-oriented programming principles.
· Streaming Technologies: Extensive experience with Apache Kafka, including the Kafka Streams API for real-time data processing, producer/consumer development, and stream management, as well as expertise in Apache Flink (see the illustrative sketch after this list).
· Desirable skills: Decodable, Kubernetes (K8s), Data Vault, data warehouses and data modelling.
· Frameworks: Deep knowledge of Spring Boot for building RESTful services and microservices architectures; adept at using Spring Cloud for distributed systems.
· Database Management: Skilled in integrating various databases (e.g., NoSQL) with streaming applications to ensure efficient data storage and retrieval.
· Cloud Platforms: Hands-on experience deploying applications on AWS, utilizing services such as EC2, S3, RDS, and Lambda for serverless architectures.
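For illustration only, the short Java sketch below shows the kind of Kafka Streams topology this role works with: it reads from one topic, filters records, and writes to another. The topic names, application id, and broker address are hypothetical placeholders and are not part of this role specification.

// Minimal, illustrative Kafka Streams topology (hypothetical topics and config).
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class PaymentsStreamApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payments-stream-app"); // hypothetical application id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // hypothetical broker address
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read raw events, drop empty records, and route the rest to an output topic.
        KStream<String, String> raw = builder.stream("payments-raw");
        raw.filter((key, value) -> value != null && !value.isEmpty())
           .to("payments-enriched");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Close the topology cleanly on shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}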
Technical Leadership
· Provide technical leadership and mentorship
· Develop and monitor data engineering standards and principles
· Lead technical delivery within teams and provide oversight of solutions
EXPERIENCE
· Experience of developing solutions in the cloud
· At least 5 years' experience designing and developing data pipelines for data ingestion or transformation using AWS technologies
· Experience in developing data warehouses and data marts
· Experience in Data Vault and Dimensional modelling techniques
· Experience working in a high availability DataOps environment
· Proficiency in AWS services related to data engineering, such as AWS Glue, Athena, and EMR. Strong programming skills in languages like Python and Java.
· Experience in implementing CI/CD pipelines
· GitHub
· Design and implementation of scalable streaming architectures using technologies such as Apache Kafka, Apache Flink, or AWS Kinesis to handle real-time data ingestion and processing (see the illustrative sketch below).
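As a companion illustration of the streaming-architecture experience listed above, the sketch below shows a minimal Apache Flink job in Java that ingests a Kafka topic via the Flink Kafka connector. The topic, consumer group, and broker address are hypothetical, and the filter and print sink are placeholders for real transformations and sinks (e.g. S3, a warehouse, or another topic).

// Minimal, illustrative Flink DataStream job (hypothetical topic and connection details).
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class TransactionsIngestJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")     // hypothetical broker address
                .setTopics("transactions")                 // hypothetical topic
                .setGroupId("transactions-ingest")         // hypothetical consumer group
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> transactions =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-transactions");

        // Placeholder processing: drop empty records and print.
        // A real pipeline would enrich, window, or aggregate here and write to a proper sink.
        transactions.filter(value -> value != null && !value.isEmpty())
                    .print();

        env.execute("transactions-ingest-job");
    }
}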
QUALIFICATIONS
· Bachelor’s Degree in Computer Science or similar fields like Information Systems, Big Data, etc.
· AWS Data Engineer Certification would be advantageous
· Related technical certifications
Designs, builds, and maintains the infrastructure that supports data storage, processing, and retrieval. Works with large data sets and develops data pipelines that move data from source systems to data warehouses and other data storage and processing systems. OML roles mapped to this profile are: Data Engineer and Technical Lead: Back-end.
Responsibilities
· Data Architecture: Design, implement, and evolve robust and effectively aligned architecture solutions that operate in the business ecosystem.
· Data Management: Manage key aspects of the data management system. This includes being responsible for developing or operating key elements of the system.
· Business Requirements Identification: Elicit complex business requirements using a variety of methods, such as interviews, document analysis, workshops, and workflow analysis, to express the requirements in terms of target user roles and goals.
· Databases Installation: Install and test the most complex databases and associated products to ensure they are suitable for use and meet customer requirements.
· Infrastructure and Network Development and Maintenance: Design and select business-critical storage, data center, and client/server environments in line with industry best practice, and provide a third-line point of escalation for appropriate global infrastructure solutions.
· Information Security: Lead in detecting and analyzing security incidents, including attacks, breaches, and identified vulnerabilities, and remediate any security gaps in line with the security incident management procedure.
· Documentation: Create and maintain complex technical and/or user documentation to a high standard.
· Technical Developments Recommendation: Discuss and recommend more complex or innovative technical developments to improve the quality of the website/portal/application software and supporting infrastructure to better meet users' needs.
· Analysis of "As Is" and "To Be": Document "as is" and "to be" processes and describe the changes required to migrate to the "to be" capability, so that the required change is recorded accurately.
· Operational Compliance: Maintain and renew a deep knowledge and understanding of the organization's policies and procedures and of relevant regulatory codes and codes of conduct, and ensure own work adheres to required standards; identify, within the team, patterns of non-compliance with these policies, procedures, and codes, taking appropriate action to report and resolve them and escalating issues as appropriate.
· Data Software Development: Develop existing and new data applications by analyzing and identifying areas for modification and improvement. Develop new applications to meet customer requirements.
· Data Software Roadmap: Define and maintain a road map to facilitate data software development and ensure the development work is prioritized in line with business requirements.
· Data Software Maintenance: Monitor, identify, and correct the most complex software defects to maintain fully functioning applications software.
· Design and Conceptualization: Work effectively with cross-functional teams to conceptualize products and services, leveraging data to drive original design ideas and decisions.
Skills
Action Planning, Business Requirements Analysis, Computer Literacy, Database Administration, Database Reporting, Data Compilation, Data Controls, Data Management, Data Modeling, Executing Plans, Gaps Analysis, Information Technology (IT) Support, IT Architecture, IT Implementation, IT Network Security, Market Analysis, Test Case Management, User Requirements Documentation
Competencies
Action Oriented, Business Insight, Cultivates Innovation, Drives Results, Ensures Accountability, Manages Complexity, Optimizes Work Processes, Persuades
Education
NQF Level 9 – Masters
Closing Date
21 January 2025, 23:59
The appointment will be made from the designated group in line with the Employment Equity Plan of Old Mutual South Africa and the specific business unit in question.
Old Mutual Limited is pro-vaccination and encourages its workforce to be fully vaccinated against Covid-19.
All prospective employees are required to disclose their vaccination status as part of the recruitment process.
Please refer to Old Mutual's Covid-19 vaccination policy for further detail. Kindly note that Old Mutual reserves the right to reinstate the requirement to vaccinate at any point if it is of the view that it is imperative to do so.
The Old Mutual Story!