Warsaw
Cloud Data Engineer

It's fun to work in a company where people truly BELIEVE in what they're doing!

We're committed to bringing passion and customer focus to the business.

About Us

Kyriba is a global leader in liquidity performance that empowers CFOs, Treasurers and IT leaders to connect, protect, forecast and optimize their liquidity. As a secure and scalable SaaS solution, Kyriba brings intelligence and financial automation that enables companies and banks of all sizes to improve their financial performance and increase operational efficiency. Kyriba’s real-time data and AI-empowered tools empower its 3,000 customers worldwide to quantify exposures, project cash and liquidity, and take action to protect balance sheets, income statements and cash flows. Kyriba manages more than 3.5 billion bank transactions and $15 trillion in payments annually and gives customers complete visibility and actionability, so they can optimize and fully harness liquidity across the enterprise and outperform their business strategy. For more information, visit www.kyriba.com.

Kyriba is seeking a talented Mid-level Cloud Data Engineer to join our team of more than 200 engineers working together to bring innovative solutions to Kyriba clients around the globe. We are a team of passionate people motivated by agility, innovation and continuous improvement.


As a Cloud Data Engineer with at least five years of experience in Data Operations, you will be in charge of managing, improving and scaling our data pipelines. The ideal candidate will have previously shown the ability to design, implement and maintain new and existing databases, data pipelines and data warehouses, and will additionally be experienced in monitoring, alerting, trending and dashboarding of data metrics. The Cloud Data Engineer works closely with Cloud Engineering Operations, Product Developers, Product Owners and Customer Support on new architecture and feature introductions.



RESPONSIBILITIES

Create, maintain and execute improvements to our data pipelines and databases as we scale our systems to meet customer demand

Enhance and extend database, data pipeline and data warehouse performance and scalability

Identify relevant business metrics and implement real-time monitoring, alerting, trending and dashboarding

Analyze and communicate data and database problems

Participate in performance and load test efforts

Assist in creation and maintenance of the test data environments and test data

Respond to non-production and production incidents on a rotating basis

Serve on a rotating 2nd-level on-call schedule with a team of 3-4

Implement customer-facing Service Requests involving data processing


SKILLS

B.S. Degree in Computer Science or related field

2+ years' experience in Data Warehousing/Data Lakes (Snowflake, Databricks preferred)

2+ years' experience developing infrastructure-as-code (HashiCorp Terraform or equivalent)

2+ years' experience with Docker and Kubernetes (or equivalent container orchestration)

3+ years' experience with AWS, GCP, and developing CI/CD tooling

3+ years' experience with RDBMSs (Postgres, Oracle)

2+ years' experience with data pipeline technologies (Kafka, Spark, Flink)

3+ years' experience with Linux

3+ years' experience with AWS databases and data stores

3+ years' experience with observability, monitoring and alerting tools (Splunk, Wavefront, Opsgenie preferred)

Experience with data migrations from one data platform to another

Experience working in an Agile environment

Experience with automating data and db deployments/upgrades (Liquibase preferred)

Experience with Bitbucket/Jira/Confluence

Understanding of web technologies, including web services, web application servers and RESTful APIs

Experience working in a 24x7x365 SaaS environment

Experience with HIPAA/PHI/GDPR data security and privacy

English is the working language. 

Basic experience with NoSQL (DynamoDB, MongoDB preferred) is a plus

Basic understanding of Finance and Treasury Management is a plus
