RecruitGarden
Do you want to join a US fintech product with a Ukrainian development team and a direct B2B contract? We are looking for a Data Engineer with Airflow and Spark experience to drive innovation in big-data processing for the banking sector.
Main Responsibilities
– manage data quality and integrity
– help build tools and technology that ensure downstream customers can trust the data they consume
– maintain and refactor existing pipelines, and create new ones
– troubleshoot and optimize business-critical pipelines
– write SQL queries that meet product requirements and performance targets
Mandatory Requirements
– 3+ years of experience in data engineering creating or managing end-to-end data pipelines on large complex datasets.
– proficiency in PySpark or Spark
– expertise in Python
– fluency with MS SQL or other relational databases
– English level: Upper-Intermediate or higher
– deep expertise with Airflow
We offer
– work on an actively developing product
– friendly team
– salary above market
– great challenge and great opportunities