"Ukrposhta" is a national postal operator, a reliable and responsible employer. > service and convenience for the Client. A team that is always ready for new challenges and important missions!Right now we are looking for ETL/DWH DeveloperMain responsibilities:Create, maintain and optimize ETL -pipelines for integrating data from different sources into a data warehouse (DWH)Ensuring cleanliness, transformation and loading of data according to business requirementsResearch and understanding of dat
"Ukrposhta" is a national postal operator, a reliable and responsible employer.
> service and convenience for the Client. A team that is always ready for new challenges and important missions!
Right now we are looking for an ETL/DWH Developer.
Main responsibilities:
- Create, maintain, and optimize ETL pipelines that integrate data from different sources into the data warehouse (DWH)
- Ensure data cleansing, transformation, and loading in line with business requirements
- Research and understand data sources (SQL, APIs, files, data streams)
- Work with large volumes of data to minimize process execution time
- Set up and support scheduled ETL tasks (for example, in Cron, Apache Airflow, or Talend Scheduler); a sketch follows this list
- Automate checks of data correctness and integrity
- Create mechanisms for tracking and monitoring data quality
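As an illustration of the scheduling item above, here is a minimal sketch of a daily ETL task defined in Apache Airflow (assuming version 2.4 or newer), one of the schedulers named in the list. The DAG id, task logic, and schedule are hypothetical placeholders, not an actual Ukrposhta pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_transform_load():
    # Placeholder: pull rows from a source system, clean and transform them,
    # then load them into the warehouse. A real task would use database hooks
    # or an ETL framework instead of this stub.
    pass


with DAG(
    dag_id="example_daily_etl",       # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # run once per day
    catchup=False,
) as dag:
    PythonOperator(
        task_id="extract_transform_load",
        python_callable=extract_transform_load,
    )
```

The same job could equally be expressed as a Cron entry or a Talend schedule; Airflow is shown here only because it appears in the list above.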
The successful candidate meets the following requirements:
- Education: higher technical education (preferably in computer science or a related field) and at least 3 years of experience in information technology
- Understanding of business processes: knowledge of how data is used to make business decisions
- Data modeling: knowledge of OLAP/OLTP and construction of multidimensional data models
- Data validation and quality: tools and approaches for checking data correctness (see the sketch after this list)
- Organization of ETL processes: working with task schedulers: Cron, Apache Airflow, or others
- Experience with ETL: at least 2 years
- Development of DWH projects: hands-on experience implementing a data warehouse
- Process automation: creating scripts/packages for ETL automation
- ETL tools and platforms: Pentaho Data Integration (PDI), Talend, Informatica, Apache NiFi, SSIS, or others
- Working with databases: strong SQL skills for writing complex queries, stored procedures, and triggers
- Databases: Oracle, PostgreSQL, MySQL, Microsoft SQL Server or others
- Knowledge of DWH (Data Warehouse) concepts such as star and snowflake schemas
- Scripting languages: Python, Shell, Bash
- Programming languages: Java, Scala (for big data or complex transformations)
- Big data integration: Hadoop, Spark, Kafka (or similar streaming platforms)
- Performance optimization: tuning and optimizing ETL processes and high-volume data workloads (performance tuning)
- Visualization tools: Tableau, Power BI, Metabase (optional, for presenting data to the business)
- API integration: working with REST/SOAP APIs for data import/export
- Working with infrastructure: Docker and Kubernetes for deploying solutions; CI/CD tools (Jenkins, GitLab CI)
- Attention to detail: especially important when working with large volumes of data; teamwork: collaboration with DBAs, data analysts, and BI developers
- Communication: ability to clearly explain technical solutions to non-technical colleagues; quick learning of new tools and approaches
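To make the data-validation and schema items above concrete, here is a minimal sketch of an automated data-quality check against a small star schema (a fact table joined to a dimension table). SQLite is used only so the snippet is self-contained and runnable; the table and column names (fact_shipments, dim_client, weight_kg) are hypothetical, not Ukrposhta's actual warehouse schema. In practice the same queries would run against Oracle, PostgreSQL, MySQL, or SQL Server.

```python
import sqlite3


def run_quality_checks(conn: sqlite3.Connection) -> dict:
    """Run simple correctness and integrity checks and return their counts."""
    checks = {
        # Volume: the loaded batch should not be empty.
        "row_count": "SELECT COUNT(*) FROM fact_shipments",
        # Integrity: every fact row must reference a known client dimension key.
        "orphan_client_keys": """
            SELECT COUNT(*) FROM fact_shipments f
            LEFT JOIN dim_client c ON f.client_id = c.client_id
            WHERE c.client_id IS NULL
        """,
        # Correctness: shipment weights must be positive.
        "non_positive_weights": "SELECT COUNT(*) FROM fact_shipments WHERE weight_kg <= 0",
    }
    cur = conn.cursor()
    return {name: cur.execute(sql).fetchone()[0] for name, sql in checks.items()}


if __name__ == "__main__":
    # In-memory demo data; a scheduled job would connect to the real warehouse
    # and push these counts into a monitoring or alerting system.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE dim_client (client_id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE fact_shipments (shipment_id INTEGER, client_id INTEGER, weight_kg REAL);
        INSERT INTO dim_client VALUES (1, 'Example Client');
        INSERT INTO fact_shipments VALUES (10, 1, 2.5), (11, 2, -1.0);
    """)
    print(run_quality_checks(conn))  # {'row_count': 2, 'orphan_client_keys': 1, 'non_positive_weights': 1}
```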
We offer:
- Official employment in a large, growing company
- Stable salary
- Bonus for meeting planned performance targets
- Training and professional development
- Career growth opportunities
Attach and send your resume using the link below.
TOGETHER TO VICTORY WITH UKRPOSHTA!