Main duties:
- Building, optimizing, and maintaining ETL/ELT pipelines that process data from various sources (APIs, files, manually entered data);
- Implementing data collection, cleaning, normalization, transformation, and enrichment processes;
- Integrating with Microsoft Azure cloud services (Functions, Blob Storage, Logic Apps, etc.);
- Deploying and maintaining pipelines in a cloud environment (CI/CD, logging, monitoring);
- Working with PostgreSQL: schema design, query optimization, indexing;
- Preparing structured datasets for the BI team and building reports in Power BI;
- Collaborating with analysts and the product team to ensure data availability and create data products;
- Diagnosing and fixing errors in data processing logic to keep pipelines stable;
- Participating in the design of the data storage architecture and making recommendations on database structure;
- Documenting pipeline structure, data sources, database schemas, and data processing business logic.
Professional skills:
- 3+ years of experience as a Data Engineer or Back-End Engineer with a focus on working with data;
- Practical experience in developing ETL/ELT processes;
- Solid knowledge of PostgreSQL or another relational DBMS: designing structures, writing queries, and optimizing them;
- Understanding of and hands-on experience with cloud services (AWS, Azure, GCP);
- Knowledge of Python or another language for writing data processing scripts;
- Experience with BI tools (Power BI, Tableau): preparing datasets and metrics;
- Experience with GitHub, GitLab, and task management systems;
- Understanding of the principles of building reliable, scalable data pipelines.
Personal skills:
- Ability to work both in a team and independently;
- Desire to constantly learn and improve your skills;
- Ability to adapt quickly and make informed decisions promptly;
- Organization, responsibility, and resilience under stress;
- Discipline and absence of harmful habits (alcohol, drug, or gaming addiction, etc.).
Send your resume or call the official number to schedule an interview.