COMFY is a leading omnichannel retailer of household appliances and electronics in Ukraine, with 115 stores and one of the top three e-commerce platforms in the country.
We are looking for a Lead Data Engineer who will be responsible for creating and scaling the company's data infrastructure that supports all business analytics, ML, and AI solutions.
The focus is on platform-level data architecture: how data is collected, moved, transformed, and turned into a reliable basis for making business decisions in real time.
This is a role for an engineer who builds not just individual pipelines, but a system where:
- batch and streaming work as a whole;
- data is available quickly, reliably, and predictably;
- analytics and ML can scale without reworking the foundation.
The key value of the role is to transform the data infrastructure from "technical support" into a critical part of the business operating model.
Main tasks:
- Data platform construction: design and implementation of the DWH/Lakehouse, batch and streaming data processing; development of scalable pipelines and real-time integrations (orders, stock, pricing); ensuring high availability, fault tolerance, and performance
- Implementation of a data catalog, data quality checks, data lineage, logging, and alerting; guaranteeing a single source of truth
- Technical leadership of the Data Engineering team: team development, architectural decisions, code review, mentoring, and task prioritization
- Collaboration with the Analytics CTO, BI, and Data Science teams: fast access to data and ML features; optimization of cloud costs (storage and compute)
We expect:
- 6+ years of experience in Data Engineering
- Mandatory experience building a DWH / Lakehouse and scaling data pipelines
- Technical skills: SQL, Python; ETL / ELT: Airflow, dbt, Kafka, Spark; DWH: BigQuery / Snowflake / Redshift; Cloud: GCP / AWS / Azure; Git, CI/CD, Terraform (preferred)
- Experience in retail, e-commerce, or marketplaces will be an advantage
- Systems thinking, end-to-end responsibility, transparent communication with the business, and a focus on stability
We offer:
- An influential role in the development of the data infrastructure
- Working with large volumes of data and real business cases
- A strong team and support from the business side
- Competitive compensation and development opportunities