Middle Data Engineer at Niks

25 March

No experience required
Kharkiv
Full-time work

We are looking for an engineer who not only writes ETL, but also builds data infrastructure: from data lakes and cloud platforms to streaming systems and scalable pipelines.

MISSION:

To build and optimize the data "foundation" of our clients' businesses. Your mission goes beyond simply moving data: you will ensure pipelines are sustainable, scalable, and cost-effective (FinOps). You will be the bridge between infrastructure and data, taking responsibility for the technical implementation and helping to shape the architecture of modern data platforms.

WHAT YOU WILL DO:

  • Design and implement pipelines to load, clean and transform data into Data Lakes and Data Warehouses.
  • Actively monitor and optimize pipelines to improve performance and reduce costs. You understand the difference between a query that "just works" and an efficient query.
  • Configure deployment pipelines and manage environment parameters using IaC (Terraform) and CI/CD tools.
  • Implement data quality checks (e.g. Great Expectations, dbt tests) and ensure data integrity.
  • Communicate with stakeholders to clarify requirements, provide technical advice, and conduct code reviews for junior engineers.
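The point about the difference between a query that "just works" and an efficient one can be made concrete with a small, self-contained sketch (table and index names are invented for the example; the same principle applies in Snowflake, BigQuery, or any warehouse):

```python
import sqlite3

# The same query, before and after adding an index on the filtered column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, user_id INTEGER, payload TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i, i % 100, "x") for i in range(10_000)],
)

query = "SELECT * FROM events WHERE user_id = 7"

# Without an index, the plan is a full table scan: every row is examined.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

conn.execute("CREATE INDEX idx_events_user ON events (user_id)")

# With the index, the engine seeks straight to the matching rows.
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before[0][-1])  # plan detail mentions a scan of the whole table
print(after[0][-1])   # plan detail mentions the idx_events_user index
```

Both queries return identical results; only the amount of work differs, which is exactly the gap between "works" and "effective" that matters for cost optimization.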

WHAT IS IMPORTANT TO US:

  • Python or Scala (Advanced/OOP), SQL (Advanced).
  • Databricks, Snowflake, BigQuery, Synapse, Redshift.
  • Azure Data Factory (ADF), Airflow, Dagster.
  • Experience with Databricks (cluster management, notebooks), Azure Data Factory (pipelines, data flows) or Snowflake.
  • Proficient knowledge of Python (OOP, functional programming) and strong SQL skills for complex transformations.
  • Practical experience with at least one major cloud platform (AWS, GCP, Azure) and its data services (e.g. Kinesis/Lambda, Dataflow/BigQuery, Synapse).
  • Basic understanding of Infrastructure as Code (Terraform). The ability to independently deploy and modify infrastructure resources.
  • The ability to communicate and argue technical solutions directly to customers and the team.
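As a concrete reading of the "Python (OOP) plus strong SQL for complex transformations" requirement, here is a minimal sketch of a step-based transform chain; all class and function names are invented for illustration:

```python
from dataclasses import dataclass
from typing import Callable

Rows = list[dict]

@dataclass
class Pipeline:
    """A minimal transform chain: each step takes rows and returns rows."""
    steps: list[Callable[[Rows], Rows]]

    def run(self, rows: Rows) -> Rows:
        for step in self.steps:
            rows = step(rows)
        return rows

def drop_incomplete(rows: Rows) -> Rows:
    # Remove rows containing any null value.
    return [r for r in rows if all(v is not None for v in r.values())]

def normalize_status(rows: Rows) -> Rows:
    # Lower-case the status field so downstream joins are consistent.
    return [{**r, "status": r["status"].lower()} for r in rows]

pipeline = Pipeline(steps=[drop_incomplete, normalize_status])
clean = pipeline.run(
    [{"id": 1, "status": "ACTIVE"}, {"id": None, "status": "STALE"}]
)
# clean == [{"id": 1, "status": "active"}]
```

Real pipelines would run these steps in Airflow, Dagster, or ADF, but the structure — small, composable, individually testable transforms — is the same idea at any scale.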

WILL BE A PLUS:

  • Experience in setting up complex modules in Terraform or Pulumi.
  • Practical experience with Delta Lake or Apache Iceberg features (Time Travel, Schema Evolution).
  • Data modeling experience for NoSQL databases (MongoDB, DynamoDB).

WHAT YOU WILL GET:

  • Competitive compensation according to experience and expertise.
  • Influence technology choices, architecture and priorities.
  • Strong environment of engineers, architects and analysts.
  • Modern stack: Spark, Kafka, Snowflake, Databricks, Airflow, dbt, Docker.
  • Career track with transparent reviews and growth plan.
  • Best practices: code review, CI/CD, pipeline testing.
  • Challenges: big data, real-time streams, complex integrations.
  • Flexibility: remote-first and work-life balance.
  • Rotation between projects and domains.
  • Development: certifications, learning platforms, knowledge sharing.
  • International reach: global clients and multinational teams.


Join NIX — and help us build data architectures that define the future of business.
