
Senior Data Engineer at Niks

25 March


Niks

Without experience
Kharkiv
Full-time work

Translated by Google

Become a data strategy leader for our global clients. We are looking for a Senior Data Engineer who will act as a trusted technical advisor: someone who can design complex Lakehouse architectures, defend technical decisions to stakeholders with sound reasoning, and drive presales activities.

MISSION:

Your mission is to design scalable, secure, and cost-effective data platforms that solve critical business problems. You will conduct technical audits, define best practices for the team, and build the foundational infrastructure that enables advanced analytics and the implementation of AI/GenAI capabilities for our customers.

WHAT YOU WILL DO:

  • Lead the design and implementation of scalable pipelines and Lakehouse architectures. Act as a key technical expert (Design Authority) on projects.
  • Solve the hardest technical challenges: optimizing high-load streaming pipelines, debugging complex Spark jobs, and developing reusable frameworks.
  • Participate in technical assessments, audits of existing systems, and preparation of proposals (estimations). Explain the ROI of technical modernization to customers.
  • Ensure that all solutions are production-ready: secure, covered by monitoring, cost-effective (FinOps), and documented.
  • Define coding standards, conduct code reviews, and mentor middle and junior engineers to develop the Data Engineering practice.

WHAT IS IMPORTANT TO US:

  • Python or Scala (Expert level/Design Patterns), SQL (Expert level).
  • Databricks (Unity Catalog), Snowflake, BigQuery, Synapse.
  • Spark/PySpark (Deep understanding of internal mechanisms, tuning, streaming), dbt (Enterprise patterns).
  • Data Mesh, Lakehouse (Delta Lake/Iceberg), Lambda/Kappa.
  • Advanced Terraform/IaC, CI/CD, Docker/Kubernetes.
  • Feature Stores, Vector Databases, MLOps basics.
  • Expert in Python (design patterns, library development) and SQL. Ability to optimize code written by others.
  • Deep knowledge of Databricks (Spark memory management, partitioning strategies) or Snowflake (Warehouse tuning, RBAC, Zero-Copy Cloning). You understand how these tools work "under the hood".
  • Ability to design end-to-end solutions, choose the right tools (for example, justify "Why Snowflake and not Redshift in this case?"), and defend your decisions in front of the client's management.
  • Expert knowledge of AWS, GCP or Azure. Deep understanding of networks (VPC, PrivateLink), security (IAM) and integration limits.
  • Experience participating in presales, technical audits, or discovery phases. Ability to translate business needs into technical specifications.
  • Understanding of the principles of preparing data for Machine Learning (Feature Engineering, pipelines for LLM/RAG).

Would be a plus:

  • Any experience with Kafka, Kinesis, or Spark Structured Streaming.
  • Experience with vector databases (Pinecone, pgvector, Weaviate) or frameworks like LangChain.
  • Professional cloud certifications (e.g., AWS Solutions Architect Professional, Databricks Certified Data Engineer Professional).
  • Advanced data modeling for DynamoDB, Cosmos DB or MongoDB.
