
Middle Data Engineer (DP) at Sigma Software

August 3, 2024

Middle Data Engineer (DP)

Lviv, Ivano-Frankivsk, Lutsk, Ternopil, Cherkasy, Chernivtsi, Warsaw (Poland), Kraków (Poland), Prague (Czech Republic), remote

We are seeking a motivated and self-driven individual to join our dynamic team. Our team highly values employees’ freedom, independence in decision-making, and the desire to deeply understand clients’ requests and identify the root of the problem. We believe that people who embrace this mindset are destined to succeed as acclaimed professionals and driving forces in data development in Ukraine.

CUSTOMER
Our Client is a community-powered fashion marketplace with over 30 million registered users across more than 150 countries. It’s a platform for discovering and celebrating personal style while promoting sustainable fashion by extending the life of millions of garments. Founded in 2011 and headquartered in London, with offices in Manchester and New York, our Client employs around 400 people. In 2021, it became a wholly owned subsidiary of Etsy but continues to operate independently. The company is committed to diversity, inclusion, and fair recruitment processes, supporting visa sponsorship for certain roles and skill sets.

Job Description

  • Contributing to new technology investigations and complex solution design, supporting a culture of innovation by considering matters of security, scalability, and reliability, with a focus on building out our ETL processes
  • Working with a modern data stack, coming up with well-designed technical solutions and robust code, and implementing data governance processes
  • Working and professionally communicating with the customer’s team
  • Taking responsibility for delivering major solution features
  • Participating in the requirements gathering and clarification process, proposing optimal architecture strategies, and leading the data architecture implementation
  • Developing core modules and functions, designing scalable and cost-effective solutions
  • Performing code reviews, writing unit and integration tests
  • Scaling the distributed system and infrastructure to the next level
  • Building a data platform using the capabilities of the AWS cloud (a minimal ETL sketch follows this list)
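
As a rough illustration of the ETL work described above, here is a minimal PySpark sketch that reads raw JSON events from S3, drops malformed rows, and writes date-partitioned Parquet back to S3. The bucket paths, column names (user_id, event_ts), and application name are hypothetical placeholders, not details from the actual project.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, to_date

    # Hypothetical S3 locations; real paths depend on the client's setup.
    RAW_PATH = "s3a://example-raw-bucket/events/"
    CURATED_PATH = "s3a://example-curated-bucket/events/"

    spark = SparkSession.builder.appName("events-etl").getOrCreate()

    # Extract: read the raw JSON events.
    events = spark.read.json(RAW_PATH)

    # Transform: drop rows missing required fields and derive a partition date.
    cleaned = (
        events
        .filter(col("user_id").isNotNull() & col("event_ts").isNotNull())
        .withColumn("event_date", to_date(col("event_ts")))
    )

    # Load: write date-partitioned Parquet for downstream consumers.
    cleaned.write.mode("overwrite").partitionBy("event_date").parquet(CURATED_PATH)

    spark.stop()

Partitioning by date keeps downstream reads cheap, since consumers can prune to the days they need rather than scanning the full dataset.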

Qualifications

  • 3+ years of solid experience with Python for building data pipelines and related tooling
  • Familiarity with distributed data processing in Spark, including data pipeline optimization and workload monitoring
  • Proven track record of building data transformations using data build tools
  • Excellent command of data modeling and data warehousing best practices
  • Developer-level (not just user-level) experience with Looker, including LookML
  • Strong data domain background: an understanding of how data engineers, data scientists, analytics engineers, and analysts work, so you can collaborate closely with them and understand their needs
  • Good written and spoken English communication skills
  • Familiarity with software engineering best practices: testing, PRs, Git, code reviews, code design, and releasing (a small testing example follows this list)
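
As a small example of the testing practice listed above, here is a pure transformation function and a pytest test for it in a single module. The function and its test data are hypothetical, invented only to show the pattern of keeping transformations pure so they are trivially unit-testable.

    import pytest

    def normalize_country_code(raw: str) -> str:
        # Pure function: trim whitespace and upper-case an ISO country code.
        return raw.strip().upper()

    @pytest.mark.parametrize(
        "raw, expected",
        [
            (" gb ", "GB"),
            ("us", "US"),
            ("De", "DE"),
        ],
    )
    def test_normalize_country_code(raw, expected):
        assert normalize_country_code(raw) == expected

Run it with pytest; keeping logic out of pipeline glue and in pure functions like this makes code reviews and CI checks straightforward.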

WOULD BE A PLUS

  • Certifications in Data Engineering or Data Analytics
  • 2 or more years of experience with Databricks and Airflow
  • Experience with DAGs and orchestration tools (a minimal Airflow sketch follows this list)
  • Experience in developing Snowflake-driven data warehouses
  • Experience in developing event-driven data pipelines
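
Since DAGs and orchestration come up above, here is a minimal sketch of a recent Airflow 2.x DAG that chains an extract task into a load task on a daily schedule. The dag_id, task ids, and callables are hypothetical placeholders rather than anything from the client's codebase.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Hypothetical task bodies; real tasks would invoke the pipeline code.
    def extract():
        print("pulling raw events")

    def load():
        print("loading curated events")

    with DAG(
        dag_id="example_daily_events",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Declare ordering: extract must finish before load starts.
        extract_task >> load_task

The >> operator declares the dependency edge, so Airflow schedules load only after extract succeeds.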

