The position has been closed by the company

Middle Data Engineer at Data Science UA

Posted more than 30 days ago

881 views

3 applications, 0 responses

Dana Kondrevych
IT Recruiter
3000 - 4600$
2 years
Intermediate
Remote work


We have 3 Data Engineer positions at a well-known English company.

About the client: Our client is a new-generation data services provider specializing in data consulting and data-driven digital marketing, dedicated to turning data into business impact across an organization's entire value chain. Its broad range of data consulting and digital marketing solutions is designed to meet each client's specific needs, conceived with a business-centric approach and delivered with tangible results. These data-driven services are built on the deep AI expertise gained from a base of more than 1,000 clients around the globe. The company has 1,000 employees across 20 offices, all focused on accelerating digital transformation.


As a Data Engineer, you will work with one of their biggest clients. We need Data Engineers to join and strengthen the current team and deliver a high-quality service to clients. The client already has a defined scope of tasks for the first couple of months.


The first scope: creating ETL pipelines and connecting outsourced platforms. You will work with Cloud Run and App Engine, as well as other GCP services, build automated data processing and data checks on the GCP stack, scale existing data science models, industrialize the code of newly created models, and apply MLOps best practices.
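To give a flavour of this scope, the sketch below shows the kind of automated data check such a pipeline might run on GCP. It is an illustrative example only, not the client's code: the table name, the ingestion_time column, and the threshold are invented.

# Illustrative sketch only: an automated data check of the kind a GCP ETL
# pipeline (for example, a job triggered from Cloud Run) might run after a load.
# The table name, `ingestion_time` column, and threshold are invented here.
from google.cloud import bigquery


def check_daily_row_count(table: str, min_rows: int = 1000) -> int:
    """Raise if yesterday's load into `table` looks suspiciously small."""
    client = bigquery.Client()
    sql = f"""
        SELECT COUNT(*) AS n
        FROM `{table}`
        WHERE DATE(ingestion_time) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
    """
    n = list(client.query(sql).result())[0]["n"]
    if n < min_rows:
        raise ValueError(f"Data check failed for {table}: only {n} rows loaded yesterday")
    return n


if __name__ == "__main__":
    check_daily_row_count("my_project.analytics.events")  # hypothetical table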


The second scope: changing data pipelines on GCP. Bugs arising from this transition are preventing business teams from accessing their BI reports. The company is looking for someone who can fix these bugs on an ad-hoc basis and ensure business continuity during the transition months.


The third scope: building ETL pipelines. Data is currently extracted and processed manually to build BI dashboards; the goal is to build automated pipelines in GCP and standardize the manual extracts as much as possible. The Data Engineer will work in tandem with a Data Analyst, who will build dashboards using the data extracted and processed by the Data Engineer.
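As an illustration of this scope (again a sketch, not the client's code; the file path, header handling, and table id are invented), standardizing a manual extract and loading it into BigQuery for the analyst could look roughly like this:

# Illustrative sketch only: standardize a manual CSV extract and load it into
# BigQuery so the Data Analyst can build dashboards on top of it.
# The file path, column handling, and table id are invented for this example.
import pandas as pd
from google.cloud import bigquery


def load_manual_extract(csv_path: str, table_id: str) -> None:
    df = pd.read_csv(csv_path)
    # Normalize the inconsistent headers that manual extracts tend to have.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df["loaded_at"] = pd.Timestamp.now(tz="UTC")

    client = bigquery.Client()
    job = client.load_table_from_dataframe(df, table_id)  # requires pyarrow
    job.result()  # wait for the load job to finish


if __name__ == "__main__":
    load_manual_extract("monthly_extract.csv", "my_project.marketing.manual_extracts")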


Qualifications:

— 2+ years of experience as a Data Engineer.

— 2+ years of experience with Python.

— GCP experience is a priority; experience with any cloud platform is a must.

— Experience building ETL pipelines and working with manual extracts.

— A bachelor's degree in Computer Science or a related field.


Once these projects are complete, we can move on to the next ones. The company has numerous clients, so the work will be varied and fascinating. If you're interested in joining project work at one of the biggest data consulting companies in Europe, we are waiting for you.


We are looking for:

● A Doer: you get things done and inspire your teams to do the same.

● An Analyst: you LOVE data and believe every company should make its decisions based on facts.

● A Pragmatist: you have a hacker mindset and always find the quick wins.

● A Mentor: your clients and teams naturally seek your advice.

● An Adventurer: you're an entrepreneur constantly looking for problems to solve.


We offer:

— Warm and friendly working environment;

— Cool AI-based projects;

— Opportunities to participate in professional events and conferences;

— Paid vacation and sick leave;

— Benefits package;

— Flexible schedule and the ability to work remotely.



Specializations: Data Science, Python, AWS
Keywords: engineer, middle, Office/Remote of your choice, data
