Senior GCP Data Engineer at Link Group

Posted more than 30 days ago

Link Group · 5 (1 review)
Without experience
Full-time work

Translated by Google

The offer includes stable employment, the option of working remotely or in the office, and a wide package of educational and development benefits. The recruitment process consists of HR, technical, and client interviews.

  • Minimum 6 years of experience in data and analytics.
  • Minimum 2 years of experience with GCP.
  • Knowledge of GCP cloud services (BigQuery, Cloud Functions, Cloud Run, Workflows, Dataflow).
  • Experience with Git.
  • Knowledge of SQL and at least one of the programming languages: Java, Python.
  • Knowledge of DWH, Data Lake, Lakehouse design principles.
  • Strong analytical and problem-solving skills.
  • Very good communication in English, both verbal and written.
  • Preferred:

    • Experience with Docker, Kubernetes, Terraform.
    • Experience with data frameworks like dbt, ETL tools like Talend, and data transfer platforms like Fivetran.
    • Knowledge of CI/CD processes.
    • Experience in Agile methodology.
    • Ability to work both independently and in a team.

Responsibilities:

  • Creating scalable frameworks to implement data retrieval processes, data warehouses, data lakes, and data processing solutions for new and existing clients.
  • Building and maintaining effective, scalable, and reliable data architectures for analysis and reporting.
  • Implementing data quality checking and validation processes to ensure accuracy and consistency.
  • Maintaining and updating documentation and data dictionaries.
  • Collaborating with cross-functional teams to translate business requirements into technical solutions.

Requirements: GCP, BigQuery, Cloud Functions, Cloud Run, Workflows, Dataflow, Git, SQL, Java, Python, DWH, Data Lake, Lakehouse, Docker, Kubernetes, Terraform, dbt, ETL, Fivetran, CI/CD

Tools: Agile

Bonuses and perks: Sports subscription, private healthcare, small teams.
