
Lead Data Engineer (Romania / Ukraine) at AmorServ

18 November


AmorServ

Full-time work
Role: Lead Data Engineer
Location: Romania, Ukraine (Remote)
Years of Experience: 5-7 years
Pay: $84,000 - $90,000 PA
Required Skills: Python, SQL, Databricks, Snowflake, ETL, AWS, GCP, Airbyte, Postgres, Kafka, Sisense, CircleCI, Grafana, Kubernetes (EKS)
Language Required: English C1 level

Job Description

This role emphasizes creating robust, scalable, and secure data lake, ingestion, and distribution systems. As a key leader, you will work across the data stack and collaborate with cross-functional teams, setting high standards in data governance, quality, and security.

Key Responsibilities

- Data Lake and Pipelines: Design and implement features for ingestion, enrichment, and transformation, ensuring availability, reliability, and scalability.
- System Design and Project Alignment: Drive system design, creating alignment and feasibility for projects across teams.
- Full Data Stack: Work across ETL processes, data warehousing, visualization, and cloud infrastructure.
- Data Management and Governance: Define and implement best practices to ensure data quality, security, and consistency.
- Optimization: Develop and optimize Spark jobs, notebooks, and pipelines in Databricks.
- Collaboration and Mentorship: Partner with the Chief Architect, mentor engineers, and support a DataOps culture.
- Stakeholder Communication: Communicate data strategies and plans to stakeholders across the organization.

Required Skills and Experience

- Data Engineering: 5+ years in data pipeline design and implementation using Databricks and Python (or PySpark).
- SQL & Visualization: Proficiency in SQL and visualization tools such as Sisense or Power BI.
- Cloud Platforms: Experience with AWS or GCP, focusing on cloud-native data engineering.
- ETL and Data Governance: Expertise in ETL processes and data governance principles to maintain quality and consistency.

Must-Have

- Python, SQL, Databricks, ETL processes
- AWS or GCP
- Visualization tools (Sisense or similar)
- Airbyte or comparable ETL tools
- Terraform for environment templating; CI/CD tools (CircleCI, GitHub Actions)

Nice to Have

- Kedro, Kafka, and data mesh design
- Postgres, Terraform, CircleCI, Grafana
- Knowledge of microservices and data modeling

Additional Skills

- Technical: Designing large-scale data systems, SQL/NoSQL databases, and familiarity with streaming services like Kafka.
- Soft Skills: Strong client-facing abilities, strategic planning, and stakeholder management.

Our Tech Stack

Python, SQL, Databricks, Snowflake, Airbyte, Postgres, Kafka, Sisense, CircleCI, Grafana, Kubernetes (EKS)
(Knowledge of deprecated systems like Druid or Datadog is a plus)

Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Information Technology
Industries: IT Services and IT Consulting