What you will do
- Work closely with the data engineering team and report to the business process engineering team
- Be a key person in developing (ingestion, ETL pipelines, deployment pipelines, dashboards, etc.) and maintaining the data lake (a minimal illustrative sketch of such an ingestion step follows the contract details below). Note: OLAP (the multi-dimensional database model) and Tabular with DAX are needed only to analyze the existing solution and re-create its logic in the data lake. If the dashboards are developed with Power BI, you will also be responsible for that technology, since the Tabular model and DAX are used in Power BI in the same way as in Tabular cubes
- Support data scientists with data preparation and deployment
- Contribute to data lake applications and enable new AI applications
- Provide maintenance and support for the production environments by resolving urgent requests on time
- Deliver a production-ready, stable system
- Collaborate on creating reusable IaC modules used in other data lakes
- Design, develop, and maintain reports based on Power BI/Grafana/QuickSight dashboards
- Design, develop, maintain, and support ETL processes (SSIS) and the DWH (including in the cloud)
- Perform data modeling, including the design and development of logical/physical layers and ETL processes, ensuring they are scalable and extendable (on-premises/in the cloud)
- Perform technical analysis of existing solutions (tech design)
- Prepare and support technical documentation

Your profile
- English: professional working proficiency (to collaborate in writing and verbally with English-speaking stakeholders)
- Extensive experience with cloud technology (incl. AWS: Lambda, Glue, S3, Athena, etc.)
- Experience with infrastructure as code (Terraform, etc.)
- Extensive experience (3+ years) with business intelligence solutions, SQL programming (2016/2019/…), RDBMS, and an understanding of SQL code optimization principles
- Good experience with ETL/orchestration/data pipeline/IoT/data streaming tools
- Working experience with Visual Studio and SQL Server Management Studio
- Technical understanding of data architectures and data warehousing principles
- Working knowledge of Spark, Python, and SQL (T-SQL); experience with SSIS, SSAS, and/or SSRS is a plus
- Working experience with BI tools such as Power BI, DAX, Grafana, QuickSight, etc.
- Experience with Databricks is a bonus
- Experience with Azure DevOps, the Azure BI platform, and Git workflows is an asset
- At least a bachelor's degree in computer science, mathematics, engineering, information technology, or a related technical field
- Practical, hands-on mindset and the ability to work independently
- Good communication with other teams; team player
- Eager to learn and quick to grasp new concepts and technologies

Location and type of contract
- Kyiv, Ukraine
- Full-time
- Hybrid
- Mid-senior level
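For context on the ingestion work named in the responsibilities, here is a minimal sketch, not taken from the posting, of what a data lake ingestion step on the listed AWS stack (Lambda, S3, Glue, Athena) might look like: a Lambda-style handler that lands raw JSON events in S3 under a date partition so downstream Glue/Athena jobs can pick them up. The bucket name, key layout, and handler name are illustrative assumptions.

```python
# Hypothetical illustration only: a minimal Lambda-style ingestion handler that
# writes incoming JSON events to an S3 data lake, partitioned by ingestion date.
# The bucket name, prefix, and event shape are assumptions, not from the posting.
import json
import uuid
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")

RAW_BUCKET = "example-data-lake-raw"  # assumed bucket name


def handler(event, context):
    """Persist the raw event payload under a date-partitioned S3 key."""
    now = datetime.now(timezone.utc)
    key = (
        f"events/ingest_date={now:%Y-%m-%d}/"
        f"{now:%H%M%S}-{uuid.uuid4().hex}.json"
    )
    s3.put_object(
        Bucket=RAW_BUCKET,
        Key=key,
        Body=json.dumps(event).encode("utf-8"),
        ContentType="application/json",
    )
    # Returning the key makes it easy to trace the object in downstream ETL jobs.
    return {"statusCode": 200, "body": json.dumps({"written_to": key})}
```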
Seniority level
Mid-senior level
Employment type
Full-time
Job responsibilities
Research
Industries
Software development and printing services