I have deep knowledge of relational and NoSQL databases, Big Data, streaming, and orchestration tools, and working experience with the main programming languages for data processing, such as Python and Scala. I have comprehensive expertise in building data-driven solutions such as Data Lakes, DWHs, OLTP systems, and ETL pipelines. I am AWS certified and also hold several Microsoft certifications in database development.
Here is a list of common responsibilities I take on across projects: database architecture, optimization of the data layer of applications, Big Data processing, building Data Lakes, ETL pipelines, and more. Aside from my main responsibilities, I have mentored other team members and conducted knowledge-sharing sessions.
I am responsible and always look forward to developing my skills and gaining expertise in new areas.
Project Description: Designing a data platform for one of the biggest tour operators in Europe.
Responsibilities: ▪ Designing the data platform ▪ Architecture vision for new features ▪ Requirements analysis and discussion ▪ Code review ▪ Optimization of existing processes and creation of process documentation ▪ Mentoring and knowledge sharing with other team members ▪ Communication with the customer ▪ Documentation, ticket creation, and estimation Project Team Size: 10-20 team members
Tools & Technologies: Kafka ecosystem (Connect, Schema Registry, Streams), Spark, Flink, Iceberg, Pulumi, Snowflake, AWS services, Kubernetes
Project Description: As a Data Architect, I was involved in various projects in domains such as retail, gambling, and healthcare.
The main approaches were Data Lakes, DWHs, ETL pipelines, and both streaming and batch processing.
Responsibilities: ▪ Architecture vision for new features ▪ Requirements analysis and discussion ▪ Code review ▪ Optimization of existing processes and creation of process documentation ▪ Mentoring and knowledge sharing with other team members ▪ Communication with customers ▪ Documentation, ticket creation, and estimation Project Team Size: 8-10 team members
Tools & Technologies: Scala, Spark, Kafka, AWS services, Python, Aurora PostgreSQL, Redshift
Project Description: A tool that tracks the entire company-employee relationship – from receiving a CV to assigning training, salary reviews, and much more.
Responsibilities: ▪ Database development and optimization ▪ ETL development ▪ Big Data processing using Spark ▪ Data streaming using Kafka Project Team Size: 10-15 team members
Tools & Technologies: AWS services, Aurora PostgreSQL, Spark, Kafka, Airflow, Java, Python
Project Description: An analytics platform built for an advertising company, enabling its clients to analyze their advertising activities.
Responsibilities: ▪ Data lake development ▪ DWH development ▪ ETL development
Tools & Technologies: AWS services, PostgreSQL, Spark, Airflow, Python
Project: AWS Schema Conversion
Project Description: A tool that performs database migrations by automatically converting the source database schema and the majority of the custom code to a format compatible with the target database.
Responsibilities: ▪ Development of database object migration rules ▪ Created migration rules for the DB pairs: Oracle – Redshift, MS SQL – Redshift, Greenplum – Redshift
Project Team Size: 20-30 team members
Tools & Technologies: MS SQL, AWS Redshift, Oracle, Greenplum, Java, Python