Big Data Engineer - 22959

Specialization: Python
Status: Open to offers now
Expected salary: $5,400
Location: Krakow, Poland
Experience: 2 years
English level: Upper Intermediate
Considering options: full-time office work, part-time work, remote work (full-time), freelance (one-time projects), relocation
As a Data Engineer proficient in Python, ETL, PySpark, Airflow, PostgreSQL, Redshift, AWS services, and Agile methodologies, I am dedicated to leveraging data-driven insights to empower organizations and drive their success. With a strong foundation in technical science and a PhD, I bring advanced technical expertise to my work.
Education
Lviv Polytechnic National University
Master's degree, Institute of Chemistry and Chemical Technologies
January 2010 - December 2013

PhD in technical sciences.


Experience
Python AQA
September 2015 - August 2017
  • Installed, configured, and debugged test equipment and test cases at customer sites, ensuring optimal network functionality and performance.
  • Performed testing of network automation frameworks, providing comprehensive support to customers throughout the testing process.
  • Applied testing methodologies to various networking protocols at the L2/L3 layers, ensuring accurate and efficient network operations (a brief sketch of this style of test follows the tools list below).
  • Analyzed complex network systems and protocols to identify root causes of defects, enabling targeted issue resolution and improved network performance.
  • Created and maintained comprehensive documentation for an open-source test framework dedicated to testing network equipment, hosted on GitHub.

Tools and technologies: Python, Bash, pytest, Docker, Linux, GitHub, Ixia, Wireshark, Sphinx, Read the Docs.
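
For illustration, here is a minimal, hypothetical pytest sketch of this style of test. DeviceStub, its VLAN helpers, and the fixture are invented stand-ins for real lab equipment and are not taken from any particular framework.

# Hypothetical sketch of an L2/L3 configuration test in pytest.
# DeviceStub is an illustrative stand-in for a real device API (e.g. driven
# over SSH or through Ixia); nothing here comes from a specific framework.
import pytest


class DeviceStub:
    """Minimal stand-in for a network device under test."""

    def __init__(self, hostname: str):
        self.hostname = hostname
        self.vlans: set[int] = set()

    def create_vlan(self, vlan_id: int) -> None:
        if not 1 <= vlan_id <= 4094:
            raise ValueError(f"invalid VLAN id: {vlan_id}")
        self.vlans.add(vlan_id)

    def has_vlan(self, vlan_id: int) -> bool:
        return vlan_id in self.vlans


@pytest.fixture
def device() -> DeviceStub:
    # In a real run this fixture would connect to lab equipment and clean up afterwards.
    return DeviceStub("lab-switch-01")


@pytest.mark.parametrize("vlan_id", [10, 20, 4094])
def test_vlan_creation(device: DeviceStub, vlan_id: int) -> None:
    device.create_vlan(vlan_id)
    assert device.has_vlan(vlan_id)


def test_invalid_vlan_rejected(device: DeviceStub) -> None:
    with pytest.raises(ValueError):
        device.create_vlan(5000)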


Python AQA, Tech Lead of Performance Testing
August 2017 - September 2020
  • Defined requirements and identified appropriate performance testing strategies and approaches for web applications.
  • Collaborated with developers to diagnose and fine-tune the performance of high-loaded web applications.
  • Utilized collaborative project management tools, load generators, and monitoring tools to track and analyze application performance.
  • Implemented and maintained back-end and UI performance tests and mock services to simulate highly loaded services (a Locust sketch follows the tools list below).
  • Supported and mentored teammates in the role of a tech lead, providing guidance and expertise.
  • Designed and maintained project documentation to ensure accurate and up-to-date information.

Tools and technologies: Python, Locust, Behave, Docker, AWS, New Relic, Kibana, Git, Grafana, InfluxDB, MongoDB, GoCD, Bitbucket.
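
As an illustration of the back-end performance testing mentioned above, below is a minimal Locust sketch; the host and endpoint paths are placeholders rather than details of a real project.

# Minimal Locust sketch of a back-end load test; host and endpoints are placeholders.
from locust import HttpUser, task, between


class ApiUser(HttpUser):
    # Each simulated user waits 1-3 seconds between requests.
    wait_time = between(1, 3)
    host = "https://staging.example.com"  # placeholder target

    @task(3)  # weighted: listing is three times more frequent than searching
    def list_items(self):
        self.client.get("/api/items")

    @task(1)
    def search_items(self):
        # "name" groups all query variants under one entry in the statistics.
        self.client.get("/api/items", params={"q": "test"}, name="/api/items?q=[query]")

A file like this is typically run headless for a fixed duration, for example: locust -f locustfile.py --headless -u 50 -r 5 --run-time 5m, with the resulting metrics fed into dashboards such as Grafana.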


Data Engineer
Newfire Global Partners
September 2020
  • Collected and cleansed large datasets, applying data wrangling techniques to ensure data quality and consistency.
  • Implemented distributed data processing using Spark, allowing for faster and more efficient analysis of big data.
  • Developed ETL pipelines using Airflow, ensuring a smooth flow of data from various sources to the target database (a minimal sketch follows the tools list below).
  • Collaborated with project managers and analysts to understand their data requirements and deliver optimized data pipelines that supported their KPI tracking needs.
  • Actively participated in project documentation design and maintenance, ensuring that all processes and procedures were well-documented and easily accessible to team members.

Tools and technologies: Python, ETL, PySpark, PostgreSQL, Redshift, Alembic, Airflow, AWS services, Jenkins, Bitbucket.
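
To illustrate the kind of pipeline described above, here is a hypothetical sketch of an Airflow DAG with a PySpark transform step; the DAG id, S3 paths, and column names are placeholders, not details of an actual project.

# Hypothetical Airflow DAG sketch: one PySpark transform step of an ETL pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def transform_events() -> None:
    # Placeholder PySpark step: deduplicate raw events and write them out as Parquet.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("events_etl").getOrCreate()
    raw = spark.read.json("s3://example-bucket/raw/events/")  # assumed source location
    cleaned = raw.dropDuplicates(["event_id"]).filter("event_id IS NOT NULL")
    cleaned.write.mode("overwrite").parquet("s3://example-bucket/clean/events/")
    spark.stop()


with DAG(
    dag_id="events_etl",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    transform = PythonOperator(
        task_id="transform_events",
        python_callable=transform_events,
    )

In a real pipeline a downstream task would load the cleaned Parquet files into Redshift (for example via a COPY command) and Alembic would manage the target schema, but those steps are omitted here for brevity.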

