Reply.io
August 6, 2024
Remote
We are a product company working in the global market. We are a team of 80+ amazing Ukrainians who have worked 100% remotely from day one, and 4,000+ customers all over the world use our platform every day. Now we are looking for a great DevOps Data Engineer to join our team :)
As a DevOps Data Engineer, you should be passionate about automating tasks with Python and maintaining seamless data operations. You’ll write scripts, develop pipeline jobs in Databricks, and troubleshoot infrastructure issues. Your role involves managing AI and data infrastructure, optimizing analytics workflows, and ensuring data integrity and security. You’ll collaborate with other engineers, monitor system performance, and continuously improve automation processes. If you love staying up to date with the latest in DevOps and data engineering, this role is perfect for you.
Requirements
• 1+ years of experience in DevOps or related roles
• Middle-level proficiency in Python Core, or willingness to learn it on short notice
• Middle-level proficiency in:
— GitHub and Git
— data engineering tools and libraries, including Pandas and PySpark
— databases, including querying and managing data
— Apache Airflow for workflow management
— Databricks for managing data pipelines and jobs
— Elasticsearch and Kibana
• Strong troubleshooting skills, with the ability to diagnose and fix infrastructure issues
• Good understanding of maintaining existing infrastructure and internal analytics
• Ability to work independently and collaboratively within a team
• Effective communication skills, both written and verbal
• Proactive and detail-oriented mindset
• Passion for continuous learning and keeping up-to-date with the latest technologies
Responsibilities
• Writing scripts in Python to automate various tasks and processes
• Developing and maintaining pipeline jobs in Databricks
• Troubleshooting and fixing issues related to infrastructure and data processing
• Maintaining data and sales AI infrastructure and processes
• Maintaining existing infrastructure to ensure smooth operations
• Managing internal analytics workflows orchestrated by Apache Airflow
• Supporting and maintaining jobs in Databricks to ensure efficient data processing
• Collaborating with other team members to optimize data workflows
• Monitoring system performance and implementing necessary improvements
• Ensuring data integrity and security across all systems and platforms
• Documenting processes and maintaining comprehensive records of changes and updates
• Participating in code reviews and providing constructive feedback
• Assisting with the deployment of new features and updates
• Continuously improving automation scripts and processes for better efficiency and reliability
• Staying updated with the latest developments in DevOps and data engineering fields
We offer
• Working with a leading product in the US and global markets
• Real opportunity for professional growth and self-improvement
• Work from home or co-working space (we cover the co-working costs)
• We provide the latest equipment for your comfortable work
• Access to internal business literature, trainings, and seminars
• Unlimited sick leave
• Awesome colleagues and a pleasant working atmosphere