Introduction
As a DevOps Engineer, you will play a critical role in building, optimizing, and maintaining data infrastructure. You will work closely with data engineers, data scientists, and other cross-functional teams to support the development, deployment, and monitoring of data pipelines and applications.
Your Role and Responsibilities
- Implement and optimize CI/CD pipelines to automate the deployment of data pipelines, applications, and other platform components.
- Manage and monitor cloud-based environments, ensuring high availability and performance for data processing and analytics tasks.
- Troubleshoot and resolve issues related to data pipelines, services, and infrastructure components.
- Employ best practices for security, scalability, and data governance in all aspects of the platform.
- Work with modern data technologies, including Snowflake, Databricks, and dbt, to streamline data processing workflows.
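To give a concrete flavor of the CI/CD and monitoring work described above, here is a minimal Python sketch of the kind of health gate a deployment pipeline might run after rolling out a data pipeline. The status fields, field names, and lag threshold are illustrative assumptions, not part of this role's actual stack:

```python
def check_pipeline_health(status: dict, max_lag_seconds: int = 300) -> bool:
    """Decide whether a (hypothetical) pipeline status payload looks healthy.

    `status` is assumed to carry a `state` string and a `lag_seconds` number,
    as a monitoring endpoint might report them.
    """
    return (
        status.get("state") == "running"
        and status.get("lag_seconds", 0) <= max_lag_seconds
    )


# Example: a CI/CD job could call this after deployment and fail the build
# when the pipeline is not running or has fallen too far behind.
healthy = check_pipeline_health({"state": "running", "lag_seconds": 42})
```

In practice a script like this would fetch the status payload from the platform's monitoring API and exit non-zero on failure, which is what lets the CI/CD tool block a bad release.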
Required Technical and Professional Expertise
- Proven experience as a Platform/DevOps Engineer, preferably in a data-centric environment.
- Experience with technologies such as Kafka, Snowflake, Databricks, and dbt is a strong advantage.
- Proficiency with cloud platforms, particularly Azure (experience with AWS or Google Cloud is a plus).
- Strong knowledge of infrastructure as code (Terraform preferred) and configuration management tools.
- Solid understanding of CI/CD concepts and experience with GitHub Actions, CircleCI, or similar tools.
- Strong scripting skills (Python, Bash, or similar) for automating tasks and creating tools.
- Knowledge of monitoring and logging tools (e.g., Datadog) for maintaining platform health.
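As a small illustration of the scripting skills listed above, the snippet below summarizes log levels from plain-text log lines, the kind of quick tooling this role might build for troubleshooting. The assumed line format (`timestamp LEVEL message`) is an example, not a format used by any specific tool in this posting:

```python
from collections import Counter

def summarize_log_levels(lines):
    """Count occurrences of each log level in lines shaped like
    '2024-01-01T00:00:00 LEVEL message' (an assumed, illustrative format)."""
    counts = Counter()
    for line in lines:
        parts = line.split(maxsplit=2)
        if len(parts) >= 2:
            counts[parts[1]] += 1
    return dict(counts)


# Example usage: spot an error spike at a glance.
sample = [
    "2024-01-01T00:00:00 INFO pipeline started",
    "2024-01-01T00:00:05 ERROR connection refused",
    "2024-01-01T00:00:06 ERROR retry failed",
]
summary = summarize_log_levels(sample)
```

A one-off script like this often precedes a proper dashboard: once the pattern is understood, the same check moves into a monitoring tool such as Datadog.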