
DevOps Engineer (Hybrid Data Platform)
Remote work, Kyiv
Responsibilities
- Support and development of the data platform (Kubernetes, Apache Spark, Kafka, Airflow, Minio/AWS S3).
- Supporting a team of data engineers and analysts.
Skills
- Understanding of the advantages of GitOps/IaC over manual operations.
- Kubernetes, Helm, Kustomize, ArgoCD.
- Docker (with BuildKit) or alternative container build tools.
- TeamCity, Azure DevOps, or a similar CI/CD tool.
- Minio
Will be a plus
- Kerberos, Active Directory.
- Apache Spark in Kubernetes, Apache Kafka.
- Experience with any OLAP DB.
- Apache Airflow.
- Security in K8s, HashiCorp Vault, OAuth, OpenID, Keycloak.
- Experience with at least one of the most popular programming languages, such as Java, Kotlin, Python, Golang, Scala, etc.
- Argo Workflows, Argo Events, Argo Rollouts.
- Terraform, Ansible, K8s operators.
Technologies that we use
- Kubernetes, Kustomize, Helm, ArgoCD, Cilium, Longhorn, Prometheus, Grafana.
- TeamCity with build configurations in Kotlin DSL (a minimal illustrative sketch follows this list).
- Airflow, Spark, Kyuubi, Hive Metastore, Minio/S3, Redis, PostgreSQL, Elasticsearch.
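
To give a flavor of configuring TeamCity as code (in line with preferring GitOps/IaC over manual work), here is a minimal, purely illustrative settings.kts sketch in Kotlin DSL. The project layout, registry host, image name, and the single BuildKit step are assumptions for illustration, not our actual configuration.

import jetbrains.buildServer.configs.kotlin.*
import jetbrains.buildServer.configs.kotlin.buildSteps.script

// Illustrative sketch only: one build configuration that builds and
// pushes a container image with BuildKit. All names are hypothetical.
version = "2023.05"

project {
    buildType(BuildImage)
}

object BuildImage : BuildType({
    name = "Build and Push Image"

    vcs {
        // Use the VCS root that holds these settings.
        root(DslContext.settingsRoot)
    }

    steps {
        script {
            name = "docker buildx build"
            scriptContent = """
                docker buildx build \
                  --tag registry.example.com/data-platform/app:%build.number% \
                  --push .
            """.trimIndent()
        }
    }
})

Because the build configuration lives in the repository, changes to the pipeline go through the same review and history as application code, which is the practical advantage over hand-edited build setups.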
We offer
- The opportunity to build a large-scale project from scratch: one of the largest data lake projects in Ukraine. Our teammates actively contribute to open-source big data tools such as Apache Airflow.
- We are not tied to the office; remote work is welcome.
- Health insurance.
- Compensation for sports clubs and foreign language courses.
- Internal training (IT and beyond).