
16 March 2026

Kafka Developer

Sector See below
Employment type See below
Hours See below
Location Eindhoven
Education level See below
Organisation Kryptos Technologies limited
Contact person See below

Information

Tasks

Kafka Developer

Location: Eindhoven, Netherlands (onsite)

Duration: 12 months+

Key Responsibilities

Pipeline Development: Take ownership of the CDC ingestion framework utilizing Kafka connectors (Debezium, Iceberg sink, S3 sink).
Containerized Infrastructure Management: Deploy and manage Debezium and Kafka Connect workers as Docker containers orchestrated on AWS ECS (Elastic Container Service), with images stored in ECR.
Data Lake Integration: Manage data ingestion into AWS S3, utilizing Parquet and Apache Iceberg formats.
Infrastructure as Code: Use Terraform to provision and manage AWS resources supporting the data platform.
CI/CD: Build and maintain deployment pipelines using GitHub and GitHub Actions.
Operational Excellence: Monitor pipeline health, troubleshoot connectivity issues, and ensure the reliability of the Kafka ecosystem.
Optional: Support and optimize workflow orchestration using Airflow where applicable.
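To illustrate the CDC pipeline work described above: Kafka Connect connectors such as Debezium are configured as JSON and registered through the Kafka Connect REST API. The sketch below builds a minimal Debezium Postgres source connector config and shows how it would be submitted; the endpoint URL, connector name, and database details are hypothetical placeholders, not part of this vacancy.

```python
import json
import urllib.request

# Hypothetical Kafka Connect REST endpoint (replace with your cluster's URL).
CONNECT_URL = "http://localhost:8083"


def debezium_pg_config(name: str, host: str, db: str, slot: str) -> dict:
    """Build a minimal Debezium Postgres source connector config.

    Only the essential keys are shown; a production config would also set
    credentials, SSL, and snapshot/topic routing options.
    """
    return {
        "name": name,
        "config": {
            "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
            "database.hostname": host,
            "database.port": "5432",
            "database.dbname": db,
            "slot.name": slot,
            "topic.prefix": name,
        },
    }


def register(cfg: dict) -> int:
    """POST the connector config to the Kafka Connect REST API.

    Requires a running Kafka Connect cluster; returns the HTTP status code.
    """
    req = urllib.request.Request(
        f"{CONNECT_URL}/connectors",
        data=json.dumps(cfg).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


# Build the config locally; call register(cfg) only against a live cluster.
cfg = debezium_pg_config("orders-cdc", "db.internal", "orders", "orders_slot")
print(json.dumps(cfg, indent=2))
```

The same REST API pattern applies to the Iceberg and S3 sink connectors named in this role; only the `connector.class` and sink-specific keys differ.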

Requirements

Skills Required:

Apache Kafka & Kafka Connect: Multiple years of hands-on experience configuring, deploying, and managing Kafka Connect clusters in a production environment.
Containerization: Extensive experience with Docker is required. You must be comfortable building images and managing container lifecycles.
AWS Compute: Proven experience running containers on AWS ECS and managing images via AWS ECR.

Tech Stack Summary

Streaming: Apache Kafka, Kafka Connect, Debezium
Compute/Containerization: AWS ECS, AWS ECR, Docker
Storage/Format: AWS S3, Apache Iceberg, Parquet
DevOps: Terraform, GitHub Actions
Languages: Python, Bash
Optional Orchestration: Apache Airflow
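The "Operational Excellence" responsibility (monitoring pipeline health) typically means polling the Kafka Connect status endpoint (`GET /connectors/<name>/status`) and flagging failed tasks. A minimal sketch, assuming the standard Kafka Connect status payload shape; the sample data below is illustrative, not real:

```python
def unhealthy_tasks(status: dict) -> list:
    """Return (task_id, trace) pairs for any connector task not in the
    RUNNING state, given the JSON body of GET /connectors/<name>/status."""
    return [
        (task["id"], task.get("trace", ""))
        for task in status.get("tasks", [])
        if task.get("state") != "RUNNING"
    ]


# Illustrative status payload as returned by the Kafka Connect REST API.
sample = {
    "name": "orders-cdc",
    "connector": {"state": "RUNNING"},
    "tasks": [
        {"id": 0, "state": "RUNNING"},
        {"id": 1, "state": "FAILED", "trace": "stack trace here"},
    ],
}

for task_id, trace in unhealthy_tasks(sample):
    print(f"task {task_id} failed: {trace}")
```

In practice such a check would run on a schedule (e.g. as an Airflow task, per the optional orchestration item above) and alert or restart failed tasks via `POST /connectors/<name>/tasks/<id>/restart`.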

