Domain: Data Engineering
Skill profile: Kafka Streams, Flink, Debezium CDC: real-time data processing
Roles: 2 (where this skill appears)
Levels: 5 (structured development path)
Mandatory requirements: 8 (the other 2 are optional)
Data Engineering / Stream Processing (17.3.2026)
Choose your current level and compare the expectations.
The table shows how the required depth grows from Junior to Principal.
Level 1

| Role | Requirement | Description |
|---|---|---|
| Backend Developer (Scala) | Optional | Understands Kafka Streams topology basics: sources, processors, and sinks. Reads existing stream-processing code and follows established patterns for stateless transformations. |
| Data Engineer | Mandatory | Understands Kafka Streams fundamentals, including the KStream/KTable duality. Writes simple stream consumers and producers for data-pipeline ingestion stages. |
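The KStream/KTable duality named above can be sketched without any Kafka dependency: a table is simply the latest value per key of a changelog stream. The class and method names below are hypothetical illustrations, not Kafka Streams API.

```java
import java.util.*;

// Plain-Java sketch of the KStream/KTable duality (no Kafka dependency):
// folding a changelog stream into a map mirrors how a KTable materializes
// a KStream, with later records for a key overwriting earlier ones.
class TableDuality {
    static Map<String, Integer> toTable(List<Map.Entry<String, Integer>> changelog) {
        Map<String, Integer> table = new LinkedHashMap<>();
        for (var rec : changelog) {
            table.put(rec.getKey(), rec.getValue()); // upsert semantics per key
        }
        return table;
    }

    public static void main(String[] args) {
        var changelog = List.of(Map.entry("a", 1), Map.entry("b", 2), Map.entry("a", 3));
        // prints {a=3, b=2}: the table keeps only the latest value per key
        System.out.println(toTable(changelog));
    }
}
```

Reading a real changelog topic back into a state store follows the same upsert logic, which is why a KTable can always be rebuilt from its backing topic.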
Level 2

| Role | Requirement | Description |
|---|---|---|
| Backend Developer (Scala) | Optional | Implements stateful stream processing with Kafka Streams using windowed aggregations and joins in Scala. Tunes RocksDB state stores for throughput and manages changelog topics. |
| Data Engineer | Mandatory | Builds real-time ETL pipelines with Kafka Streams for data transformation and enrichment. Implements exactly-once semantics and monitors consumer lag across processing stages. |
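The core of the windowed aggregations required at this level is window assignment: each event's timestamp is truncated to a window start, and the aggregate is keyed by (key, window). This is a minimal plain-Java sketch of tumbling-window counting; the `Event` record and `counts` method are hypothetical names, not Kafka Streams API.

```java
import java.util.*;

// Tumbling-window count sketch: windowStart = floor(ts / windowMs) * windowMs,
// so every event falls into exactly one non-overlapping window.
class TumblingWindow {
    record Event(String key, long timestampMs) {}

    // Counts events per (key, windowStart); the composite "key@windowStart"
    // string stands in for Kafka Streams' Windowed<K> key.
    static Map<String, Integer> counts(List<Event> events, long windowMs) {
        Map<String, Integer> out = new TreeMap<>();
        for (Event e : events) {
            long windowStart = (e.timestampMs() / windowMs) * windowMs;
            out.merge(e.key() + "@" + windowStart, 1, Integer::sum);
        }
        return out;
    }

    public static void main(String[] args) {
        var events = List.of(new Event("a", 100), new Event("a", 900), new Event("a", 1500));
        // prints {a@0=2, a@1000=1}: two events in window [0,1000), one in [1000,2000)
        System.out.println(counts(events, 1000));
    }
}
```

A real Kafka Streams topology keeps these per-window aggregates in a RocksDB state store backed by a changelog topic, which is what the RocksDB tuning in the table refers to.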
Level 3

| Role | Requirement | Description |
|---|---|---|
| Backend Developer (Scala) | Mandatory | Designs distributed stream-processing architectures with Kafka Streams for Scala microservices. Implements custom SerDes, interactive queries for state-store access, and partition-level parallelism strategies. |
| Data Engineer | Mandatory | Designs end-to-end streaming data architectures with Kafka Streams for high-throughput pipelines. Orchestrates complex event processing, implements schema-evolution strategies, and optimizes for backpressure handling. |
Level 4

| Role | Requirement | Description |
|---|---|---|
| Backend Developer (Scala) | Mandatory | Defines stream-processing standards for the Scala team: choosing between Kafka Streams, FS2-Kafka, and Akka Streams for specific use cases. Reviews stream-processor topologies, implements exactly-once processing patterns, and configures monitoring for consumer lag and stream-processing latency. |
| Data Engineer | Mandatory | Defines streaming standards: Kafka Streams vs. Flink, windowing policies, state management. Implements consumer-lag and processing-latency monitoring. Chooses between exactly-once and at-least-once delivery guarantees. |
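The consumer-lag monitoring both roles must configure reduces to simple offset arithmetic: lag per partition is the log-end offset minus the committed offset, and the sum across partitions is the number alerts fire on. This is a plain-Java sketch with hypothetical names; in practice the offsets come from Kafka's admin/consumer APIs.

```java
import java.util.*;

// Consumer-lag sketch: lag(partition) = logEndOffset - committedOffset,
// clamped at zero; total lag is what a lag monitor alerts on.
class ConsumerLag {
    record PartitionOffsets(int partition, long committed, long logEnd) {}

    static long totalLag(List<PartitionOffsets> parts) {
        long lag = 0;
        for (var p : parts) {
            lag += Math.max(0, p.logEnd() - p.committed());
        }
        return lag;
    }

    public static void main(String[] args) {
        var parts = List.of(new PartitionOffsets(0, 100, 150),  // 50 behind
                            new PartitionOffsets(1, 200, 200)); // caught up
        System.out.println(totalLag(parts)); // prints 50
    }
}
```

A steadily growing total is the usual signal that processing throughput has fallen behind the ingest rate, which feeds directly into the exactly-once vs. at-least-once trade-off: exactly-once transactions add per-commit overhead that shows up as lag under load.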
Level 5

| Role | Requirement | Description |
|---|---|---|
| Backend Developer (Scala) | Mandatory | Shapes the stream-processing strategy for the Scala platform: real-time data-pipeline architecture with Kafka Streams/Flink and state-management standards. Makes decisions on stream-processing cluster scaling, defines SLAs for end-to-end latency, and drives integration with a Data Mesh approach. |
| Data Engineer | Mandatory | Designs the data platform's streaming architecture: Kafka Streams for lightweight processing, Flink for complex CEP, and hybrid batch+streaming. Chooses between lambda and kappa architecture per scenario. |
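The kappa architecture mentioned in the last row keeps a single streaming code path and treats reprocessing as replaying the log through the same deterministic fold; lambda instead maintains separate batch and speed layers. A minimal sketch of the kappa property, with hypothetical names and no Kafka dependency:

```java
import java.util.*;

// Kappa-architecture sketch: one deterministic fold over the log. Replaying
// the same log always rebuilds the same state, so "reprocessing" is just
// re-running the stream job from offset zero instead of a separate batch layer.
class KappaReplay {
    static Map<String, Integer> process(List<Map.Entry<String, Integer>> log) {
        Map<String, Integer> state = new HashMap<>();
        for (var rec : log) {
            state.merge(rec.getKey(), rec.getValue(), Integer::sum); // running sum per key
        }
        return state;
    }

    public static void main(String[] args) {
        var log = List.of(Map.entry("a", 1), Map.entry("b", 2), Map.entry("a", 3));
        System.out.println(process(log).equals(process(log))); // prints true
        System.out.println(process(log));                      // prints {a=4, b=2}
    }
}
```

The scenario-based choice in the table hinges on this determinism: kappa fits when the streaming logic alone can reproduce historical results from the retained log, while lambda remains attractive when batch recomputation uses genuinely different (e.g. more exact) logic.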