Domain: Data Engineering
Skill: Stream Processing
Skill profile: Kafka Streams, Flink, Debezium CDC (real-time data processing)
Roles: 2 (roles where this skill appears)
Levels: 5 (structured growth path)
Mandatory requirements: 8 (the other 2 are optional)
Date: 3/17/2026
Choose your current level and compare expectations: each level's items show what to cover to advance to the next. The tables below show how skill depth grows across five levels, from Junior to Principal.

Level 1 (Junior)

| Role | Required | Description |
|---|---|---|
| Backend Developer (Scala) | Optional | Understands Kafka Streams topology basics: sources, processors, and sinks. Reads existing stream-processing code and follows established patterns for stateless transformations. |
| Data Engineer | Required | Understands Kafka Streams fundamentals including KStream/KTable duality. Writes simple stream consumers and producers for data pipeline ingestion stages. |
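
The KStream/KTable duality mentioned above can be illustrated in plain Scala (this is a conceptual sketch, not the Kafka Streams API): a KTable is the latest-value view of a KStream's changelog, so folding the stream with last-write-wins semantics yields the table.

```scala
// Conceptual sketch of KStream/KTable duality, in plain Scala.
// A changelog stream of (key, value) updates, folded left to right with
// last-write-wins, produces the table view of the same data.
def toTable[K, V](changelog: Seq[(K, V)]): Map[K, V] =
  changelog.foldLeft(Map.empty[K, V]) { case (table, (k, v)) => table.updated(k, v) }
```

For example, `toTable(Seq("user1" -> "login", "user1" -> "logout"))` keeps only the latest value per key, which is exactly what a KTable materializes from its backing stream.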

Level 2

| Role | Required | Description |
|---|---|---|
| Backend Developer (Scala) | Optional | Implements stateful stream processing with Kafka Streams using windowed aggregations and joins in Scala. Tunes RocksDB state stores for throughput and manages changelog topics. |
| Data Engineer | Required | Builds real-time ETL pipelines with Kafka Streams for data transformation and enrichment. Implements exactly-once semantics and monitors consumer lag across processing stages. |
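
The windowed aggregations this level calls for can be sketched in plain Scala (Kafka Streams itself would keep these counts in a RocksDB-backed state store): each event falls into exactly one fixed-size tumbling window, identified by the window's start time.

```scala
// Plain-Scala sketch of a tumbling-window count, the kind of windowed
// aggregation Kafka Streams performs in a state store. Event is a
// hypothetical record type carrying an epoch-millis timestamp.
final case class Event(key: String, timestampMs: Long)

def tumblingCounts(events: Seq[Event], windowMs: Long): Map[(String, Long), Int] =
  events
    .groupBy(e => (e.key, e.timestampMs / windowMs * windowMs)) // key + window start
    .map { case (window, es) => window -> es.size }
```

Two events at t=0 ms and t=500 ms land in the same 1-second window; one at t=1500 ms starts a new window for the same key.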

Level 3

| Role | Required | Description |
|---|---|---|
| Backend Developer (Scala) | Required | Designs distributed stream processing architectures with Kafka Streams for Scala microservices. Implements custom SerDes, interactive queries for state store access, and partition-level parallelism strategies. |
| Data Engineer | Required | Designs end-to-end streaming data architectures with Kafka Streams for high-throughput pipelines. Orchestrates complex event processing, implements schema evolution strategies, and optimizes for backpressure handling. |
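
A custom SerDe, as required at this level, pairs a serializer and a deserializer over `Array[Byte]`. A hedged toy sketch, with a pipe-delimited encoding standing in for whatever wire format (Avro, Protobuf, JSON) a real SerDe would use:

```scala
// Toy serializer/deserializer pair illustrating the SerDe contract.
// Reading and the pipe-delimited format are illustrative assumptions,
// not a real Kafka SerDe implementation.
import java.nio.charset.StandardCharsets.UTF_8

final case class Reading(sensor: String, value: Double)

object ReadingSerde {
  def serialize(r: Reading): Array[Byte] =
    s"${r.sensor}|${r.value}".getBytes(UTF_8)

  def deserialize(bytes: Array[Byte]): Reading = {
    val parts = new String(bytes, UTF_8).split('|')
    Reading(parts(0), parts(1).toDouble)
  }
}
```

The key property to test in any SerDe is the round trip: deserializing a serialized record yields the original record.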

Level 4

| Role | Required | Description |
|---|---|---|
| Backend Developer (Scala) | Required | Defines stream-processing standards for the Scala team: choosing between Kafka Streams, FS2-Kafka, and Akka Streams for specific use cases. Reviews stream-processor topologies, implements exactly-once processing patterns, and configures consumer-lag and processing-latency monitoring. |
| Data Engineer | Required | Defines streaming standards: Kafka Streams vs. Flink, windowing policies, and state management. Implements consumer-lag and processing-latency monitoring. Chooses between exactly-once and at-least-once delivery guarantees. |
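
The consumer-lag monitoring both roles configure at this level reduces to simple arithmetic, sketched here with hypothetical offset values rather than figures fetched from a live cluster: per partition, lag is the log-end offset minus the group's last committed offset, and total lag sums across partitions.

```scala
// Sketch of consumer-lag arithmetic. PartitionOffsets is a hypothetical
// record type; real monitoring would fetch these offsets from the broker.
final case class PartitionOffsets(logEndOffset: Long, committedOffset: Long)

def totalLag(partitions: Seq[PartitionOffsets]): Long =
  partitions.map(p => math.max(0L, p.logEndOffset - p.committedOffset)).sum
```

Clamping at zero guards against transient cases where the committed offset briefly reads ahead of the observed log-end offset.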

Level 5 (Principal)

| Role | Required | Description |
|---|---|---|
| Backend Developer (Scala) | Required | Shapes stream-processing strategy for the Scala platform: real-time data-pipeline architecture with Kafka Streams/Flink and state-management standards. Makes decisions on stream-processing cluster scaling, defines SLAs for end-to-end latency, and drives integration with the Data Mesh approach. |
| Data Engineer | Required | Designs the data platform's streaming architecture: Kafka Streams for lightweight processing, Flink for complex CEP, and hybrid batch+streaming. Chooses between lambda and kappa architecture per scenario. |
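
The lambda-vs-kappa choice above hinges on the serving merge lambda requires and kappa avoids. A toy sketch of that merge, assuming both layers expose simple key-value views: a precomputed batch view is overlaid with the speed layer's fresher streaming values, which win on key conflicts.

```scala
// Toy sketch of the lambda-architecture serving merge: the speed layer's
// recent values override the batch layer's precomputed view. Kappa removes
// this merge by reprocessing everything through the streaming path.
def serve[K, V](batchView: Map[K, V], speedView: Map[K, V]): Map[K, V] =
  batchView ++ speedView
```

Maintaining two code paths that must agree on this merge is the operational cost that pushes many teams toward kappa.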