Technical Talks


CDC Stream Processing with Apache Flink

Timo Walther | Principal Software Engineer @ Confluent, PMC @ Apache Flink

An instant world requires instant decisions at scale. This includes the ability to digest and react to changes in real time. Thus, event logs such as Apache Kafka can be found in almost every architecture, while databases and similar systems still provide the foundation. Change Data Capture (CDC) has become popular for propagating changes. Nevertheless, integrating all these systems, which often have slightly different semantics, can be a challenge. In this talk, we highlight what it means for Apache Flink to be a general data processor that acts as a data integration hub. Looking under the hood, we demonstrate Flink's SQL engine as a changelog processor that ships with an ecosystem tailored to processing CDC data and maintaining materialized views. We will discuss the semantics of different data sources and how to perform joins or stream enrichment between them. This talk illustrates how Flink can be used with systems such as Kafka (for upsert logging), Debezium, JDBC, and others.
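The changelog model at the heart of Flink's SQL engine can be illustrated with a small sketch. This is plain Python, not Flink: it only mimics how a changelog processor applies CDC events to a keyed materialized view. The row-kind markers mirror Flink's internal changelog encoding (+I insert, -U update-before, +U update-after, -D delete); the event data and key names are made up for illustration.

```python
# A minimal sketch (plain Python, not Flink) of maintaining a materialized
# view from a changelog. Row kinds follow Flink's changelog model:
#   +I = insert, -U = update-before, +U = update-after, -D = delete.

def apply_changelog(view, events):
    """Apply CDC-style events to a keyed view in place and return it."""
    for kind, key, row in events:
        if kind in ("+I", "+U"):
            view[key] = row        # insert or update-after: upsert the row
        elif kind in ("-U", "-D"):
            view.pop(key, None)    # update-before or delete: retract the row
    return view

# Hypothetical stream: one order is inserted and updated, another deleted.
events = [
    ("+I", "order-1", {"customer": "a", "amount": 10}),
    ("+I", "order-2", {"customer": "b", "amount": 20}),
    ("-U", "order-1", {"customer": "a", "amount": 10}),
    ("+U", "order-1", {"customer": "a", "amount": 15}),
    ("-D", "order-2", {"customer": "b", "amount": 20}),
]

view = apply_changelog({}, events)
print(view)  # {'order-1': {'customer': 'a', 'amount': 15}}
```

In Flink itself, this bookkeeping happens inside the SQL runtime: a source in a CDC format (e.g. Debezium) emits such change rows, and a continuous query keeps the derived view consistent as changes arrive.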

Timo Walther
Principal Software Engineer @ Confluent, PMC @ Apache Flink

Timo Walther is a long-term member of the Project Management Committee and among the top committers in the Apache Flink project. He studied Computer Science at TU Berlin and was a member of the Database Group there. Timo worked as a software engineer at Data Artisans and as lead of the SQL team at Ververica. He was a co-founder of Immerok, which was acquired by Confluent in 2023. Within Flink, he works on various topics in the Table & SQL ecosystem to make stream processing accessible to everyone.