Organizations have long sought to unlock the value of the data stored in their operational systems by building business intelligence solutions on top of centralized data management architectures, such as data warehouses and data lakes. Despite the energy and investment poured into these initiatives, most have failed to realize their full potential. ThoughtWorks' Zhamak Dehghani's observations of these traditional architectures' failure modes inspired her to develop an alternative big data management architecture that she aptly named the Data Mesh. It represents a paradigm shift that draws on modern distributed architecture and is founded on the principles of domain-driven design, self-serve platform design, and product thinking applied to data. Over the last decade, Apache Kafka has established a new category of data management infrastructure for data in motion and has become the de facto standard in modern distributed data architectures.
In this session, we will explore how the foundational principles of the Data Mesh paradigm can be supported by a Kafka-based event streaming architecture designed to act as intelligent connective tissue, enabling real-time data from multiple sources to stream continuously across the organization.