During EDA Summit 2022, Solace chief technology solutions officer Sumeet Puri did a short session on the topic of Kafka Mesh. It was so popular that he and colleague Thomas (TK) Kunnumapurath ran a follow-up session on September 29th to explain the concept in more detail and demonstrate how it works.
You can watch the full video here, or read on for an in-depth recap of the session.
Before the session got started, the audience was polled to determine which version of Kafka they were currently using in their environment. Unsurprisingly, the results showed Apache Kafka was the most prevalent at 69%, followed by Confluent at 28% and Amazon MSK at 10%, with the remainder using either Cloudera or Azure Event Hubs.
Sumeet made some fair points about how enterprises today are more distributed than ever, with edge and cloud gaining traction alongside existing core systems and assets.
With the acknowledgement that network is now faster than disk in most cases, and thousands (or millions) of devices are needing connectivity, streaming is becoming the de facto paradigm for data exchange throughout organizations. And for many, Kafka is their streaming platform of choice.
But these new patterns introduce new challenges:
Kafka Mesh is a network of event brokers that natively connect to Kafka clusters and solve the challenges noted above. Not only does a Kafka Mesh connect different versions and islands of Kafka clusters, but it also connects to non-Kafka applications such as microservices, IoT devices, serverless functions, API platforms, and iPaaS platforms. It ultimately liberates your Kafka data streams throughout the enterprise in a cloud-agnostic manner.
Sumeet then spoke about some sample use cases where a Kafka Mesh proves useful:
Another attendee poll was presented, asking the audience what challenges they face with their Kafka implementations. The results showed the following:
Sumeet wrapped up with the main benefits a Kafka Mesh can bring to your organization.
But how is all this possible?
Thomas (TK) then took over the presentation and proceeded to explain the critical first step of transforming Kafka records into routable events the Kafka Mesh can intelligently route throughout the network. This is done by extracting the Kafka topic metadata and mapping it into a routable topic structure.
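To make that first step concrete, here is a minimal sketch of what mapping Kafka record metadata into a hierarchical, routable topic might look like. The field names (region, retailer, product) and the resulting topic shape are hypothetical illustrations, not Solace's actual API:

```python
# Illustrative sketch: turning a flat Kafka topic plus record metadata into a
# hierarchical topic string the mesh can route on. The metadata fields used
# here (region, retailer, product) are assumed for the example.

def to_routable_topic(kafka_topic: str, record: dict) -> str:
    """Map a Kafka record onto a multi-level topic for intelligent routing."""
    return "/".join([
        kafka_topic,        # e.g. GLOBAL.ORDERS
        record["region"],   # e.g. US
        record["retailer"], # e.g. ACME
        record["product"],  # e.g. pencils
    ])

topic = to_routable_topic(
    "GLOBAL.ORDERS",
    {"region": "US", "retailer": "ACME", "product": "pencils"},
)
print(topic)  # GLOBAL.ORDERS/US/ACME/pencils
```

Once each record carries a topic like this, the mesh can route it on any combination of those levels rather than on the flat Kafka topic name alone.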
As this diagram illustrates, subscribing applications can ask intelligent questions such as “How do I get only the orders that originate in the US?” or “How can I see all orders for pencils from anywhere?”
TK proceeded to demonstrate how easy it is to get set up in the Kafka Mesh and subscribe to a particular Kafka stream. In the example above, he uses GLOBAL.ORDERS.
He used the PubSub+ “Try Me” function to mimic a client application that might subscribe to a subset, or filtered view, of the GLOBAL.ORDERS stream. In this example he subscribed first to US/ACME/* and then to UK/TJX/*, and in each case received only the records from the GLOBAL.ORDERS stream that matched those filtered topics.
In these examples, the client (or consuming) application is only sent the records that match the topic subscription, without the need for any further filtering application, as would normally be necessary in a Kafka-only environment.
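The broker-side filtering above rests on wildcard matching against the hierarchical topic. A simplified sketch of that matching logic, assuming the common convention that `*` matches exactly one topic level and a trailing `>` matches one or more remaining levels:

```python
# Minimal, illustrative topic-filter matcher: "*" matches exactly one level,
# and a trailing ">" matches one or more remaining levels. This is a
# simplification for demonstration, not the broker's actual implementation.

def matches(subscription: str, topic: str) -> bool:
    sub_levels = subscription.split("/")
    top_levels = topic.split("/")
    for i, sub in enumerate(sub_levels):
        if sub == ">":
            # ">" must be the last element and needs at least one more level
            return i == len(sub_levels) - 1 and len(top_levels) > i
        if i >= len(top_levels):
            return False
        if sub != "*" and sub != top_levels[i]:
            return False
    return len(sub_levels) == len(top_levels)

print(matches("US/ACME/*", "US/ACME/pencils"))            # True
print(matches("US/ACME/*", "UK/TJX/pencils"))             # False
print(matches("order/shipped/>", "order/shipped/sku123")) # True
```

Because the broker evaluates subscriptions like this itself, each consumer receives only the matching records and never has to discard the rest.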
TK closed his section by echoing Sumeet and showing some key benefits Kafka Mesh users get when working in the manner described in the previous section:
Sumeet closed out the session with a great analogy for how to think of event-based subscriptions. Consider how you effectively subscribe to various #hashtags on Twitter in order to see a filtered view of everything in your Twitter stream (e.g. #vegan, #crypto). That is pretty much the same concept as subscribing to granular topics in a pub/sub-based event-driven system (e.g. order/shipped/*/sku123/>, order/cancel/>).
As the leader of Solace’s analyst relations program, Roger has leveraged his technical background and analyst relations experience to develop a solid technical understanding of event-driven architecture, event streaming and event management that he uses to help analysts understand the many ways EDA and events can enable, complement, and accelerate the latest trends in computing and technology.
After kicking off his career as a programmer and professional services consultant, Roger shifted his focus to analyst relations with business intelligence innovator Cognos and IBM’s Business Analytics and Global Business Services consulting division when IBM acquired them, and for 20 years now he’s been working alongside Gartner, Forrester, IDC and other analysts to help them understand his employers’ position in the enterprise IT landscape.
Married for 30 years, Roger is the proud father of 3 adult children and enjoys DIY projects around the house and spending quiet time at the lake.