An architect once told us that managing his Kafka event streams was like playing a game of data whack-a-mole. He couldn’t determine which event streams were available, which applications were consuming which events, who owned or produced the events, and what schemas they were using. So many events popping up and no way to visualize or manage what was going on underneath it all.
If you attended the Kafka Summit 2020 virtual conference, you've probably already been introduced to our new toolset, which discovers event streams running through your Kafka brokers and imports them into our graphical tool, PubSub+ Event Portal. It lets you visualize and manage your data flow architecture between applications and across lines of business. What we demonstrated at the summit was the beta version; today we are announcing that these new capabilities for Kafka event streams are generally available! Our developers have been working feverishly, incorporating feedback from our beta users, including a very large financial services provider, to bring it across the finish line.
I won't bore you with every detail of how PubSub+ Event Portal for Kafka stops that endless game of whack-a-mole and gives Kafka users some peace of mind, but in case you don't follow our blog (we're all busy these days, right?), here's a brief summary of what you can do with these new capabilities:
Does that sound like something that could save you some time? For more technical detail on each capability, I encourage you to read the post by my colleague Ush Shukla: Using PubSub+ Event Portal for Support and Visualization of your Kafka Streams.
If you’re more of a visual learner, Jonathan Schabowsky (senior architect in the office of the CTO) shows PubSub+ Event Portal for Kafka in action in the demo How to discover, visualize, catalog, share and reuse your Kafka Streams.
If you're already convinced, you can try it out for free by visiting our PubSub+ Event Portal for Kafka sign-up page and getting started today!
We've created a codelab that walks you through the process step by step: you'll use our discovery agent to scan a Kafka cluster (provided by Solace), then import and view the discovered data.
These are the steps that the codelab will walk you through:
You can also skip ahead and scan your own Kafka clusters to populate PubSub+ Event Portal with your event-driven architecture and see what you've been missing!
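To make concrete the kind of insight a scan surfaces, here's a minimal Python sketch (hypothetical function and sample data, not the actual Solace discovery agent) that inverts consumer-group subscription metadata into a "which applications consume which events" view, the very question from that game of whack-a-mole. In a real scan, this metadata would be pulled from the cluster itself, for example via a Kafka admin client.

```python
# Hypothetical sketch: invert consumer-group subscriptions into a
# topic -> consuming-applications map. This only illustrates the shape
# of the metadata a discovery scan surfaces; the Solace discovery agent
# gathers and imports this automatically.

def build_topic_consumer_map(subscriptions):
    """subscriptions: {consumer_group: [topic, ...]} -> {topic: [group, ...]}"""
    topic_map = {}
    for group, topics in subscriptions.items():
        for topic in topics:
            topic_map.setdefault(topic, []).append(group)
    return topic_map

# Sample data standing in for metadata fetched from a Kafka cluster.
subscriptions = {
    "fraud-detector": ["payments", "logins"],
    "audit-service": ["payments"],
}

print(build_topic_consumer_map(subscriptions))
# {'payments': ['fraud-detector', 'audit-service'], 'logins': ['fraud-detector']}
```

Answering this question by hand means chasing down every team's consumer configuration; a discovery scan builds the whole map for you and keeps it current.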