Why You Need to Look Beyond Kafka for Operational Use Cases (four-part blog series):

  • Part 1: The Need for Filtering and In-Order Delivery
  • Part 2: The Importance of Flexible Event Filtering
  • Part 3: Securing the Eventing Platform
  • Part 4: Streaming with Dynamic Event Routing

Videos:

  • Topic vs. Topic: Solace PubSub+ and Apache Kafka
  • Event Topics: Kafka vs Solace Implementation

You use Kafka.
You’re probably wondering why you need Solace PubSub+.

Apache Kafka excels at event streaming for storage and analytic use cases like:

  • Log aggregation for analytics
    Many people use Kafka as a replacement for a log aggregation solution. Log aggregation typically collects physical log files off servers and puts them in a central place (a file server or HDFS perhaps) for processing. Kafka abstracts away the details of files and gives a cleaner abstraction of log or event data as a stream of messages. This allows for lower-latency processing and easier support for multiple data sources and distributed data consumption. In comparison to log-centric systems like Scribe or Flume, Kafka offers equally good performance, stronger durability guarantees due to replication, and much lower end-to-end latency.
  • Event stream processing
    Many users of Kafka process data in pipelines consisting of multiple stages, where raw input data is consumed from Kafka topics and then aggregated, enriched, or otherwise transformed into new topics for further consumption or follow-up processing. For example, a processing pipeline for recommending news articles might crawl article content from RSS feeds and publish it to an “articles” topic; further processing might normalize or deduplicate this content and publish the cleansed article content to a new topic; a final processing stage might attempt to recommend this content to users. Such processing pipelines create graphs of real-time data flows based on the individual topics. Starting in 0.10.0.0, a lightweight but powerful stream processing library called Kafka Streams is available in Apache Kafka to perform such data processing as described above. Apart from Kafka Streams, alternative open source stream processing tools include Apache Storm and Apache Samza.
  • Event sourcing
    Event sourcing is a style of application design where state changes are logged as a time-ordered sequence of records. Kafka’s support for very large stored log data makes it an excellent backend for an application built in this style.
  • Website activity tracking
    The original use case for Kafka was to be able to rebuild a user activity tracking pipeline as a set of real-time publish-subscribe feeds. This means site activity (page views, searches, or other actions users may take) is published to central topics with one topic per activity type. These feeds are available for subscription for a range of use cases including real-time processing, real-time monitoring, and loading into Hadoop or offline data warehousing systems for offline processing and reporting. Activity tracking is often very high volume as many activity messages are generated for each user page view.
  • Aggregating statistics from distributed applications for metrics / monitoring
    Kafka is often used for operational monitoring data. This involves aggregating statistics from distributed applications to produce centralized feeds of operational data.
    From: http://kafka.apache.org/uses
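The event sourcing pattern described above can be sketched in a few lines: state is never stored directly, but rebuilt by replaying a time-ordered log of change events, which is exactly what a Kafka topic with long retention provides. The names below (`apply_event`, `replay`, the event shapes) are illustrative, not any real API.

```python
# Event sourcing sketch: rebuild state by folding over an ordered event log.
from dataclasses import dataclass, field

@dataclass
class Account:
    balance: int = 0
    history: list = field(default_factory=list)

def apply_event(state: Account, event: dict) -> Account:
    """Apply one change event to the current state."""
    if event["type"] == "deposited":
        state.balance += event["amount"]
    elif event["type"] == "withdrawn":
        state.balance -= event["amount"]
    state.history.append(event["type"])
    return state

def replay(events) -> Account:
    """Rebuild state from scratch by replaying the ordered log."""
    state = Account()
    for event in events:
        state = apply_event(state, event)
    return state

# In practice this log would be consumed from a Kafka topic, in order.
log = [
    {"type": "deposited", "amount": 100},
    {"type": "withdrawn", "amount": 30},
    {"type": "deposited", "amount": 5},
]
current = replay(log)
print(current.balance)  # 75
```

Because the log, not the state, is the source of truth, the same replay can rebuild state on a new consumer, which is why in-order delivery matters for this pattern.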

PubSub+ Platform excels at event streaming and event management for transactional and operational use cases like:

  • Event-driven microservices
    PubSub+ Platform uniquely supports the adoption of event-driven microservices – a software development technique in which an application is broken down into a set of services that function in concert to fulfill a request. PubSub+ Event Portal gives developers and architects the tools they need to design and discover events and event-driven microservices, and to define, model and visualize event-driven systems. PubSub+ Event Broker enables robust, secure, event-driven interactions between the services, with built-in support for fine-grained filtering, fine-grained authentication/authorization, mature session-based transactions, support for multiple open standard protocols and APIs (including Spring Cloud Streams), and more. Learn more about PubSub+ for microservices
  • Hybrid/multi-cloud event distribution and management
    Enterprises choose PubSub+ Event Broker for event streaming across hybrid cloud (public or private cloud plus data center) and multi-cloud (multiple public clouds, or a combination of public and private clouds) architectures for its enterprise-grade features, which are built into the event broker and can be extended across a distributed enterprise via an event mesh (created by connecting event brokers across environments). Support for a variety of open standard protocols and APIs, and for multiple message exchange patterns, makes it straightforward to connect and enable event-driven communication between legacy applications and cloud-native applications and services. Built-in dynamic routing, high availability, disaster recovery, WAN optimization, security, performance with TLS, and topic filtering ensure the right data is streamed to the right location securely and efficiently, and all of these features and processes can be managed from a centralized console. Complementing PubSub+ Event Broker, PubSub+ Event Portal enables full lifecycle management of events within a hybrid/multi-cloud ecosystem. Learn more about PubSub+ for hybrid cloud
  • Event-driven integration
    Enterprises use PubSub+ to ‘event-enable’ their integration platform as a service (iPaaS), where event brokers route business events emitted by applications and devices from across the distributed enterprise to a standardized integration toolset that simplifies common integration tasks such as data transformation and connectivity. PubSub+ Platform is ideally suited for this use case. PubSub+ Event Broker can be deployed in any and all environments (public cloud, private cloud, data center), and connected to form an event mesh that underpins an enterprise’s integration layer. With the broker’s support for multiple open standard protocols and APIs, and for multiple message exchange patterns, a PubSub+-enabled event mesh can listen for and transmit events emitted by a wide variety of applications and devices (legacy, cloud native, IoT, etc.) to an iPaaS, whenever and however might be necessary. PubSub+ also has purpose-built connectors for iPaaS providers like Dell Boomi. Learn more about PubSub+ for integration
  • IoT enablement
    Many enterprises use PubSub+ Event Broker to support massively scaled IP-addressed applications, and the IoT ecosystems that surround them. PubSub+ Event Broker can terminate hundreds of thousands of web or MQTT connections per broker, letting massively scaled open web/mobile applications stream data to analytics engines. A group of connected brokers can maintain continuous connections with millions of devices, and advanced features and functionality in the broker enable a variety of messages (event notifications, commands, requests, etc.) to be intelligently filtered and routed between back-end systems, web applications, and devices. Learn more about PubSub+ for IoT
  • Real-time applications
    Known for its low and predictable latency, robustness and security, PubSub+ Event Broker (available as an appliance in addition to software and as-a-service) has been used to support mission-critical applications for over a decade, in verticals like capital markets, gaming, and public transportation (air, rail, sea). Learn more about PubSub+ for real-time apps
  • Event discovery, visualization and cataloging
    In any large enterprise, there’s more at play than just Kafka-native applications. You’ve got cloud providers, legacy systems, and IoT devices with event streams that need to be managed. PubSub+ Event Portal for Kafka can manage these heterogeneous event streams and make it easy for your enterprise to discover, visualize, catalog, and share your Apache Kafka event streams, including those from Confluent and Amazon MSK. Learn more about Event Portal for Kafka
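The fine-grained filtering mentioned above rests on hierarchical topics: Solace topics are '/'-separated levels, and a subscription can use wildcards so a consumer receives exactly the events it needs. Here is a simplified, self-contained sketch of that matching, assuming only the two most common wildcard forms: `*` matching exactly one level, and a trailing `>` matching one or more remaining levels (the real broker syntax has additional rules, such as `abc*` prefix matching, not modeled here).

```python
# Simplified Solace-style topic matching: '*' = exactly one level,
# trailing '>' = one or more remaining levels.
def matches(subscription: str, topic: str) -> bool:
    sub_levels = subscription.split("/")
    top_levels = topic.split("/")
    for i, sub in enumerate(sub_levels):
        if sub == ">":  # only valid as the last level; matches the rest
            return i == len(sub_levels) - 1 and len(top_levels) > i
        if i >= len(top_levels):
            return False
        if sub != "*" and sub != top_levels[i]:
            return False
    return len(top_levels) == len(sub_levels)

# A consumer interested only in London FX trades (topic scheme invented):
print(matches("trade/fx/london/*", "trade/fx/london/gbpusd"))  # True
print(matches("trade/fx/london/*", "trade/eq/london/vod"))     # False
print(matches("trade/fx/>", "trade/fx/tokyo/usdjpy"))          # True
```

Because the broker evaluates these subscriptions itself, only matching events cross the wire, which is what keeps bandwidth and consumer load low in the scenarios below.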

Here’s why you may need both
Solace and Kafka brokers

  1. Running real-time analytics across the hybrid cloud?

    Connect a Kafka event stream to PubSub+ Event Broker natively or using external connectors to route a filtered set of information to a cloud analytics engine. PubSub+ Event Broker keeps bandwidth and consumption low by using fine-grained filtering to deliver exactly and only the events required.

  2. Want to ingest high volume web and mobile device data to Kafka for aggregation and analytics?

    PubSub+ Event Broker supports MQTT connectivity at massive scale, enabling reliable, secure and real-time communications with tens of millions of devices or vehicles so you can stream data to Kafka for aggregation or analytics. And PubSub+ Event Broker supports a variety of popular and open standard protocols and APIs, so you can stream events to Kafka from all your applications, running in all kinds of cloud and on-premises environments.

  3. Trying to stream events recorded in Kafka to connected devices or vehicles?

    In addition to supporting the inbound aggregation of events from millions of connected devices, PubSub+ supports bi-directional messaging and the unique addressing of millions of devices through fine-grained filtering. For example, with Solace and Kafka working together you could send a tornado warning alert to a specific vehicle, or to all vehicles in or approaching the affected area. The Sink Connector allows PubSub+ to send records placed in a single Kafka topic to whatever vehicles satisfy a given condition or topic, whether that’s as general as being in the tri-county area or as specific as a single vehicle.
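The routing idea behind scenario 3 can be sketched without a broker: a record consumed from one Kafka topic is republished to a hierarchical topic built from its payload, so each vehicle's subscription determines what it receives. The topic scheme and field names below are invented for illustration; the real Sink Connector's configuration differs.

```python
# Hypothetical sketch: build a per-record hierarchical publish topic
# of the form alert/<type>/<region>/<vehicle-or-all> from a Kafka record.
def alert_topic(record: dict) -> str:
    """Derive the outbound topic from the record's payload fields."""
    return "alert/{type}/{region}/{target}".format(
        type=record["alert_type"],
        region=record["region"],
        target=record.get("vehicle_id", "all"),  # no vehicle_id -> area-wide
    )

# One Kafka record becomes an area-wide alert, another targets one vehicle.
area_record = {"alert_type": "tornado", "region": "tri-county"}
unit_record = {"alert_type": "recall", "region": "tri-county", "vehicle_id": "VIN123"}

print(alert_topic(area_record))  # alert/tornado/tri-county/all
print(alert_topic(unit_record))  # alert/recall/tri-county/VIN123
```

A vehicle subscribed to `alert/*/tri-county/>` would then receive both alerts, while one subscribed only to its own VIN level would receive just the second.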

Learn more about PubSub+ Connectors for Kafka
Take me to PubSub+ Connector Hub

And if you need to discover, visualize, and catalog your Kafka event streams, then add PubSub+ Event Portal

You may be thinking, “Event Portal sounds great, but we use Kafka brokers, not Solace”.

Well, we’ve got good news! PubSub+ Event Portal has a flexible discovery agent, and you don’t need Solace event brokers to use it: it can discover your Kafka streams directly.


Learn more about Event Portal for Kafka

Quick Start Guides

  • Quick Start: Solace PubSub+ Connector for Kafka (Sink)
  • Quick Start: Solace PubSub+ Connector for Kafka (Source)
