Summary: The capability to event-enable and stream SAP data across the distributed enterprise is becoming increasingly important, but most enterprises are not well equipped to do so. A solution is to use integration technology that can be embedded in SAP systems to emit events to an event broker; a network of event brokers can then connect across environments to create an event mesh. The event mesh enables events to flow between SAP and non-SAP applications, on premises and in the cloud, dynamically and in real-time.
“Event-driven architecture (EDA) is a design paradigm in which a software component executes in response to receiving one or more event notifications. EDA is more loosely coupled than the client/server paradigm because the component that sends the notification doesn’t know the identity of the receiving components at the time of compiling.” —Gartner
Let’s start with why event-driven design patterns and event-driven architecture (EDA) are becoming increasingly important to enterprises everywhere:
Customers (both external and internal to the enterprise) expect and demand to be able to interact with the business in real-time.
The promise of an event-driven architecture is to make businesses more real-time in their operations and customer interactions. This is done by enabling every software component in the system to publish and subscribe to event notifications in real-time (where an ‘event’ is a change of state: data is created, modified or deleted).
In Gartner’s terms, “EDA delivers some critical aspects of digital business: real-time intelligence and responsiveness, extensibility of applications, global scale, and ‘lossless’ business activity analysis.” The value of data diminishes over time, and businesses that want to make the most of it will put it in motion the instant it is created. You can read more about the business case for event-driven architecture here.
Specifically, for enterprises with SAP estates, there are a number of use cases/projects that may be on your radar which would benefit from an event-driven design paradigm. Here are a few:
Each of these use cases (and there are many more) is best served with an event-driven architecture. But adopting an EDA, particularly one that includes legacy SAP estates on premises, can be challenging.
In the cloud, event-driven data movement between SAP applications can be enabled with SAP Enterprise Messaging, SAP’s core event streaming fabric. Most SAP S/4HANA modules can publish events to SAP Enterprise Messaging in the cloud.
On premises, however, the process to extract and event-enable SAP data from ERP (e.g., ECC modules for Materials Management and Sales & Distribution) can be much more difficult. You might think to use SAP middleware like PI or PO to do this, but these technologies were not designed to support EDA.
Alternatively, you might try to leverage existing ESB/messaging technology like TIBCO BW/EMS or IBM WebSphere/MQ, but in these cases you are limited to accessing data that SAP APIs expose, and in the manner (request-reply, batch processing) that the APIs allow.
In both cases, even if you somehow managed to event-enable SAP systems in a data center, there’s the added challenge of efficiently streaming those events to the many applications that might be interested in receiving them. The SAP and non-SAP applications may run in other data centers, plants, or in clouds, all of which can be globally distributed.
Then there are the questions that might not be burning for you today, but are likely to be right around the corner: How do you manage and optimize the flow of events in your system? How do you enable developers to design and discover events? How do you manage events across their lifecycle so you can reuse them, monitor their use, and deprecate them when they are no longer needed?
So, in trying to adopt an EDA that incorporates SAP data, you’re really facing challenges on multiple fronts: event-enabling on-premises SAP systems in the first place, streaming those events efficiently across a distributed enterprise of data centers and clouds, and managing, discovering, and governing events across their lifecycle.
Below, I’ll suggest a solution to all three challenges, but I’ll address the second challenge first, because that really gets to the heart of enabling EDA for the distributed enterprise.
An event mesh is a foundational layer of technology for event streaming across the distributed enterprise. It is an architecture layer that allows events from one application to be dynamically routed and received by any other application, no matter where the applications are deployed: on premises (“no cloud”), in a private cloud, or in a public cloud.
You create an event mesh by deploying and connecting event brokers (modern messaging middleware) across all your environments – wherever you have applications that need access to events created throughout the enterprise.
Solace is a leading proponent of the event mesh concept. The concept has been highlighted as a “digital business platform priority” by Gartner [i]. Solace PubSub+ Event Broker is available as run-anywhere software, purpose-built hardware, and as a managed cloud service.
Event mesh in action: an architectural layer that enables events to be transmitted in real time between all your distributed technologies.
Here’s how it works: an event broker will be deployed in an environment, and various applications will connect to it. Some will publish events (and other data) to it, and some will subscribe to various events. Then, as events are published and pushed to the broker, applications subscribing to those events will receive them, without delay. That’s the story for a single, local environment – it’s a basic publish-subscribe messaging paradigm.
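To make the pattern concrete, here is a minimal, in-process sketch of that publish-subscribe exchange in Python. It is purely illustrative: the topic name and payload are invented, and a real event broker such as PubSub+ adds networking, persistence, security, and wildcard subscriptions on top of this basic mechanism.

```python
# A minimal, in-process sketch of the publish-subscribe exchange described
# above. Topic names and payloads are invented for illustration; a real event
# broker adds networking, persistence, security, and wildcard subscriptions.
from collections import defaultdict


class Broker:
    """Tracks which callbacks have registered interest in which topics."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        # A subscriber registers its interest with the (local) broker.
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        # The publisher never learns who, if anyone, receives the event.
        for callback in self._subscribers[topic]:
            callback(event)


broker = Broker()
broker.subscribe("sales/order/created", lambda e: print("shipping app got:", e))
broker.subscribe("sales/order/created", lambda e: print("billing app got:", e))
broker.publish("sales/order/created", {"order_id": 4711, "status": "created"})
```

Note that the publisher and the subscribers never reference each other directly; the broker is the only shared dependency, which is what keeps the coupling loose.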
Now imagine this happening in multiple environments—including data centers, public clouds, private clouds, and across different geographies and cloud regions. Because the event brokers running in the different environments are interoperable, they can be connected seamlessly. The nature of the network is such that when a new event publisher or subscriber registers its interest/intent with a local broker, all the brokers in the system know about it. This enables the mesh to intelligently route events between producer and consumer applications; no matter where the applications touch the mesh, events will be routed automatically, dynamically, and in the most efficient way possible.
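As a rough sketch of that routing behavior, the toy broker idea above can be extended so that connected brokers advertise subscription interest to one another and forward events only where a subscriber exists. The class, topics, and linking below are assumptions made for this example; real mesh routing in PubSub+ handles transitive propagation, wildcards, and loop prevention, but the principle is the same.

```python
# Illustrative sketch: two connected brokers share subscription interest so
# events published in one environment reach subscribers in another. Names and
# topics are assumptions made for the example, not a real product API.
from collections import defaultdict


class MeshBroker:
    def __init__(self, name):
        self.name = name
        self._local = defaultdict(list)           # topic -> local callbacks
        self._remote_interest = defaultdict(set)  # topic -> peer brokers to forward to
        self._peers = []

    def link(self, other):
        # Connect two brokers, e.g. one in a data center and one in a cloud region.
        self._peers.append(other)
        other._peers.append(self)

    def subscribe(self, topic, callback):
        self._local[topic].append(callback)
        # Advertise the new interest so peers know to forward matching events here.
        for peer in self._peers:
            peer._remote_interest[topic].add(self)

    def publish(self, topic, event):
        # Deliver locally, then forward only to brokers with interested subscribers.
        for callback in self._local[topic]:
            callback(event)
        for peer in self._remote_interest[topic]:
            for callback in peer._local[topic]:
                callback(event)


on_prem = MeshBroker("dc-broker")   # broker next to the SAP estate
cloud = MeshBroker("cloud-broker")  # broker next to a cloud-native app
on_prem.link(cloud)

cloud.subscribe("sap/material/changed", lambda e: print("cloud app got:", e))
on_prem.publish("sap/material/changed", {"material": "M-100", "plant": "1000"})
```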
All of this works like a dream as long as you have a way to connect your application to a local event broker where it can register for events it wants to publish or receive. But as noted above, there are numerous challenges to enabling this for SAP applications on premises. You will need some form of integration technology to get on-prem SAP system events onto the mesh.
ASAPIO Cloud Integrator (ACI) is an “add-on” for SAP systems that enables data integration between on-premises and cloud-based applications. Unlike SAP PI/PO or legacy ESBs/messaging, ACI is not middleware; it runs embedded in SAP systems and retrieves data natively. With ACI you can set up native event triggers on an SAP object and push the resulting events to an event broker/event mesh in real time.
ACI uses SAP “change pointers” to extract events from SAP systems as they occur. Anything that SAP change pointers support can be emitted as events by ECC, including master data events, PLM updates, delivery notifications, and work orders. Here’s a visual representation of ACI and Solace PubSub+ event brokers at work:
You can learn more about ACI and how it works with PubSub+ Event Broker here.
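To give a feel for what one of those events might look like on the mesh, here is a purely illustrative topic and payload for a material master change picked up via change pointers. The topic taxonomy and field names are assumptions made for this sketch, not an ACI- or SAP-defined schema; real deployments define their own.

```python
# Hypothetical example of an event emitted for a material master change.
# Topic structure and payload fields are invented for illustration only.
import json

topic = "sap/ecc/materialmaster/changed/v1/M-100"  # assumed hierarchical topic naming

event = {
    "object_type": "MATMAS",   # message type behind the change pointer (illustrative)
    "change_type": "UPDATE",   # created / updated / deleted
    "material": "M-100",
    "plant": "1000",
    "changed_at": "2021-03-01T10:15:00Z",
}

# ACI would push this to the local event broker; here we just print the shape.
print(topic)
print(json.dumps(event, indent=2))
```

Hierarchical topics like this let downstream applications use wildcard subscriptions to receive exactly the subset of SAP events they care about.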
You likely won’t run into the basic questions and roadblocks that slow broader adoption until you’ve started to deploy EDA for a few select use cases. Questions like:
The fact of the matter is that adopting an EDA that extends beyond one or a few select use cases can be challenging today: EDA still lacks the kind of tooling and infrastructure that supports the status quo of REST/API-based architecture.
But new tools and models for deploying and managing EDA are emerging.
In his post, Event-Driven Architecture Demands Event Lifecycle Management, Senior Architect and EDA guru Jonathan Schabowsky recommends four categories of tools and capabilities that would enable faster and more efficient adoption of EDA: tools for (1) event discovery, (2) event definition, (3) event development, and (4) operations.
In a nutshell, Jonathan says that enterprises need tools and infrastructure similar to those associated with API management, but designed specifically for the event-driven, asynchronous world. Things like event brokers (which act like an API gateway), an event portal (similar to a service portal, but for events), and code generators make the creation of event-driven applications faster and easier.
Solace PubSub+ Platform includes components that address many of these needs, including an event portal and event mesh management capabilities. PubSub+ Event Portal gives developers and architects tools to design, describe, and discover events within their systems. You can see the relationships between applications and events, which makes event-driven applications and microservices easier to design, deploy, and evolve. Download this datasheet to learn more or check out this demo video.
Event-driven design patterns and architectures are becoming increasingly important to enterprises that want to enable real-time B2B and B2C interactions. But adopting EDA is not easy today, particularly for enterprises looking to incorporate legacy SAP estates. Event mesh, event brokers, and embeddable integration technology are tools that can help accelerate EDA adoption for any enterprise, including those powered by SAP.
You may be wondering where to start.
Denis King, Solace’s Chief Executive Officer, recommends six steps:
In my next blog, I’ll provide more of the technical details on how to deploy PubSub+ and ASAPIO Cloud Integrator together to improve business processes.
[i] Source: Gartner, “The Key Trends in PaaS and Platform Architecture,” 28 Feb 2019, Yefim Natis, Fabrizio Biscotti, Massimo Pezzini, Paul Vincent