Integrate Apache Kafka, an open-source, distributed event streaming platform for real-time data feeds, into your existing SAP Business Suite or S/4HANA Private or Public Cloud landscape.
The following diagram illustrates a scenario with Apache Kafka as a message broker, integrating non-SAP business application events with SAP business systems. For easy scalability outside of the SAP landscape, the Kafka message queue client is deployed in Docker containers on a hyperscaler such as Amazon Web Services. A second option is to connect Integration Suite iFlows to Kafka directly via the standard Kafka adapter.
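For illustration, the following is a minimal sketch of such a Kafka message queue client as it could run inside a Docker container, written in Java with the standard Kafka consumer API. The broker address, topic name, and consumer group are placeholder assumptions, and the forwarding call to the Integration Suite endpoint is only indicated by a comment.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class BusinessEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address, group id, and topic are assumptions; replace with your Kafka cluster settings.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.example.com:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "sap-integration-client");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("business-events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // Here the client would forward each event to the Integration Suite iFlow
                    // endpoint via HTTPS (call omitted in this sketch).
                    System.out.printf("Received event %s: %s%n", record.key(), record.value());
                }
            }
        }
    }
}
```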
SAP Integration Suite contains the event processing logic and transforms the incoming messages into SAP interfaces. For data replication scenarios such as orders or products, the events can be mapped to standard formats like IDocs, so that standard tools are available to reprocess or modify the messages in the target system in case of failure.
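As a simplified illustration of such a mapping, the Java sketch below builds an ORDERS05-style IDoc fragment from a few fields of an incoming order event. The chosen fields and the reduced segment structure (no control record, only a header and a partner segment) are assumptions for illustration; a productive mapping would normally be modeled in the Integration Suite message mapping itself.

```java
import java.io.StringWriter;

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;

import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class OrderEventToIdocMapper {

    // Maps a few fields of an incoming order event to a simplified ORDERS05-style structure.
    public static String map(String orderNumber, String customerId) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder().newDocument();

        Element idoc = doc.createElement("ORDERS05");
        doc.appendChild(idoc);

        Element header = doc.createElement("E1EDK01");   // header segment
        Element belnr = doc.createElement("BELNR");      // document number
        belnr.setTextContent(orderNumber);
        header.appendChild(belnr);
        idoc.appendChild(header);

        Element partner = doc.createElement("E1EDKA1");  // partner segment
        Element partn = doc.createElement("PARTN");      // partner number (customer)
        partn.setTextContent(customerId);
        partner.appendChild(partn);
        idoc.appendChild(partner);

        StringWriter out = new StringWriter();
        var transformer = TransformerFactory.newInstance().newTransformer();
        transformer.setOutputProperty(OutputKeys.INDENT, "yes");
        transformer.transform(new DOMSource(doc), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(map("4500001234", "0000100001"));
    }
}
```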
For a smaller integration (such as submitting an opt-in without changing an entire master data record), using a RAP application deployed in the SAP BTP ABAP Environment is a modern option. The RAP application can be designed to perform calls to standard or custom BAPIs in the backend system.
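The sketch below illustrates how such a RAP service could be called over HTTP from the integration layer. The service path, payload fields, and authentication details are hypothetical placeholders; the actual OData interface is defined by the RAP application in the BTP ABAP Environment.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OptInServiceClient {
    public static void main(String[] args) throws Exception {
        // Host, service path, payload fields, and token are placeholder assumptions.
        String payload = """
                {"BusinessPartner": "0000100001", "OptIn": true, "Channel": "EMAIL"}""";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://my-abap-env.example.com/sap/opu/odata4/sap/zoptin_srv/0001/OptIn"))
                .header("Content-Type", "application/json")
                .header("Authorization", "Bearer <token>")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```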
A simpler scenario without the extra RAP layer is to make RFC Adapter calls to the backend system directly from the Integration Suite iFlow.
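The RFC adapter call itself is configured declaratively in the iFlow rather than coded, but for reference, a direct RFC call to a backend BAPI looks roughly like the following sketch (via SAP JCo, using BAPI_SALESORDER_GETSTATUS and the destination name purely as examples).

```java
import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoException;
import com.sap.conn.jco.JCoFunction;

public class BackendBapiCaller {
    public static void main(String[] args) throws JCoException {
        // Destination name and BAPI are examples; the Integration Suite RFC adapter
        // performs the equivalent call based on its configuration.
        JCoDestination destination = JCoDestinationManager.getDestination("S4_BACKEND");

        JCoFunction function = destination.getRepository().getFunction("BAPI_SALESORDER_GETSTATUS");
        if (function == null) {
            throw new IllegalStateException("BAPI not found in backend repository");
        }
        function.getImportParameterList().setValue("SALESDOCUMENT", "0000012345");
        function.execute(destination);

        System.out.println(function.getTableParameterList().getTable("STATUSINFO").getNumRows() + " status rows returned");
    }
}
```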
When business events originate in the SAP backend and need to be published to Apache Kafka, REST adapter interfaces in SAP Integration Suite can be called from the business backend to submit the event data (for example, a goods issue performed in the warehouse). The iFlow forwards the message to a REST API supplied by the Kafka queue client, which publishes the message to the Apache Kafka server.
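The publish step performed by the Kafka queue client could look like the following minimal Java producer sketch. Broker address, topic name, and event payload are assumptions for illustration; in the scenario above this code would sit behind the REST API called by the iFlow.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class GoodsIssuePublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address, topic, and payload are placeholder assumptions.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.example.com:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        String event = """
                {"eventType": "GoodsIssue", "deliveryNumber": "0080001234", "plant": "1000"}""";

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("sap-goods-issue", "0080001234", event),
                    (metadata, exception) -> {
                        if (exception != null) {
                            exception.printStackTrace();
                        } else {
                            System.out.println("Published to " + metadata.topic()
                                    + " partition " + metadata.partition());
                        }
                    });
            producer.flush();
        }
    }
}
```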
YUnit has experience supporting a major B2C SAP Business Suite landscape in its migration to event-enabled processes and therefore knows the challenges and solutions involved in making such projects a success.