Message Brokering: Kafka vs DataHub

I’ve gone through a Mendix presentation on microservices that covers how to use Kafka as a message broker for sending and receiving data between applications. I’ve also recently tried out the Data Hub feature in Mendix. They both seem to serve the same purpose. Is Data Hub just a newer offering that basically replaces Kafka? Is Kafka more robust in that it lets you create custom microflows to handle data being updated, or does Data Hub do this as well (without any sort of entity-level update triggers)?
2 answers

Data Hub is a product that offers different tools to help you connect your apps to other apps and services:

  • An important component is the Data Hub catalog which helps you discover what is available in your landscape.
  • For actual communication, Data Hub aims to support different protocols. The main focus at the moment is on synchronous communication using REST/OData. External Entities allow you to work with REST/OData endpoints without having to build integration microflows. OData specifies how REST APIs that support filtering, sorting, and pagination should be implemented. The Mendix platform uses this to automatically fetch the right data when using an External Entity.
  • Business Events offer a way to do asynchronous communication. Applications can publish events, and these are delivered in near real-time to subscribed applications, so you no longer need to poll to discover interesting data changes. Business Events use Kafka under the hood, but this is completely hidden from the low-code developer. This is the big difference from the Kafka module, where a developer needs Kafka knowledge in order to use it correctly.
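To make the OData part concrete, here is a sketch of what those query conventions look like on the wire. This is generic OData, not Mendix-specific; the service URL and entity set name are hypothetical, and the Mendix platform generates these requests for you when you use an External Entity:

```python
from urllib.parse import urlencode

def build_odata_query(base_url, entity_set, flt=None, order_by=None, top=None, skip=None):
    """Build an OData query URL using the standard $filter/$orderby/$top/$skip options."""
    params = {}
    if flt:
        params["$filter"] = flt        # filtering
    if order_by:
        params["$orderby"] = order_by  # sorting
    if top is not None:
        params["$top"] = str(top)      # page size
    if skip is not None:
        params["$skip"] = str(skip)    # offset, for pagination
    query = urlencode(params)
    return f"{base_url}/{entity_set}" + (f"?{query}" if query else "")

# Hypothetical endpoint: fetch the second page of 20 active customers, newest first.
url = build_odata_query(
    "https://example.com/odata/v1", "Customers",
    flt="Status eq 'Active'", order_by="CreatedDate desc", top=20, skip=20,
)
```

Because every compliant service interprets these options the same way, a client (or a platform like Mendix) can paginate and filter any OData endpoint without custom integration code.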


Business Events and External Entities can be used together: Business Events inform applications in real time about relevant events, and the actual data is then pulled through External Entities.
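A minimal sketch of that notify-then-pull pattern, assuming an event carries only the key of the changed object. All names here are hypothetical illustrations of the control flow, not real Mendix or Data Hub APIs, and the broker is simulated with an in-process dispatch:

```python
# Notify-then-pull: the event is a lightweight notification, and the
# subscriber fetches the full record via OData when it receives one.
subscribers = []   # stand-in for the broker's subscription registry
fetched_urls = []  # records which OData calls the subscriber would make

def subscribe(handler):
    subscribers.append(handler)

def publish(event):
    # In Data Hub this delivery runs over Kafka; here we dispatch in-process.
    for handler in subscribers:
        handler(event)

def on_customer_changed(event):
    # Pull the actual data through the (hypothetical) External Entity endpoint.
    url = f"https://example.com/odata/v1/Customers('{event['customer_id']}')"
    fetched_urls.append(url)  # a real app would issue an HTTP GET here

subscribe(on_customer_changed)
publish({"type": "CustomerChanged", "customer_id": "C-42"})
```

Keeping the payload small and pulling the data on demand means subscribers always read the current state, rather than acting on a possibly stale snapshot embedded in the event.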


Here’s a document in the evaluation guide about the “business events” capability coming to Data Hub: 


You’re right, it’s based on Kafka and offers similar functionality to the Kafka module. You can define your logic for emitting and processing messages in microflows with both solutions. The main benefits of the capability in Data Hub are:

  • There are native platform actions to publish and subscribe to events
  • Data Hub acts as a directory service to help you find published events
  • Mendix manages the message broker service (Kafka) so you don’t have to
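For contrast, using the Kafka module directly means owning broker-level configuration yourself. Here is a sketch of the kind of consumer settings a developer would need to understand, shown as a plain config dictionary in the style of the standard Kafka clients; the broker addresses and group name are hypothetical:

```python
# Typical Kafka consumer settings a developer must reason about when using
# Kafka directly; with Data Hub Business Events, Mendix manages all of this.
consumer_config = {
    "bootstrap.servers": "broker1:9092,broker2:9092",  # hypothetical broker list
    "group.id": "order-service",      # consumer group, for load balancing
    "auto.offset.reset": "earliest",  # where to start when no offset is committed
    "enable.auto.commit": False,      # commit offsets manually after processing
}
```

Each of these choices has operational consequences (rebalancing, replay, at-least-once vs at-most-once delivery), which is exactly the Kafka knowledge the Data Hub capability hides behind its platform actions.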