Mendix <> Kafka Including Avro Schemas

Hi there,

Just wondering if anybody has any experience using Avro schemas in conjunction with the Mendix Kafka module? I am not sure how to approach this implementation, so any advice would be helpful.

Kind regards,
asked
2 answers

It took a bit longer than expected due to incorrect assumptions, but eventually we managed to get it working fairly easily. Our solution consists of three parts:

  1. Mendix
  2. Mendix supplied Kafka module
  3. Custom Java code to create AVRO byte arrays


The idea behind our solution is to use as many standard Mendix components as possible and to have a generic implementation that can be reused in any project. Therefore, we create a JSON message in Mendix using the standard export mappings, and we convert this JSON message to Avro (a GenericRecord) in our custom Java code. One thing to note here is that Avro expects explicit null values for empty elements for which no default value is specified in the schema; Mendix supports this out of the box with the option ‘send empty values’ set to ‘yes, as null’ in the export mapping.
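To make the null handling concrete, here is a hypothetical Avro schema fragment (the record and field names are made up for illustration) with an optional field that declares no default value:

```json
{
  "type": "record",
  "name": "Customer",
  "fields": [
    {"name": "name", "type": "string"},
    {"name": "middleName", "type": ["null", "string"]}
  ]
}
```

Because `middleName` has no default, the JSON handed to the converter must contain `"middleName": null` rather than omit the field entirely; the ‘yes, as null’ export mapping option produces exactly that. At the time of writing, the json2avro library exposes the conversion as `JsonAvroConverter.convertToGenericDataRecord(byte[], Schema)`.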


The custom Java code uses three main libraries:

  1. https://mvnrepository.com/artifact/io.confluent/kafka-schema-registry-client
  2. https://mvnrepository.com/artifact/io.confluent/kafka-avro-serializer
  3. https://mvnrepository.com/artifact/tech.allegro.schema.json2avro/converter


We implemented two approaches: one that uses a schema registry and one that doesn’t. Without a schema registry, you have to supply the schema from the Mendix application to the custom Java code yourself, and you have to handle the magic bytes and schema IDs manually.
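To illustrate the “magic bytes and schema IDs” part: messages produced with the Confluent serializers are framed as a magic byte (0x00), a 4-byte big-endian schema ID, and then the Avro binary payload. If you bypass the schema registry serializer, you have to build that framing yourself. A minimal sketch, assuming a payload that was already Avro-encoded elsewhere (the class and method names are ours):

```java
import java.nio.ByteBuffer;

public class ConfluentFraming {
    // Confluent wire format: magic byte 0x00, 4-byte big-endian schema ID,
    // followed by the Avro binary payload.
    public static byte[] frame(int schemaId, byte[] avroPayload) {
        ByteBuffer buffer = ByteBuffer.allocate(1 + 4 + avroPayload.length);
        buffer.put((byte) 0x00);   // magic byte
        buffer.putInt(schemaId);   // schema ID as registered in the schema registry
        buffer.put(avroPayload);   // Avro-encoded record, produced elsewhere
        return buffer.array();
    }

    public static void main(String[] args) {
        byte[] payload = {1, 2, 3}; // placeholder for a real Avro binary payload
        byte[] framed = frame(42, payload);
        System.out.println(framed.length); // prints 8
        System.out.println(framed[0]);     // prints 0
    }
}
```

With the schema registry approach, the Confluent `KafkaAvroSerializer` does this framing (and the schema registration) for you.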


We had to modify the Mendix supplied Kafka module, but the changes were fairly straightforward. I duplicated a number of files and modified them to be able to handle Avro messages:

  • SendSynchronous and SendAsynchronous
  • StartConsumer and StartProducer
  • StopProducer
  • KafkaConsumerRepository and KafkaProducerRepository
  • ConsumerRunner


The modifications in these files mainly concerned changing the producer type from KafkaProducer&lt;String, String&gt; to KafkaProducer&lt;String, byte[]&gt;. In the Send* and ConsumerRunner files, I implemented the conversion from the JSON message generated in Mendix to the Avro message generated by our custom Java code. Once you have modified these files, the configuration options provided by the module are enough to make this work.
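As a rough sketch of what the type change implies for the producer configuration: the value serializer switches to ByteArraySerializer so the Avro bytes pass through to the broker unmodified. The helper name below is hypothetical; the serializer class names are the standard ones shipped with kafka-clients:

```java
import java.util.Properties;

public class AvroProducerConfig {
    // Builds producer properties for byte[] values instead of String values,
    // mirroring the KafkaProducer<String, String> -> KafkaProducer<String, byte[]> change.
    public static Properties byteArrayProducerProps(String bootstrapServers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // The value serializer is what changes: ByteArraySerializer forwards
        // the pre-serialized Avro bytes without touching them.
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.ByteArraySerializer");
        return props;
    }
}
```

These properties would then be passed to `new KafkaProducer<String, byte[]>(props)` in the duplicated repository classes.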


Hope this helps with your implementation!

answered

Hi @Kranz Bertram,

We are starting a PoC where we’re struggling with similar questions. Have you finished your development?

We’re using Confluent on our projects, so it would be interesting to know whether there is now something standard we can use.

answered