Introduction

In this post I will walk through my demo of using the contract-first methodology to build an event-driven application. This is mostly installation instructions and code, with some explanation along the way; I covered the underlying concepts in my last post.

This is a simple money transfer application that receives transfer requests from a RESTful endpoint. The goal is to process a single transfer request into a transaction record on each account, for better bookkeeping. The structure of the system is a typical event-driven implementation: topics are responsible for accepting events, and services subscribe or publish to those topics. In the demo, one topic stores the transfer request events and another stores the account record events.

[Image: contract development workflow]

Solution

Now, let's set up the foundation of every EDA: the Kafka cluster. Go ahead and download Kafka and unzip it.
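For example, on a Linux or macOS shell (the Kafka version in the URL is an assumption; substitute whichever release you want):

```
# Download and extract Kafka, then move into the distribution folder
wget https://downloads.apache.org/kafka/2.8.0/kafka_2.13-2.8.0.tgz
tar -xzf kafka_2.13-2.8.0.tgz
cd kafka_2.13-2.8.0
```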

Start up the Kafka cluster:
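Assuming a ZooKeeper-based setup (the default for this generation of Kafka), start ZooKeeper and the broker in two separate terminals:

```
# Terminal 1: start ZooKeeper
bin/zookeeper-server-start.sh config/zookeeper.properties

# Terminal 2: start the Kafka broker
bin/kafka-server-start.sh config/server.properties
```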

Then create the topics needed:
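Two topics back the demo: webtrans for the transfer requests and transrec for the account records (the names are taken from the routes below; the partition and replication settings are minimal assumptions for a local single-broker setup):

```
bin/kafka-topics.sh --create --topic webtrans \
  --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

bin/kafka-topics.sh --create --topic transrec \
  --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1
```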

From the last post, we know each topic has a schema describing its data type, format, and serialization/deserialization mechanism. These are stored in the Apicurio Registry. We will run the Apicurio Registry locally, with Kafka as its persistence store.

Start the Apicurio Registry:
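One way to run it, sketched here with the Kafka-backed Docker image; the image name and the KAFKA_BOOTSTRAP_SERVERS variable are assumptions from the Apicurio 1.x era (newer releases use a kafkasql variant), so check the registry documentation for your version:

```
# --network host lets the container reach the broker on localhost (Linux);
# on other platforms, point the variable at host.docker.internal instead
docker run -it --network host \
  -e KAFKA_BOOTSTRAP_SERVERS=localhost:9092 \
  apicurio/apicurio-registry-kafka
```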

After the Apicurio Registry starts successfully, point your browser to http://localhost:8080/ui/artifacts and upload the schema for both topics. (Note we are only registering value schemas this time, which is the common case, because keys often have a simple text representation.)

In the browser upload the Protobuf Schema with name demo-protobuf:
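A minimal sketch of what the .proto file could look like. The package, field names, and types are assumptions, but the outer class name matches the TransactionProtos.java that gets generated later:

```
syntax = "proto3";

package demo;

option java_outer_classname = "TransactionProtos";

// A money transfer request between two accounts
message Transaction {
  string sender = 1;    // account to debit
  string receiver = 2;  // account to credit
  double amount = 3;    // transfer amount
}
```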

Upload the Avro schema with name demo-avro:
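Again a sketch: the record name Transaction matches the Avro-generated Transaction.java mentioned below, while the namespace and fields are assumptions:

```
{
  "type": "record",
  "name": "Transaction",
  "namespace": "demo",
  "fields": [
    {"name": "account", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "lastupdate", "type": "long"}
  ]
}
```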

After uploading them, you will see the two schemas that describe how each topic consumes its events.

[Image: apicurio registry]

Creating the REST-to-Protobuf Camel route

[Image: protobuf camel route]

Start by creating a new Camel project:
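One way to bootstrap it, using the standard Camel Java archetype (the groupId and artifactId for the project are placeholders):

```
mvn archetype:generate \
  -DarchetypeGroupId=org.apache.camel.archetypes \
  -DarchetypeArtifactId=camel-archetype-java \
  -DgroupId=com.demo \
  -DartifactId=camel-rest-protobuf
```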

Update the pom.xml file with all the needed dependencies and plugins:

A couple of things I want to highlight here (a sketch of the relevant pom fragments follows the list):

  1. The Apicurio Registry libraries that we reference in the Kafka configuration, for serializing/deserializing data into Kafka (and for identifying the strategy of what to do with the schema).

  2. The Apicurio Registry Maven plugin for downloading the topic schema from the registry.

  3. The Protobuf Maven plugin that generates a POJO from the schema, so it's easier to handle the data in Java.
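A sketch of those three pieces. The versions, the registry artifact ID, and the exact plugin configuration keys are assumptions based on the Apicurio 1.x line and vary between releases, so adjust them to what the repo uses:

```
<!-- 1. Apicurio serializer/deserializer used by the Kafka producer -->
<dependency>
  <groupId>io.apicurio</groupId>
  <artifactId>apicurio-registry-utils-serde</artifactId>
  <version>1.3.2.Final</version>
</dependency>

<!-- 2. Download the topic schema from the registry at build time -->
<plugin>
  <groupId>io.apicurio</groupId>
  <artifactId>apicurio-registry-maven-plugin</artifactId>
  <version>1.3.2.Final</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals><goal>download</goal></goals>
      <configuration>
        <registryUrl>http://localhost:8080/api</registryUrl>
        <ids>
          <param1>demo-protobuf</param1>
        </ids>
        <artifactExtension>.proto</artifactExtension>
        <outputDirectory>${project.basedir}/src/main/proto</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>

<!-- 3. Generate the Java POJO from the downloaded .proto file
     (needs protoc available, e.g. via the os-maven-plugin extension) -->
<plugin>
  <groupId>org.xolstice.maven.plugins</groupId>
  <artifactId>protobuf-maven-plugin</artifactId>
  <version>0.6.1</version>
  <executions>
    <execution>
      <goals><goal>compile</goal></goals>
    </execution>
  </executions>
</plugin>
```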

Run the build to download the schema and generate the POJO:
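Assuming both plugins are bound to the generate-sources phase as sketched above, a plain compile triggers them:

```
mvn clean compile
```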

This downloads the schema and generates the POJO from it; notice that a TransactionProtos.java appears in the source folder. Now you know the schema of what the topic is accepting.

Add the following route in MyRouteBuilder.java (a sketch follows the list):

  1. Accepts requests on a REST endpoint.

  2. Maps the input JSON stream into Protobuf using the Camel protobuf component.

  3. Sends it to the webtrans Kafka topic, where the POJO is serialized using the Apicurio Protobuf libraries.
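A minimal sketch of the route. The undertow component, the port (8080 is already taken by the registry), the path, and the generated class's fully qualified name are all assumptions:

```java
import org.apache.camel.builder.RouteBuilder;

public class MyRouteBuilder extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // 1. Expose a REST endpoint (component and port are assumptions)
        restConfiguration().component("undertow").host("0.0.0.0").port(8090);

        rest("/transfer")
            .post()
            .to("direct:toKafka");

        from("direct:toKafka")
            // 2. Map the incoming JSON payload into the generated Protobuf class
            //    (name assumed from the generated TransactionProtos.java)
            .unmarshal().protobuf("demo.TransactionProtos$Transaction", "json")
            // 3. Publish to webtrans; the Apicurio Protobuf serializer configured
            //    in application.properties handles the wire format
            .to("kafka:webtrans");
    }
}
```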

With the configuration in the application.properties file:
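A sketch of the relevant entries, assuming camel-main style component properties and the Apicurio 1.x serde classes (the registry URL path and the property names vary by version):

```
camel.component.kafka.brokers = localhost:9092

# Serialize the message value with the Apicurio Protobuf serializer
camel.component.kafka.value-serializer = io.apicurio.registry.utils.serde.ProtobufKafkaSerializer

# Where to find the registry, and how to map a topic to its schema artifact
camel.component.kafka.additional-properties[apicurio.registry.url] = http://localhost:8080/api
camel.component.kafka.additional-properties[apicurio.registry.artifact-id] = io.apicurio.registry.utils.serde.strategy.SimpleTopicIdStrategy
```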

Start the application: 
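The archetype wires in the camel-maven-plugin, so this should be enough:

```
mvn camel:run
```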

Move on to the second Camel application. This one picks up the transfer request and splits it into two account records, so this time we have two contracts to satisfy.

[Image: apicurio registry]

Start by creating another new Camel project, using the same archetype as before.

Update the pom.xml file with all the needed dependencies and plugins:

Highlights (the Avro plugin addition is sketched after the list):

  1. The Apicurio Registry libraries that we reference in the Kafka configuration, for serializing/deserializing data into Kafka (and for identifying the strategy of what to do with the schema).

  2. The Apicurio Registry Maven plugin for downloading topic schemas from the registry. This time we download two schemas, one for each endpoint.

  3. The Protobuf Maven plugin that generates a POJO from the schema, so it's easier to handle the data in Java.

  4. The Avro Maven plugin that generates a POJO from the schema.
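The first three pieces are the same as in the previous project; the new one is the Avro plugin. A sketch, assuming the downloaded .avsc file lands in src/main/avro:

```
<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.10.2</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals><goal>schema</goal></goals>
      <configuration>
        <sourceDirectory>${project.basedir}/src/main/avro</sourceDirectory>
        <outputDirectory>${project.basedir}/src/main/java</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```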

Run the same mvn clean compile again to download both schemas and generate the POJOs from the Protobuf and Avro schemas. In the source folder, you now know the schema of what the topic is sending, through the Protobuf TransactionProtos.java POJO, and the schema of what the topic is accepting, through the Avro Transaction.java POJO.

Add the following route in MyRouteBuilder.java (a sketch follows the list):

  1. Subscribes to the webtrans Kafka topic.

  2. Converts the byte stream into the POJO.

  3. Works with the POJO and fills in the values needed.

  4. Sends the result to the transrec Kafka topic, where the POJO is serialized using the Apicurio Avro serialization libraries.
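A minimal sketch. The generated class packages and field names are assumptions carried over from the schema sketches above, and the debit/credit split follows the description in the list:

```java
import java.util.Arrays;
import org.apache.camel.builder.RouteBuilder;

import demo.TransactionProtos;  // generated from the Protobuf schema (package assumed)
import demo.Transaction;        // generated from the Avro schema (package assumed)

public class MyRouteBuilder extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // 1. Subscribe to the transfer-request topic; the Apicurio Protobuf
        //    deserializer configured in application.properties rebuilds the POJO
        from("kafka:webtrans")
            .process(exchange -> {
                // 2. Get the transfer request as the generated Protobuf type
                TransactionProtos.Transaction req =
                        exchange.getIn().getBody(TransactionProtos.Transaction.class);
                // 3. Split the single transfer into one record per account
                //    (field names are assumptions)
                Transaction debit = Transaction.newBuilder()
                        .setAccount(req.getSender())
                        .setAmount(-req.getAmount())
                        .setLastupdate(System.currentTimeMillis())
                        .build();
                Transaction credit = Transaction.newBuilder()
                        .setAccount(req.getReceiver())
                        .setAmount(req.getAmount())
                        .setLastupdate(System.currentTimeMillis())
                        .build();
                exchange.getIn().setBody(Arrays.asList(debit, credit));
            })
            // 4. Send each record to transrec; the Apicurio Avro serializer
            //    configured in application.properties encodes it
            .split(body())
            .to("kafka:transrec");
    }
}
```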

With the configuration in the application.properties file:
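A sketch mirroring the first application, now with the Protobuf deserializer on the consumer side and the Avro serializer on the producer side (class and property names again assume the Apicurio 1.x serde module):

```
camel.component.kafka.brokers = localhost:9092

# Consume webtrans: deserialize Protobuf values via the registry
camel.component.kafka.value-deserializer = io.apicurio.registry.utils.serde.ProtobufKafkaDeserializer

# Produce transrec: serialize Avro values via the registry
camel.component.kafka.value-serializer = io.apicurio.registry.utils.serde.AvroKafkaSerializer

camel.component.kafka.additional-properties[apicurio.registry.url] = http://localhost:8080/api
```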

Start the application, again with mvn camel:run.

The third application is similar to the others: it subscribes to the account record topic and places each record into MongoDB, with the account name as the key.

[Image: apicurio registry]

Log into MongoDB; currently there is nothing in the database.
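For example, with the mongo shell (the database and collection names are assumptions; use whatever your configuration points at):

```
mongo
> use demo
> db.transactions.find()
```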

Since the steps for the POM and the properties file are similar, I won't repeat them here; you can find the pom file and configuration file in my repo.

Taking a look at the simple MyRouteBuilder.java in this application (a sketch follows the list):

  1. Subscribes to the Kafka topic, deserialized with the Avro deserializer.

  2. Converts the input stream into a String.

  3. Since the stream is valid JSON, it can be sent directly to the Camel MongoDB component.
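A minimal sketch; the connection bean, database, and collection names are assumptions:

```java
import org.apache.camel.builder.RouteBuilder;

public class MyRouteBuilder extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // 1. Subscribe to the account-record topic; the Apicurio Avro
        //    deserializer is configured in application.properties
        from("kafka:transrec")
            // 2. An Avro record's toString() is valid JSON, so convert to String
            .convertBodyTo(String.class)
            // 3. Hand the JSON straight to the MongoDB component
            .to("mongodb:mongoClient?database=demo&collection=transactions&operation=insert");
    }
}
```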

Start the application with mvn camel:run.

At this point, you are ready to send a transfer request:
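For example, with curl. The port, path, and JSON field names follow the earlier sketches and must match your actual REST configuration and Protobuf schema:

```
curl -X POST http://localhost:8090/transfer \
  -H "Content-Type: application/json" \
  -d '{"sender":"alice","receiver":"bob","amount":100}'
```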

You will then be able to see the resulting account records in MongoDB (re-run the find() query above).

That is it! Find the example repo here.

See the demo in action.

 

For the concepts, visit my previous blog:

https://dzone.com/articles/contract-first-development-the-event-driven-way


