Reactive, or non-blocking, processing is in high demand, but before adopting it, you should understand its thread model. Two things matter most in a thread model: how threads communicate and how execution flows through them. In this article, I will go in depth on both aspects.
What Is Reactive Programming?
There are several definitions on the web. The Wikipedia definition is a bit theoretical and generic. From a threading perspective, my version is: “Reactive programming is the processing of an asynchronous event stream that you can observe.”
You can find much more discussion about reactive programming on the web, but for now, let’s stick to our topic of reactive thread models. Let’s start with a very simple reactive use case, where we want to return the sum of an integer array. In other words, our main request thread should not get blocked while the sum of the integer array is being computed.
Let’s start by creating a simple web server to illustrate this.

Here, we are creating a socket server, opening a socket, and keeping the socket alive until the asynchronous processing is complete. The asynchronous processing happens by calling nonBlockingSum and passing a consumer function (a lambda) as the observer. Once the sum is ready, our function/lambda gets a callback. From the callback, we return the sum value to the client via the socket.
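The flow described above can be sketched in plain Java. This is a minimal, hypothetical version of the `nonBlockingSum` method the article refers to: `CompletableFuture.supplyAsync` stands in for the reactive machinery, and the consumer callback plays the role of the observer (class and method names are illustrative, not the article's actual code).

```java
import java.util.concurrent.CompletableFuture;
import java.util.function.Consumer;

public class AsyncSumServerSketch {

    // Computes the sum on a pool thread and invokes the consumer
    // (our "observer") once the result is ready.
    static void nonBlockingSum(int[] numbers, Consumer<Integer> observer) {
        CompletableFuture
                .supplyAsync(() -> {          // runs on ForkJoinPool.commonPool()
                    int sum = 0;
                    for (int n : numbers) sum += n;
                    return sum;
                })
                .thenAccept(observer);        // callback when the sum is ready
    }

    public static void main(String[] args) throws Exception {
        // In the real server, this callback would write the response to the
        // open socket and close it; here we just print the result.
        nonBlockingSum(new int[]{1, 2, 3, 4, 5},
                sum -> System.out.println("Sum = " + sum));
        Thread.sleep(200);                    // keep the JVM alive for the demo
    }
}
```

The key point is that the caller of `nonBlockingSum` returns immediately; only the callback sees the result.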
So if you call the URL, http://localhost:9090, whether in parallel or in sequence, you will get the following response:
The above is only meant to illustrate the reactive flow; in practice, you should use Netty, Undertow, or a Servlet 3.1+ container as the reactive web server. Now let’s go a bit deeper and try to understand the following flows:
- Blocking call
- Non-blocking call
- Non-blocking call with thread execution
- Serial business flow processing
- Parallel business flow processing
We are going to use Spring WebFlux, which is built on top of the Reactor framework for reactive programming. To keep things easy to follow, we will cover the first two flows in this article and the remaining ones in Part 2.
We are going to write a simple sum method and make it reactive using a Supplier function.
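A plain-Java sketch of that idea might look like the following: a straightforward `sum` method, plus a `Supplier` wrapper that defers execution until `get()` is called. (The names here are illustrative; in WebFlux, such a supplier would typically be handed to `Mono.fromSupplier` so Reactor controls when and where it runs.)

```java
import java.util.function.Supplier;
import java.util.stream.IntStream;

public class SumSupplier {

    // A plain, eager sum over the array.
    static int sum(int[] numbers) {
        return IntStream.of(numbers).sum();
    }

    // Wraps the sum in a Supplier: nothing is computed until get() is
    // called, which is what lets a reactive framework defer the work.
    static Supplier<Integer> deferredSum(int[] numbers) {
        return () -> sum(numbers);
    }

    public static void main(String[] args) {
        Supplier<Integer> lazySum = deferredSum(new int[]{1, 2, 3, 4, 5});
        System.out.println("Sum = " + lazySum.get());
    }
}
```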
1) Blocking Call
As shown in the diagram, the request thread is blocked until the computation of the sum is completed. If we execute the code, we will get the following response:
This clearly shows that the blocking call waited until the sum execution was completed.
2) Non-Blocking Call
Here, the request thread is not blocked, and the execution of the sum is shifted to a thread allocated from the thread pool. The callback and the consumer function/lambda are also executed on that same thread. If we execute the code, we will get the following response:
This clearly shows that the request thread didn’t wait until the sum was computed. Also, the consumer and sum were processed in the same thread.
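The same behavior can be sketched with a `CompletableFuture` and an explicit pool (a stand-in for Reactor's schedulers; names are illustrative): the sum runs on the pool thread, the consumer callback typically runs on that same pool thread, and the request thread returns immediately.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.function.Consumer;
import java.util.stream.IntStream;

public class NonBlockingSumDemo {

    // Shifts the sum to a pool thread; the consumer callback runs once
    // the result is ready (normally on that same pool thread).
    static CompletableFuture<Void> nonBlockingSum(int[] numbers,
                                                  ExecutorService pool,
                                                  Consumer<Integer> consumer) {
        return CompletableFuture
                .supplyAsync(() -> {
                    System.out.println("Sum computed on: "
                            + Thread.currentThread().getName());
                    return IntStream.of(numbers).sum();
                }, pool)
                .thenAccept(consumer);
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        System.out.println("Request thread : " + Thread.currentThread().getName());

        CompletableFuture<Void> done = nonBlockingSum(new int[]{1, 2, 3, 4, 5}, pool,
                sum -> System.out.println("Consumer on    : "
                        + Thread.currentThread().getName() + ", sum = " + sum));

        done.join();       // demo only: wait so the output appears before exit
        pool.shutdown();
    }
}
```

Printing the thread names makes the two observations from the article visible: the request thread's name differs from the pool thread's, while the sum and the consumer report the same thread.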
Flows 3, 4, and 5 will be covered in Part 2, and you can find the code on GitHub.