...
You can see the difference if you change the async=true option to async=false
in the src/main/resources/META-INF/spring/camel-client.xml
file.
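As a rough sketch, the route in camel-client.xml might look like the following. Note this is an illustration only: the exact attribute names and placement (async, poolSize on the <to> element) are assumptions based on this example's description, not copied from the actual file.

```xml
<!-- Hypothetical sketch of the route definition; attribute names are assumptions -->
<camelContext xmlns="http://camel.apache.org/schema/spring">
  <route>
    <from uri="direct:start"/>
    <!-- async="true" sends via a thread pool (poolSize threads);
         async="false" blocks the calling thread for each reply -->
    <to uri="jetty://http://0.0.0.0:9123/myapp" async="true" poolSize="10"/>
  </route>
</camelContext>
```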
Running
You will need to compile this example first:
...
To stop the server, press Ctrl+C.
Sample output
When the client is running, it outputs all requests and responses on the screen.
As the client is single-threaded, it sends the messages in order, from 0 to 99.
```
[ org.apache.camel.example.client.CamelClient.main()] +++ request +++ INFO Exchange[BodyType:String, Body:Message 0]
[ org.apache.camel.example.client.CamelClient.main()] +++ request +++ INFO Exchange[BodyType:String, Body:Message 1]
[ org.apache.camel.example.client.CamelClient.main()] +++ request +++ INFO Exchange[BodyType:String, Body:Message 2]
[ org.apache.camel.example.client.CamelClient.main()] +++ request +++ INFO Exchange[BodyType:String, Body:Message 3]
[ org.apache.camel.example.client.CamelClient.main()] +++ request +++ INFO Exchange[BodyType:String, Body:Message 4]
[ org.apache.camel.example.client.CamelClient.main()] +++ request +++ INFO Exchange[BodyType:String, Body:Message 5]
[ org.apache.camel.example.client.CamelClient.main()] +++ request +++ INFO Exchange[BodyType:String, Body:Message 6]
...
```
As the HTTP server simulates some processing time for each message, its replies will likely arrive only after the client has sent all 100 messages. When they arrive, they come back out of order:
```
[ Camel thread 2: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++ reply +++ INFO Exchange[BodyType:String, Body:Bye Message 7]
[ Camel thread 3: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++ reply +++ INFO Exchange[BodyType:String, Body:Bye Message 27]
[ Camel thread 1: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++ reply +++ INFO Exchange[BodyType:String, Body:Bye Message 2]
[ Camel thread 6: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++ reply +++ INFO Exchange[BodyType:String, Body:Bye Message 5]
[ Camel thread 4: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++ reply +++ INFO Exchange[BodyType:String, Body:Bye Message 3]
[ Camel thread 7: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++ reply +++ INFO Exchange[BodyType:String, Body:Bye Message 28]
[ Camel thread 5: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++ reply +++ INFO Exchange[BodyType:String, Body:Bye Message 24]
[ Camel thread 8: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++ reply +++ INFO Exchange[BodyType:String, Body:Bye Message 9]
```
As you can see, the replies are handled by different threads, because we have configured a thread pool with the poolSize=10 option.
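The effect of that thread pool can be illustrated with plain Java: a fixed pool of 10 threads processes simulated requests concurrently, so replies complete out of order. This is a standalone sketch of the idea, not the example's actual code; the class and method names here are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;

public class AsyncReplyDemo {
    // Submit messages to a fixed-size pool and record the order replies complete in.
    static List<String> sendAsync(int messages, int poolSize) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(poolSize);
        CompletionService<String> replies = new ExecutorCompletionService<>(pool);
        for (int i = 0; i < messages; i++) {
            final int n = i;
            replies.submit(() -> {
                // Simulate the HTTP server taking a varying time per message
                Thread.sleep(ThreadLocalRandom.current().nextInt(1, 20));
                return "Bye Message " + n;
            });
        }
        List<String> order = new ArrayList<>();
        for (int i = 0; i < messages; i++) {
            order.add(replies.take().get()); // completion order, not submit order
        }
        pool.shutdown();
        return order;
    }

    public static void main(String[] args) throws Exception {
        List<String> order = sendAsync(100, 10);
        // With random per-message delays, the first completions are
        // almost never messages 0..4 in sequence.
        System.out.println("first replies: " + order.subList(0, 5));
    }
}
```

All 100 replies still arrive exactly once each; only the ordering is nondeterministic, which mirrors the log output above.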
Running synchronous
If, on the other hand, we change to synchronous mode, a single thread is used for the entire route, and it blocks while waiting for the reply from the HTTP server. To see this in action, change async="true" to async="false".
The output is then, as expected, a request followed by its reply, and so forth. And of course the throughput is much lower, as we handle only a single message at a time and block while waiting for the HTTP server's reply.
```
[ main] +++ request +++ INFO Exchange[BodyType:String, Body:Message 4]
[ main] +++ reply +++ INFO Exchange[BodyType:String, Body:Bye Message 4]
[ main] +++ request +++ INFO Exchange[BodyType:String, Body:Message 5]
[ main] +++ reply +++ INFO Exchange[BodyType:String, Body:Bye Message 5]
[ main] +++ request +++ INFO Exchange[BodyType:String, Body:Message 6]
[ main] +++ reply +++ INFO Exchange[BodyType:String, Body:Bye Message 6]
[ main] +++ request +++ INFO Exchange[BodyType:String, Body:Message 7]
[ main] +++ reply +++ INFO Exchange[BodyType:String, Body:Bye Message 7]
[ main] +++ request +++ INFO Exchange[BodyType:String, Body:Message 8]
[ main] +++ reply +++ INFO Exchange[BodyType:String, Body:Bye Message 8]
```
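For contrast, the synchronous mode corresponds to a single thread issuing one blocking request/reply at a time, so replies always come back in order and total time grows linearly with the message count. Again, this is a plain-Java sketch of the concept (with hypothetical names), not the example's code.

```java
import java.util.ArrayList;
import java.util.List;

public class SyncClientDemo {
    // Simulated blocking request/reply: the caller waits for each response.
    static String call(int n) throws InterruptedException {
        Thread.sleep(5); // stand-in for the HTTP round trip
        return "Bye Message " + n;
    }

    static List<String> sendAll(int messages) throws InterruptedException {
        List<String> replies = new ArrayList<>();
        for (int i = 0; i < messages; i++) {
            replies.add(call(i)); // blocks here until the reply arrives
        }
        return replies;
    }

    public static void main(String[] args) throws InterruptedException {
        long start = System.nanoTime();
        List<String> replies = sendAll(20);
        long ms = (System.nanoTime() - start) / 1_000_000;
        // Replies arrive strictly in order; total time is roughly 20 * 5 ms,
        // whereas a pool of 10 threads would need only about 2 * 5 ms.
        System.out.println(replies.get(0) + " ... " + replies.get(19) + " in ~" + ms + " ms");
    }
}
```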