Throttling JMS Messages in your Enterprise Integration

Tanya Madurapperuma
7 min read · Jul 14, 2021

Introduction

Throttling is an approach to regulating traffic in order to avoid overloading a particular application or backend with requests.

Among the many advantages of throttling are protecting the application or backend from traffic bursts, improving performance, and optimizing the available resources.

Use-case

Let’s consider the different operational counters in an immigration services centre. Your visa documents are collected at one counter. At another counter, your biometric data is collected. You’ll have a face-to-face interview at a different counter. Visa processing fees have to be paid at yet another counter.

A person who wants to get their visa processed has to visit all these different counters, and queues of different lengths can form at each of them.

Immigration Service Center Use-case

Assume a use-case where the immigration services centre wants to optimize the resources allocated to these different counters based on factors like queue sizes, the time taken to serve a single request, etc. They observe the behavior at regular intervals and decide on the resource allocation. So if we model each of these queues as a JMS queue, then at regular intervals we want to control which queues we consume messages from during that time window, how many messages we consume, and which queues we pause consuming messages from.

In such situations, JMS throttling comes into the picture to help in regulating the traffic of a JMS queue. If we rephrase that according to this use-case, JMS throttling helps us regulate the traffic to a particular counter in the Immigration Services Centre.

In this article, I’ll walk you through implementing this business logic using WSO2 Micro Integrator, an open-source, configuration-driven integrator. Make sure you use an updated Micro Integrator distribution to verify the solution below.

Solution Overview

The basic flow of the solution would look like below.

Integration Constructs

  • A schedule task is used to run an execution that consumes messages from certain JMS queues and pauses message consumption from certain other JMS queues. The schedule task can be configured to run based on a cron expression.
  • Proxy services are used to model the behavior of consuming JMS messages from a queue and to perform any further processing. So there will be a dedicated proxy service for each of the JMS queues.

Steps

  • All the JMS proxies will be in an unloaded state at server start-up. We can use the startOnLoad="false" property to configure this behavior.
  • Then we have to configure a schedule task that runs at every Nth interval.
  • We can attach a sequence to the configured schedule task. In this sequence, we set the throttle limits of the chosen JMS queues as system variables, and then activate the appropriate proxies for those queues using the management API.
  • On the second iteration of our schedule task, a certain set of proxies will already be active, and the proxies we want active in that iteration may differ from those currently active. Hence, at each iteration, before activating any chosen proxy, we have to deactivate all the proxies.
  • Once a proxy is in the activated state, it will start consuming messages from the configured JMS queue.
  • If your use-case does not require dynamically changing the list of queues to consume messages from at given intervals, then activating/deactivating proxies is not needed. You can load all the proxies at start-up and then configure the throttle limits using system variables.
  • Furthermore, if your throttle limits are also not dynamic, then setting system variables is not needed either. You can configure the throttle limit directly at deployment time using the "jms.proxy.throttle.count" parameter.
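As a sketch of the first step, a JMS proxy can be kept unloaded at server start-up via the startOnLoad attribute (the proxy name here is illustrative):

<proxy name="QueueAProxy" transports="jms" startOnLoad="false" xmlns="http://ws.apache.org/ns/synapse">
    <!-- target and JMS parameters omitted; the proxy stays inactive until activated via the management API -->
</proxy>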

Solution Walkthrough

We start by creating a Maven Multi Module Project (MMM) using Integration Studio, which is the low-code editor for WSO2 Micro Integrator.

Then we have to create three types of sub-projects inside this MMM project, as follows.

  • ESB Config Project — This will contain all the integration artifacts for the actual transformation/mediation.
  • Composite Exporter Project — This helps us to bundle the integration artifacts which are inside the ESB config project and export them into an actual server.
  • Connector Exporter Project — If we are using any connectors in our project, this is needed to bundle and export them. All the available connectors can be explored in the connector store.

Let’s now walk through the integration artifacts inside the ESB config project.

Proxy Services

You can right click on the ESB config project to create a new proxy service.

In this proxy service we are consuming messages from a Tibco EMS queue called QueueA and publishing those messages to another Tibco EMS queue called DestQueueA without doing any intermediate processing.

But if you need to perform any processing before publishing the messages to DestQueueA, you can perform those transformations inside the inSequence of the proxy service.
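Put together, a minimal version of such a proxy might look like the following sketch. The queue names and EMS connection details are the sample values used in this article; the exact transport parameters depend on your configured JMS connection factory, so verify them against your own transport configuration.

<proxy name="QueueAProxy" transports="jms" startOnLoad="false" xmlns="http://ws.apache.org/ns/synapse">
    <target>
        <inSequence>
            <!-- any transformation/mediation before publishing goes here -->
            <send>
                <endpoint>
                    <address uri="jms:/DestQueueA?transport.jms.ConnectionFactoryJNDIName=QueueConnectionFactory&amp;java.naming.factory.initial=com.tibco.tibjms.naming.TibjmsInitialContextFactory&amp;java.naming.provider.url=tcp://127.0.0.1:7222&amp;transport.jms.DestinationType=queue"/>
                </endpoint>
            </send>
        </inSequence>
    </target>
    <parameter name="transport.jms.Destination">QueueA</parameter>
    <parameter name="jms.proxy.throttle.count">1</parameter>
    <parameter name="jms.proxy.throttle.count.systemProperty">QueueAThrottleLimit</parameter>
</proxy>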

A single proxy service is used to consume messages from a single queue; hence we need a separate proxy service for each queue.

At the bottom of the proxy, you can notice that the deployment-time configuration of the Tibco EMS queue (the queue which we are planning to consume messages from) is defined as parameters. There we have set the default throttling limit using the "jms.proxy.throttle.count" parameter.

<parameter name="jms.proxy.throttle.count">1</parameter>

Modifying the throttle limit at Runtime

Since we want to update this throttling limit at runtime, we have also defined a system variable, via the "jms.proxy.throttle.count.systemProperty" parameter, which we will use to read the updated throttle limit.

<parameter name="jms.proxy.throttle.count.systemProperty">QueueAThrottleLimit</parameter>

So at runtime we can set the new throttle limit as follows, using a system property.

<property description="throttle limit" value="10" name="QueueAThrottleLimit" scope="system" type="STRING"/>

We can do this in a sequence outside our JMS proxy service. You may have noticed that the throttle limit was set to 1 at deployment time and we are modifying it to 10 at runtime.

What happens underneath?

What happens underneath is that the thread consuming messages is put to sleep after consuming the defined number of messages. For example, if you have set the throttle limit to 30 messages per minute and we consume 30 messages in the first 20 seconds (since the throttle mode is batch), the thread will be put to sleep for the remaining 40 seconds (1 min = 60 seconds).

Publishing messages to another queue

We have provided the connection URL of the destination queue, to which we want to publish the consumed messages, as follows.

<address uri="jms:/DestQueueA?transport.jms.ConnectionFactoryJNDIName=QueueConnectionFactory&amp;java.naming.factory.initial=com.tibco.tibjms.naming.TibjmsInitialContextFactory&amp;java.naming.provider.url=tcp://127.0.0.1:7222&amp;transport.jms.DestinationType=queue"/>

Schedule Task

Coming back to the implementation of our use-case, we now have to create a schedule task to run at a defined interval. Right-click the ESB config project and add a schedule task.

When creating the schedule task, we can attach a sequence to it. Then, at every defined interval, the attached sequence will be executed. In the sample provided below, proxyDeployingSeq will be executed every minute.
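Reconstructed as configuration, the task definition could look along these lines. The task name and cron expression are illustrative; MessageInjector is the standard Synapse task implementation that injects a message into a named sequence.

<task name="ProxyDeployingTask" class="org.apache.synapse.startup.tasks.MessageInjector" group="synapse.simple.quartz" xmlns="http://ws.apache.org/ns/synapse">
    <!-- fire at the start of every minute -->
    <trigger cron="0 0/1 * * * ?"/>
    <property name="injectTo" value="sequence"/>
    <property name="sequenceName" value="proxyDeployingSeq"/>
    <!-- a placeholder message to inject into the sequence -->
    <property name="message">
        <request xmlns=""/>
    </property>
</task>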

Sequences

Now we have to implement the logic inside the proxyDeployingSeq to set the throttle limit system variables and activate/deactivate the required proxies.

As described in the Steps section above, we call the management API to deactivate all the proxies. In order to call the management API, we first have to obtain an access token.
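As an illustration, obtaining a token and deactivating a proxy from within the sequence might look like the sketch below. The management API listens on port 9164 by default; the endpoint paths, payloads, and default admin credentials shown here should be verified against the management API documentation for your Micro Integrator version, and the proxy name is illustrative.

<!-- obtain an access token (Basic header is base64 of the default admin:admin credentials) -->
<header name="Authorization" scope="transport" value="Basic YWRtaW46YWRtaW4="/>
<call blocking="true">
    <endpoint>
        <http method="get" uri-template="https://localhost:9164/management/login"/>
    </endpoint>
</call>
<property name="accessToken" expression="json-eval($.AccessToken)"/>

<!-- deactivate a proxy using the token -->
<payloadFactory media-type="json">
    <format>{"name":"QueueAProxy","status":"inactive"}</format>
</payloadFactory>
<header name="Authorization" scope="transport" expression="fn:concat('Bearer ', $ctx:accessToken)"/>
<call blocking="true">
    <endpoint>
        <http method="post" uri-template="https://localhost:9164/management/proxy-services"/>
    </endpoint>
</call>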

Afterwards, we have to read the queues we want to consume messages from, from a defined source such as a file, a database, or an Excel sheet. The required connectors (such as the file connector) for reading this information can be exported using the Connector Exporter project.

Then we activate the respective proxy services for those queues using the same management API, after setting the throttle limits as system variables.
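Continuing the sketch, setting a runtime throttle limit and activating the corresponding proxy could look as follows. The queue, proxy, and property names are illustrative, and the Bearer token is the access token obtained earlier from the management API login (stored here in a property named accessToken).

<!-- set the runtime throttle limit for QueueA as a system variable -->
<property name="QueueAThrottleLimit" value="10" scope="system" type="STRING"/>

<!-- activate the corresponding proxy via the management API -->
<payloadFactory media-type="json">
    <format>{"name":"QueueAProxy","status":"active"}</format>
</payloadFactory>
<header name="Authorization" scope="transport" expression="fn:concat('Bearer ', $ctx:accessToken)"/>
<call blocking="true">
    <endpoint>
        <http method="post" uri-template="https://localhost:9164/management/proxy-services"/>
    </endpoint>
</call>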

Conclusion

By using WSO2 Micro Integrator, you can set throttling limits for JMS queues at deployment time as well as at runtime.
