Message Queues


Message queues are a form of middleware that pass messages (units of data) between applications. They ensure that a message sent by a producer service is properly delivered to a consumer service, even if the consumer is not ready to process it immediately.

Pros and Cons of Message Queues

Message queues aim to deliver each message only once, to a single consumer, and consumers pull work at their own pace; the queue can therefore be thought of as a back-pressure mechanism if you’re dealing with microservices. As such, message queues are common for managing tasks and workloads where you want each task processed exactly once.

  1. Message queues deliver each message only once, to a single consumer.
  2. Message queues process messages on a first-come, first-served basis.
  3. Message queues may not deliver messages in the same order in which they were queued.
  4. Examples of message queues include RabbitMQ, ActiveMQ, and IronMQ.
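
To make the point-to-point semantics above concrete, here is a minimal sketch using the classic javax.jms API with ActiveMQ’s Java client (it assumes ActiveMQ’s client and broker jars, e.g. activemq-all, are on the classpath). It spins up an embedded, non-persistent broker via the vm:// transport, so no separate ActiveMQ process is needed; the queue name tasks is made up for illustration.

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageConsumer;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import org.apache.activemq.ActiveMQConnectionFactory;

public class QueueSemanticsDemo {
    public static void main(String[] args) throws Exception {
        // Embedded, non-persistent broker; no separately running ActiveMQ needed.
        ConnectionFactory factory =
                new ActiveMQConnectionFactory("vm://localhost?broker.persistent=false");
        Connection connection = factory.createConnection();
        connection.start();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Queue queue = session.createQueue("tasks"); // point-to-point destination

            // Producer enqueues two messages.
            MessageProducer producer = session.createProducer(queue);
            producer.send(session.createTextMessage("task-1"));
            producer.send(session.createTextMessage("task-2"));

            // Each message is handed to exactly one consumer, even if several
            // consumers are attached to the queue. A single consumer drains both here.
            MessageConsumer consumer = session.createConsumer(queue);
            for (int i = 0; i < 2; i++) {
                TextMessage msg = (TextMessage) consumer.receive(1000);
                System.out.println("processed " + msg.getText());
            }
        } finally {
            connection.close();
        }
    }
}

If a second consumer were attached to the same queue, the broker would split the messages between the two consumers rather than copy each message to both; that is what distinguishes a queue from a publish/subscribe topic.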

Running ActiveMQ on a local machine for development purposes

Sometimes we have to run ActiveMQ on our development machines. This is because the applications need it in order to start up successfully: if ActiveMQ is not available locally, start-up fails.

Download it from here: https://activemq.apache.org/components/classic/documentation/download-archives

To start or stop ActiveMQ, navigate to ActiveMQ’s bin folder and run one of the following commands:

./activemq start

./activemq stop

You can reach the UI of the ActiveMQ server at the following URL: 127.0.0.1:8161. You will be prompted to log in: the username is admin and the password is admin. In the UI, we can see information about the ActiveMQ server we are running: the queues, the number of messages enqueued and dequeued, topics, and more.
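
Once the broker is up, an application can verify connectivity before relying on it at start-up. A minimal sketch, assuming the activemq-client dependency; tcp://localhost:61616 is the default OpenWire port of a stock ActiveMQ install (port 8161 above is only the web console).

import javax.jms.Connection;
import org.apache.activemq.ActiveMQConnectionFactory;

public class LocalBrokerSmokeTest {
    public static void main(String[] args) throws Exception {
        // Default OpenWire transport of a local ActiveMQ installation.
        ActiveMQConnectionFactory factory =
                new ActiveMQConnectionFactory("tcp://localhost:61616");
        // createConnection/start will fail if the local broker is not running.
        Connection connection = factory.createConnection();
        connection.start();
        System.out.println("Connected to the local ActiveMQ broker");
        connection.close();
    }
}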

Questions

Rate throttling

Suppose that an application is picking up messages from a queue and relying on a backend service to process them. If the backend service is slow, how is this application supposed to behave? How would it know that it should stop picking up messages from the queue? Will it automatically stop picking up too many messages because its thread pool and resources are limited? Is there an annotation that we can use on the listener in the application to cap the number of messages that it processes from the queue simultaneously?

And if the client applications keep putting messages in the queue, wouldn’t the queue eventually become full? When does the “back pressure” mechanism come into play? Does it happen automatically, or is any configuration necessary?
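
On the listener-threshold question above: if the consuming application uses Spring JMS, the concurrency of a listener can be bounded either on the container factory or directly on the annotation. Below is a minimal sketch, assuming spring-jms together with the javax.jms-based ActiveMQ client; the queue name tasks and the 1-5 bound are illustrative, not prescribed anywhere in this note.

import org.apache.activemq.ActiveMQConnectionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jms.annotation.EnableJms;
import org.springframework.jms.annotation.JmsListener;
import org.springframework.jms.config.DefaultJmsListenerContainerFactory;
import org.springframework.stereotype.Component;

@Configuration
@EnableJms
class JmsConfig {
    @Bean
    public DefaultJmsListenerContainerFactory jmsListenerContainerFactory() {
        DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
        factory.setConnectionFactory(new ActiveMQConnectionFactory("tcp://localhost:61616"));
        // At most 5 listener threads pull from the queue concurrently; messages
        // that are not yet consumed simply stay on the broker until a thread frees up.
        factory.setConcurrency("1-5");
        return factory;
    }
}

@Component
class TaskListener {
    // The concurrency bound can also be set per listener via the annotation.
    @JmsListener(destination = "tasks", concurrency = "1-5")
    public void onMessage(String body) {
        // Call the slow backend here; while all listener threads are busy,
        // no further messages are pulled from the queue.
    }
}

With a bound like this, the backlog accumulates on the broker rather than inside the application, which is one form of back pressure. Whether the broker itself eventually pushes back on producers depends on its own limits (in ActiveMQ, producer flow control and memory/store limits), which are configured on the broker side rather than in the consuming application.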

Reading material

  1. https://sudhir.io/the-big-little-guide-to-message-queues
  2. https://stackoverflow.com/questions/21363302/rabbitmq-message-order-of-delivery
