HCL SW Blogs

Let us first understand Amazon Simple Queue Service (SQS) and how it works. Amazon SQS is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications.


AWS SQS eliminates the complexity and overhead associated with managing and operating message-oriented middleware and empowers developers to focus on differentiating work. Using SQS, you can send, store, and receive messages between software components at any volume, without losing messages or requiring other services to be available.

SQS offers two types of message queues.

  • Standard queues offer maximum throughput, best-effort ordering, and at-least-once delivery.
  • SQS FIFO queues are designed to guarantee that messages are processed exactly once, in the exact order that they are sent.
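The difference between the two queue types shows up directly in the parameters of a send call: FIFO queues require a message group ID and, unless content-based deduplication is enabled, a deduplication ID. A minimal sketch of how those parameters differ (the helper function and queue URLs are illustrative, not the plugin's internals):

```python
# Sketch: build SendMessage parameters for a standard vs. a FIFO queue.
# The function name and queue URLs below are hypothetical examples.

def build_send_params(queue_url, body, group_id=None, dedup_id=None):
    """Build the keyword arguments for an SQS SendMessage call."""
    params = {"QueueUrl": queue_url, "MessageBody": body}
    if queue_url.endswith(".fifo"):
        # FIFO queues require a message group; a deduplication ID is
        # required unless content-based deduplication is enabled.
        params["MessageGroupId"] = group_id or "default"
        if dedup_id:
            params["MessageDeduplicationId"] = dedup_id
    return params

standard = build_send_params(
    "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue", "hello")
fifo = build_send_params(
    "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue.fifo",
    "hello", group_id="orders", dedup_id="msg-001")

print("MessageGroupId" in standard)  # False
print(fifo["MessageGroupId"])        # orders
```

For a standard queue only the queue URL and body are needed; the FIFO-specific fields are simply omitted.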


To integrate with the Amazon SQS service, we have developed the AWS SQS plugin. With it, you can send messages to both standard and FIFO queues, and receive messages from both queue types, either printing them to the job log or saving them to a file. The message output can also be used as an input parameter for another plugin. In this way, our Workload Automation users get the benefits of the SQS service.

Now let us see how this plugin works.

Prerequisites:

  • AWS Account
  • AWS IAM credentials (AWS Key ID, AWS Secret Key)


Log in to the Dynamic Workload Console and open the Workload Designer. Choose to create a new job and select the “AWSSQS” job type in the Cloud section.


Workload Designer





Establishing connection to the AWS server:

In the Connection tab, specify the Access Key ID, Secret Access Key, and AWS Region fields to connect to the AWS server from Workload Automation, then click Test Connection. A success message is displayed when the connection is successful.


  • Access Key ID: This attribute is associated with the AWS account and is mandatory.
  • Secret Access Key: This attribute is associated with the AWS account and is mandatory.
  • AWS Region: The AWS Region in which your queues are created. This field is also mandatory.
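Since all three fields are mandatory, a job definition can be checked before any connection attempt is made. A minimal validation sketch (the field names and function are illustrative, not the plugin's internal API):

```python
# Sketch: validate the mandatory connection fields before connecting.
# Field names are illustrative; the plugin's internals may differ.

MANDATORY_FIELDS = ("access_key_id", "secret_access_key", "region")

def validate_connection(config):
    """Return the list of missing or empty mandatory fields."""
    return [f for f in MANDATORY_FIELDS if not config.get(f)]

config = {"access_key_id": "AKIA...", "secret_access_key": "", "region": "eu-west-1"}
print(validate_connection(config))  # ['secret_access_key']
```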


Connection Tab




  • Queue URL

Provide the queue URL. The queue URL is required to send, receive, and delete queue messages.
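A standard SQS queue URL has the form `https://sqs.<region>.amazonaws.com/<account-id>/<queue-name>`, so the region, account, and queue name can be recovered from it. A sketch of that parsing (the example queue is hypothetical):

```python
# Sketch: split an SQS queue URL into its parts.
from urllib.parse import urlparse

def parse_queue_url(queue_url):
    """Split an SQS queue URL into region, account ID, and queue name."""
    parsed = urlparse(queue_url)
    # The host looks like sqs.<region>.amazonaws.com
    region = parsed.netloc.split(".")[1]
    account_id, queue_name = parsed.path.strip("/").split("/")
    return region, account_id, queue_name

print(parse_queue_url(
    "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue.fifo"))
# ('us-east-1', '123456789012', 'my-queue.fifo')
```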


Action Tab




  • Choose Operation to Perform

Publish Message

You can publish the message to a FIFO queue or a Standard queue.

For a FIFO queue, the following information needs to be provided.


Duplication ID

The token used for deduplication of sent messages. If a message with a particular message deduplication ID is sent successfully, any messages sent with the same message deduplication ID are accepted successfully but are not delivered during the 5-minute deduplication interval.

Group ID

The tag that specifies that a message belongs to a specific message group. Messages that belong to the same message group are always processed one by one, in a strict order relative to the message group.
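If you do not want to hand-craft a deduplication ID, one common approach, and the one SQS itself applies when content-based deduplication is enabled on the queue, is a SHA-256 hash of the message body. A sketch (the helper name is illustrative):

```python
# Sketch: derive a deduplication ID from the message body, mirroring
# SQS content-based deduplication (SHA-256 of the body).
import hashlib

def content_dedup_id(body: str) -> str:
    """Return a deterministic deduplication ID for a message body."""
    return hashlib.sha256(body.encode("utf-8")).hexdigest()

# Identical bodies yield the same ID, so a retry within the
# 5-minute deduplication interval is accepted but not redelivered.
print(content_dedup_id("order-42") == content_dedup_id("order-42"))  # True
```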

You can send the message by typing it in manually or by selecting a file.
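Either way, the job ends up with a single message body to send; that choice can be sketched as follows (the function name is illustrative, not the plugin's internal API):

```python
# Sketch: resolve the message body from inline text or a file.
import os
import tempfile

def load_message_body(inline_body=None, body_file=None):
    """Return the message body, given inline or read from a file.
    A file, when provided, takes precedence over the inline text."""
    if body_file:
        with open(body_file, encoding="utf-8") as fh:
            return fh.read()
    return inline_body

# Demo: read the body from a temporary file.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("hello from file")
    path = tmp.name
try:
    print(load_message_body(body_file=path))    # hello from file
    print(load_message_body(inline_body="hi"))  # hi
finally:
    os.remove(path)
```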


Receive Message





You can receive messages one at a time or several at once, and the plugin provides an option to store the messages in a file.


Polling Timeout:


This is the timeout for polling in case the queue is empty: the job waits for a new message to arrive in the queue for at most the period defined in the Polling Timeout field. The value is expressed in milliseconds, and the default is 30000 ms.
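The polling behavior described above can be sketched as a simple loop with a deadline; the stub `receive_fn` below stands in for the actual SQS ReceiveMessage call, and the function name is illustrative:

```python
# Sketch: poll an empty queue until a message arrives or the
# polling timeout (in milliseconds) elapses.
import time

def poll_for_message(receive_fn, polling_timeout_ms=30000, interval_s=1.0):
    """Poll until receive_fn returns a message or the timeout elapses."""
    deadline = time.monotonic() + polling_timeout_ms / 1000.0
    while time.monotonic() < deadline:
        message = receive_fn()
        if message is not None:
            return message
        time.sleep(interval_s)
    return None  # queue stayed empty for the whole timeout

# Stub that delivers a message on the third poll.
attempts = iter([None, None, "payload"])
print(poll_for_message(lambda: next(attempts),
                       polling_timeout_ms=2000, interval_s=0.01))  # payload
```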


Track/Monitor Job:



Job Properties – Message Successfully Received



Job Properties – Message Successfully Sent



The AWS SQS plugin in Workload Automation is therefore a good fit for anyone looking to manage a queue service and optimize the waiting period between services.


Are you curious to try out the Amazon SQS plugin? Download the integration from the Automation Hub and get started, or drop a line at santhoshkumar.kumar@hcl.com.


Authors’ Bios


Rooparani Karuti – Senior Test Specialist at HCLSoftware

She works as a senior test specialist in the Workload Automation Plugin Factory team at the HCLSoftware Lab, Bangalore, and has 9 years of experience in both manual and automation testing across various domains.

LinkedIn: https://www.linkedin.com/in/roopa-karuti-080b7627/


Paolo Canofari – Developer at HCLSoftware

He works as a developer for Workload Automation L3 Support at HCLSoftware, Rome, with skills in Java, Python, C/C++, Angular 8, and Docker/Kubernetes deployments.

LinkedIn: www.linkedin.com/in/paolo-canofari
