Let us first understand what AWS Simple Queue Service (SQS) is and how it works. Amazon SQS is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications.

AWS SQS eliminates the complexity and overhead associated with managing and operating message-oriented middleware and empowers developers to focus on differentiating work. Using SQS, you can send, store, and receive messages between software components at any volume, without losing messages or requiring other services to be available.

SQS offers two types of message queues.

  • Standard queues offer maximum throughput, best-effort ordering, and at-least-once delivery.
  • SQS FIFO queues are designed to guarantee that messages are processed exactly once, in the exact order that they are sent.
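
To make the difference concrete, here is a minimal sketch using the AWS SDK for Python (boto3); the queue names and region are illustrative, and the plugin itself does not require any coding or queue creation. Note that a FIFO queue name must end with the .fifo suffix.

import boto3

# Illustrative only: the AWS SQS plugin does not require any coding.
sqs = boto3.client("sqs", region_name="us-east-1")

# Standard queue: maximum throughput, best-effort ordering, at-least-once delivery.
standard = sqs.create_queue(QueueName="wa-demo-standard")

# FIFO queue: exactly-once processing and strict ordering; the name must end with ".fifo".
fifo = sqs.create_queue(
    QueueName="wa-demo.fifo",
    Attributes={"FifoQueue": "true"},
)

print(standard["QueueUrl"])
print(fifo["QueueUrl"])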

 

To integrate with the Amazon SQS service, we have developed the AWS SQS plugin. With it, you can send messages to both standard and FIFO queues, and receive messages from both queue types, either printing them in the job log or saving them to a file. The message output can also be used as an input parameter for a different plugin. In this way, our Workload Automation users get the benefits of the SQS service.

Now let us see how this plugin works.

Prerequisites:

  • AWS Account
  • AWS IAM credentials (Access Key ID and Secret Access Key).

 

Log in to the Dynamic Workload Console and open the Workload Designer. Choose to create a new job and select the “AWSSQS” job type in the Cloud section.

 

Workload Designer

 

 

Connection

Establishing a connection to AWS:

In the Connection tab, specify the Access Key ID, Secret Access Key, and AWS Region fields to connect to AWS from the Workload Automation side, then click Test Connection. A success message is displayed when the connection is successful.

 

  • Access Key ID: The access key ID associated with the AWS account. This field is mandatory.
  • Secret Access Key: The secret access key associated with the AWS account. This field is mandatory.
  • AWS Region: The region in which the queues are created. This field is also mandatory.
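
For reference, these three fields are exactly what an SQS client needs. Below is a minimal connection-test sketch in Python with boto3; the credential values and region are placeholders, and the call used for the check is an assumption. The plugin performs this test for you when you click Test Connection.

import boto3
from botocore.exceptions import ClientError, NoCredentialsError

# Placeholders: use your own Access Key ID, Secret Access Key and AWS Region.
sqs = boto3.client(
    "sqs",
    aws_access_key_id="YOUR_ACCESS_KEY_ID",
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
    region_name="us-east-1",
)

try:
    # A lightweight call such as list_queues confirms that the credentials
    # and region are valid and that SQS is reachable.
    sqs.list_queues()
    print("Connection successful")
except (ClientError, NoCredentialsError) as err:
    print(f"Connection failed: {err}")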

 

Connection Tab

 

Action

  • Queue URL

Provide the queue URL. The queue URL is required to send, receive, and delete queue messages.
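
If you only know the queue name, the queue URL can be retrieved through the SQS API; a short sketch with boto3 follows (the queue name and region are illustrative). Every send, receive, and delete call is then addressed to this URL.

import boto3

sqs = boto3.client("sqs", region_name="us-east-1")  # region is illustrative

# Look up the queue URL from the queue name.
queue_url = sqs.get_queue_url(QueueName="wa-demo.fifo")["QueueUrl"]
print(queue_url)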

 

Action Tab

 

 

  • Choose Operation to Perform

Publish Message

You can publish a message to a FIFO queue or a Standard queue.

For a FIFO queue, the following information must be provided.

 

Duplication ID

The token used for deduplication of sent messages. If a message with a particular message deduplication ID is sent successfully, any messages sent with the same message deduplication ID are accepted successfully but are not delivered during the 5-minute deduplication interval.

Group ID

The tag that specifies that a message belongs to a specific message group. Messages that belong to the same message group are always processed one by one, in a strict order relative to the message group.

You can send the message by typing it in manually or by selecting a file that contains it.
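
Under the hood, these fields map onto the standard SQS send-message parameters. The following is a hedged sketch with boto3 of publishing to a FIFO queue, with the body taken either from a file or from inline text; the queue URL, IDs, and file name are illustrative, not the plugin's actual implementation. For a Standard queue, the Group ID and Duplication ID parameters are simply omitted.

import boto3
from pathlib import Path

sqs = boto3.client("sqs", region_name="us-east-1")  # region is illustrative
queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/wa-demo.fifo"  # placeholder

# The message body can come from a file or be typed in manually,
# mirroring the plugin's two input modes.
message_file = Path("message.txt")
body = message_file.read_text() if message_file.exists() else "Hello from Workload Automation"

sqs.send_message(
    QueueUrl=queue_url,
    MessageBody=body,
    MessageGroupId="order-events",         # Group ID: strict ordering within the group
    MessageDeduplicationId="order-12345",  # Duplication ID: duplicates within 5 minutes are dropped
)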

Receive Message

 

Action Tab

 

You can receive messages by pulling one at a time or several at once, and the job provides an option to store the received messages in a file (see the sketch after the Polling Timeout description below).

 

Polling Timeout:

 

This is the timeout used for polling when the queue is empty: the job waits for a new message to arrive in the queue for at most the period defined in the Polling Timeout field. The value is expressed in milliseconds, and the default is 30000 ms (30 seconds).
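
For reference, the sketch below shows with boto3 how messages could be received within a polling timeout and stored in a file; the queue URL, file name, and the looping strategy are assumptions, not the plugin's actual logic. A single SQS long-polling call waits at most 20 seconds, so a longer timeout such as the 30000 ms default can be approximated by looping until the deadline.

import time
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")  # region is illustrative
queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/wa-demo-standard"  # placeholder
polling_timeout_ms = 30000  # plugin default: 30000 ms

deadline = time.monotonic() + polling_timeout_ms / 1000
messages = []
while not messages and time.monotonic() < deadline:
    # SQS long polling accepts at most 20 seconds per call.
    wait = min(20, max(1, int(deadline - time.monotonic())))
    response = sqs.receive_message(
        QueueUrl=queue_url,
        MaxNumberOfMessages=10,  # pull a single message or up to 10 at a time
        WaitTimeSeconds=wait,
    )
    messages = response.get("Messages", [])

# Print the messages (as in the job log) and store them in a file.
with open("received_messages.txt", "w") as out:
    for msg in messages:
        print(msg["Body"])
        out.write(msg["Body"] + "\n")
        # Delete the message after successful processing so it is not redelivered.
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])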

 

Track/Monitor Job:

 

Job Log

Job Properties – Message Successfully Received

 

Job Properties – Message Successfully Sent

 

Therefore, the AWS SQS plugin for Workload Automation is a good fit for anyone looking to manage a message queuing service and optimize the time their services spend waiting on one another.

 

Are you curious to try out the Amazon SQS plugin? Download the integration from Automation Hub and get started, or drop a line at santhoshkumar.kumar@hcl.com.

 

Authors’ Bios

 

Rooparani Karuti – Senior Test Specialist at HCL Software

Rooparani works as a Senior Test Specialist in the Workload Automation Plugin Factory team at the HCL Software Lab, Bangalore. She has 9 years of experience in both manual and automation testing across various domains.

LinkedIn: https://www.linkedin.com/in/roopa-karuti-080b7627/

 

Paolo Canofari – Developer at HCL Software

Paolo works as a Developer for Workload Automation L3 Support at HCL Software, Rome. His skills include Java, Python, C/C++, Angular 8, and Docker/Kubernetes deployments.

LinkedIn: www.linkedin.com/in/paolo-canofari
