Let us first understand Amazon Simple Queue Service (SQS) and how it works. Amazon SQS is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications.

AWS SQS eliminates the complexity and overhead associated with managing and operating message-oriented middleware and empowers developers to focus on differentiating work. Using SQS, you can send, store, and receive messages between software components at any volume, without losing messages or requiring other services to be available.

SQS offers two types of message queues:

  • Standard queues offer maximum throughput, best-effort ordering, and at-least-once delivery.
  • SQS FIFO queues are designed to guarantee that messages are processed exactly once, in the exact order that they are sent.
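To make the distinction concrete, here is a minimal sketch (using Python and boto3 purely as an illustration; the plugin itself does not create queues, and the queue names are hypothetical) of how one queue of each type is created:

```python
# Illustration only (not part of the plugin): creating one queue of
# each type with boto3. Queue names here are hypothetical.
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")

# Standard queue: maximum throughput, best-effort ordering,
# at-least-once delivery.
standard = sqs.create_queue(QueueName="wa-demo-standard")

# FIFO queue: the name must end in ".fifo" and the FifoQueue
# attribute must be set; exactly-once processing, strict ordering.
fifo = sqs.create_queue(
    QueueName="wa-demo.fifo",
    Attributes={"FifoQueue": "true"},
)

print(standard["QueueUrl"], fifo["QueueUrl"])
```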

 

To integrate with the Amazon SQS service, we have developed the AWS SQS plugin. With it, you can send messages to both standard and FIFO queues, and receive messages from both queue types, either printing them to the job log or saving them to a file. The message output can also be used as an input parameter for another plugin. In this way, our Workload Automation users get the benefits of the SQS service.

Now let us see how this plugin works.

Prerequisites:

  • AWS Account
  • AWS IAM credentials (Access Key ID, Secret Access Key).

 

Log in to the Dynamic Workload Console and open the Workload Designer. Choose to create a new job and select the “AWSSQS” job type in the Cloud section.

 

Workload Designer

 

 

Connection

Establishing the connection to AWS:

In the Connection tab, specify the Access Key ID, Secret Access Key, and AWS Region fields to connect to AWS from the Workload Automation side, then click Test Connection. A success message is displayed when the connection is successful.

 

  • Access Key ID: This attribute is associated with the AWS account and is mandatory.
  • Secret Access Key: This attribute is associated with the AWS account and is mandatory.
  • AWS Region: The AWS region in which your queues are created. This field is also mandatory.
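As a rough sketch of what Test Connection does behind the scenes, the three mandatory fields map directly onto an SQS client configuration. The snippet below is an illustration in boto3 rather than the plugin's actual implementation; it validates the credentials and region with a single lightweight call:

```python
# A rough, hypothetical equivalent of the Connection tab's
# "Test Connection" button: build an SQS client from the three
# mandatory fields and make one lightweight authenticated call.
import boto3
from botocore.exceptions import ClientError, EndpointConnectionError

def test_connection(access_key_id: str, secret_access_key: str, region: str) -> bool:
    sqs = boto3.client(
        "sqs",
        aws_access_key_id=access_key_id,
        aws_secret_access_key=secret_access_key,
        region_name=region,
    )
    try:
        sqs.list_queues()  # any authenticated call verifies the credentials
        return True        # the plugin would show its success message here
    except (ClientError, EndpointConnectionError):
        return False       # bad credentials, wrong region, or no network path
```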

 

Connection Tab

 

Action

  • Queue URL

Provide the queue URL. The queue URL is required to send, receive, and delete queue messages.
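For context, every SQS API call that sends, receives, or deletes messages is addressed by the queue URL rather than the queue name. If you only know the name, the URL can be looked up, as in this boto3 sketch (the queue name is hypothetical):

```python
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")

# Every send/receive/delete call is addressed by URL, not name.
# The URL for a known queue name can be looked up like this:
queue_url = sqs.get_queue_url(QueueName="wa-demo.fifo")["QueueUrl"]
# e.g. https://sqs.us-east-1.amazonaws.com/123456789012/wa-demo.fifo
```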

 

Action Tab

 

 

  • Choose Operation to Perform

Publish Message

You can publish the message to a FIFO queue or a standard queue.

For a FIFO queue, the following information needs to be provided.

 

Duplication ID

The token used for deduplication of sent messages. If a message with a particular message deduplication ID is sent successfully, any messages sent with the same message deduplication ID are accepted successfully but are not delivered during the 5-minute deduplication interval.

Group ID

The tag that specifies that a message belongs to a specific message group. Messages that belong to the same message group are always processed one by one, in a strict order relative to the message group.

You can send a message either by entering it manually or by selecting a file.
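As an illustration of what a FIFO publish involves at the API level, here is a boto3 sketch (the queue URL, message body, and IDs are hypothetical); the plugin's Duplication ID and Group ID fields correspond to the MessageDeduplicationId and MessageGroupId parameters:

```python
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")

# Hypothetical queue URL and message; the "Duplication ID" and
# "Group ID" fields map to the two FIFO-only parameters below.
response = sqs.send_message(
    QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/wa-demo.fifo",
    MessageBody="order-created",
    MessageDeduplicationId="order-42",  # repeats within 5 min are not redelivered
    MessageGroupId="orders",            # strict ordering within this group
)
print(response["MessageId"])
```

For a standard queue, the same call is made without the two FIFO-only parameters.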

Receive Message

 

Action Tab

 

You can receive messages by pulling one at a time or several at once, and the plugin provides a feature to store the messages in a file.

 

Polling Timeout:

 

This is the timeout used for polling when the queue is empty: the job waits for a new message to arrive in the queue for at most the period defined in the Polling Timeout field. The value is expressed in milliseconds, and the default is 30000 ms.
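Because the SQS API caps long polling at 20 seconds per receive call, a 30000 ms timeout is naturally implemented as a polling loop. The boto3 sketch below (queue URL hypothetical) is an approximation of the behavior described above, not the plugin's actual code; it waits up to the configured timeout for a message and deletes it once read:

```python
import time

import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/wa-demo"

def receive_with_timeout(polling_timeout_ms: int = 30000):
    """Wait up to polling_timeout_ms for one message, as the job does."""
    deadline = time.monotonic() + polling_timeout_ms / 1000.0
    while time.monotonic() < deadline:
        remaining = deadline - time.monotonic()
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL,
            MaxNumberOfMessages=1,  # the plugin can also pull several at once
            WaitTimeSeconds=min(20, max(1, int(remaining))),  # SQS caps long polling at 20 s
        )
        for msg in resp.get("Messages", []):
            # The plugin prints the body to the job log or stores it in a
            # file; the message is then deleted so it is not redelivered.
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
            return msg["Body"]
    return None  # the queue stayed empty for the whole polling timeout
```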

 

Track/Monitor Job:

 

Job Log

Job Properties – Message Successfully Received

 

Job Properties – Message Successfully Sent

 

Therefore, the AWS SQS plugin in Workload Automation is a good fit for anyone looking to manage a queue service and optimize the waiting period between services.

 

Are you curious to try out the Amazon SQS plugin? Download the integration from Automation Hub and get started, or drop a line at santhoshkumar.kumar@hcl.com.

 

Authors’ Bios

 

Rooparani Karuti – Senior Test Specialist at HCL Software

Rooparani works as a senior test specialist for the Workload Automation Plugin Factory team at HCL Software Lab, Bangalore. She has 9 years of experience in both manual and automation testing across various domains.

LinkedIn: https://www.linkedin.com/in/roopa-karuti-080b7627/

 

Paolo Canofari – Developer at HCL Software

Paolo works as a developer for Workload Automation L3 Support at HCL Software, Rome. He is skilled in Java, Python, C/C++, Angular 8, and Docker/Kubernetes deployments.

LinkedIn: www.linkedin.com/in/paolo-canofari
