“Automate anything, Run Anywhere” is our motto!

What better place to start making it real than from our very own Workload Automation test environment?

We have been applying it in our Test Lab for Workload Automation version 9.5 since its general availability in 2019, and to all the subsequent fix packs, including the latest one, Fix Pack 3.

Let’s start discovering how we can leverage a Workload Automation deployment to run automation jobs that build our code and prepare packages for both our HCL and IBM brand offerings. The automation jobs trigger several automation suites running for each brand.

Every day, we run about 20,000 automated test scenarios that cover all the Workload Automation components (Dynamic Workload Console, master domain manager and its backup, the REST API layer, dynamic agents and fault-tolerant agents).

Rest assured that we always uphold backward compatibility and detect instability issues sooner rather than later! Our main goal is to avoid injecting defects and instability into the base functionalities used by most of our customers.

Let’s take a deep dive into the automation world!

How many test automation suites?

Fig. 1 – Automation suites daily flow

 

The Workload Automation solution includes the following test automation suites:

  • BVT suites (Build Verification Test suites that validate the health of the most recent available product component packages)
  • Installation and upgrade suite
  • REST API suite
  • Dynamic Workload Console (web-based User Interface) suite
  • Server suite
  • Agent suite

 

The latest results of the automation suites are available on our Jenkins site, gathered in the Aggregate Test Report. The Aggregate Test Report is a giant matrix where each row represents a Jenkins job that groups a list of test scenarios belonging to a specific suite, and each column corresponds to a date on which the Jenkins jobs run. The color of each cell is updated every time the corresponding Jenkins job completes its daily or weekly run, and indicates the percentage of failed test cases with respect to the total number of test cases.
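As a sketch of how such a matrix cell could be derived from a job's results (the color thresholds below are illustrative assumptions, not the actual Jenkins report configuration):

```python
def cell_color(failed, total):
    """Map one Jenkins job run to an Aggregate Test Report cell color.

    The color bands here are hypothetical; the real report may use
    different thresholds.
    """
    if total == 0:
        return "gray"  # job did not run or reported no tests
    failure_pct = 100.0 * failed / total
    if failure_pct == 0:
        return "green"
    if failure_pct <= 10:
        return "yellow"
    return "red"

# One matrix row per Jenkins job, one column per run date (sample data).
server_onprem_row = {
    "2021-02-01": cell_color(0, 341),   # all passed -> green
    "2021-02-02": cell_color(40, 341),  # ~12% failed -> red
}
```

A single glance at the row then tells the test automation specialist which daily runs need log analysis.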

The matrix has continually evolved, from the test phase of the version 9.5 General Availability through the test phase of the latest fix pack.

Fig. 2 – Extract of the Aggregate Test Report

 

Let’s look at the number of scenarios that run automatically every day for each automation suite.

 

BVT automation suites

BVT suites

  • Docker Installation, 2 test cases: Runs a subset of the scenarios of the CVT installation suite for Docker images.
  • Server on-prem, 341 test cases: Runs a subset of the scenarios of the CVT suite on a standalone server with different agents.
  • Server Docker, 277 test cases: Runs a subset of the server suite scenarios for Docker container images on a simple topology (a master, a backup master, and agents) with different agents.
  • Dynamic Workload Console, 5 test cases: Runs a subset of the scenarios of the CVT suite on a standalone console that is connected to a standalone server.
  • Agent, 1036 test cases: Runs the full set of CVT scenarios on a standalone agent.
  • REST API, 266 test cases: Runs a subset of the scenarios of the CVT suite.
  • Docker server for CI/CD container pipeline, 277 test cases: Runs a subset of server scenarios for Docker container images on a simple topology (a server, a backup server, and agents). This Jenkins job validates the server and agent containers for the CI/CD pipeline made available to fix the security vulnerabilities in the latest fix pack release.

Total: 2204 test cases

 

Fig. 3 – The BVT Aggregate Test Report section

Installation and upgrade automation suites

Installation and Upgrade suites

  • Dynamic agent, z-centric agent and fault-tolerant agent (IBM brand), 731 test cases:
      • Runs fresh installation coverage and variation scenarios across all supported OS. The scenarios vary in the installation input parameters used, which can be default or non-default, as well as required or optional.
      • Runs coverage and variation restore scenarios across all supported OS.
  • Dynamic agent and z-centric agent (HCL brand), 557 test cases:
      • Runs fresh installation coverage and variation scenarios across all supported OS. The scenarios vary in the installation input parameters used, which can be default or non-default, as well as required or optional.
      • Runs coverage and variation restore scenarios distributed across all supported OS.
  • Server (IBM and HCL brand), 913 test cases:
      • Runs coverage and variation installation scenarios distributed across multiple supported OS and relational DB types, for the master domain manager, backup master domain managers, and dynamic domain managers. The scenarios vary in the installation input parameters used, which can be default or non-default, as well as required or optional.
      • Runs coverage and variation upgrade/downgrade scenarios that install a previous version and apply a new fix pack version, or update from the previous fix pack version to the new one, for some relational DB types.
      • Runs installation and upgrade scenarios with SSL customization and custom certificates.
      • Runs installation and upgrade scenarios for Workload Automation Docker images saved in an internal shared repository.
  • Console (IBM and HCL brand), 335 test cases: Runs coverage and variation installation and upgrade scenarios distributed across multiple supported OS and relational DB types for the console component.

Total: 2536 test cases

 

Fig. 4 – The agent and Z Workload Automation agent (z-centric) Aggregate Test Report section

Fig. 5 – The Server installation and upgrade Aggregate Test Report section

Fig. 6 – The console installation and upgrade Aggregate Test Report section

 

REST APIs automation suite

REST API suites

  • REST API CVT, 2504 test cases: The automation suite runs a set of CVT test cases on odd days to cover the basic core functionalities of the product exposed by the master domain manager REST API.
  • REST API CVT Extended, 2462 test cases: The automation suite runs an additional set of CVT test cases on even days to cover new functionalities of the product exposed by the master domain manager REST API, which are added over time, release after release or fix pack after fix pack.

Total: 4966 test cases
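The odd/even-day alternation between the two REST API suites can be sketched as a simple date-parity check (the function name is hypothetical; the real trigger logic lives in the Jenkins job schedules):

```python
from datetime import date

def rest_api_suite_for(day: date) -> str:
    """Pick which REST API suite runs on a given calendar day.

    Sketch of the odd/even-day alternation described above; an
    assumption about how the scheduling could be expressed, not the
    actual Jenkins configuration.
    """
    if day.day % 2 == 1:
        return "REST API CVT"           # odd days: basic core functionalities
    return "REST API CVT Extended"      # even days: newer functionalities

suite = rest_api_suite_for(date(2021, 2, 1))  # "REST API CVT"
```

Alternating the two halves keeps the daily run time bounded while still exercising the full set of roughly 5,000 REST API test cases every two days.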

 

Fig. 7 – The Rest APIs Aggregate Test Report section

 

Server automation suite

Server suites

  • Server (IBM brand), 3348 test cases: The suite implements a set of coverage and variation test cases that cover the major functionalities of the server component. The available Jenkins jobs that run the suite cover the available OS and RDBMS for the server for the IBM brand. It also tests the Docker server image.
  • Server (HCL brand), 1340 test cases: The suite implements a set of coverage and variation test cases that cover the major functionalities of the server component. The available Jenkins jobs that run the suite cover the available OS and RDBMS for the server for the HCL brand. It also tests the Docker server image.

Total: 4688 test cases

 

Fig. 8 – The server Aggregate Test Report section

 

Dynamic Workload Console automation suite

Console automation suite

  • Console, 206 test cases × 4 jobs: By exploiting the functionality of the HCL OneTest tools, we implemented a set of coverage and variation test cases (206) that exercise the major functionalities of the Dynamic Workload Console. We created 4 jobs that mix the available OS and supported RDBMS for the console for both the IBM and HCL brands. The tests are executed using the most widely used browsers supported by Workload Automation.

Total: 824 test cases

 

Fig. 9 – The Console Aggregate Test Report section

 

Dynamic agent automation suite

Dynamic agent automation suites (aka LAR)

  • Dynamic agent CVT suites (IBM brand), 4273 test cases: Runs a set of coverage and variation scenarios for the dynamic agent that includes the scenarios for embedded job plug-ins or Automation Hub plug-ins. The available Jenkins jobs cover all the supported OS for the IBM brand.
  • Dynamic agent CVT suites (HCL brand), 1272 test cases: Runs a set of coverage and variation scenarios for the dynamic agent that includes the scenarios for embedded job plug-ins or Automation Hub plug-ins. The available Jenkins jobs cover all the supported OS for the HCL brand.

Total: 5545 test cases

 

Fig. 10 – The agent suites Aggregate Test Report section

 

What is the flow of the daily WA test automation suites?

 

Fig. 11 – Detailed Continuous testing journey

 

Every day, when a new product build is available, the automatic Jenkins workflow starts and performs the following steps:

  • The code for the test automation suites is extracted from the code repository (GitHub) and then a build for each test automation suite is performed.
  • The Workload Automation daily build and the build for the test automation suites are combined into the test repository.
  • The Workload Automation daily build is installed automatically on all the test machines that are part of the test automation infrastructure.
  • The test automation suites are deployed automatically on all the test machines that are part of the test automation infrastructure.
  • The execution of the test automation suites is invoked automatically.
  • At the end of each execution of the test automation suites, test logs are collected, and test results are published on the Jenkins Aggregate Report to be analysed by a test automation specialist.
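The steps above can be sketched as an ordered plan (all names here are hypothetical; the real flow is implemented as chained Jenkins jobs, not Python):

```python
def daily_pipeline(build_id, machines):
    """Return the ordered steps of the daily test workflow.

    A sketch of the sequence described above, under the assumption
    that each step is triggered only after the previous one completes.
    """
    steps = [
        f"checkout and build test suites for {build_id}",
        f"publish {build_id} and test suites to the test repository",
    ]
    for machine in machines:
        steps.append(f"install {build_id} on {machine}")
        steps.append(f"deploy test suites on {machine}")
    steps.append("run the test automation suites")
    steps.append("collect logs and publish results to the Aggregate Test Report")
    return steps

# Hypothetical build id and machine names, for illustration only.
plan = daily_pipeline("daily-build-123", ["mdm-host", "agent-host"])
```

The important design point is that every step is unattended: a new product build is the only trigger, and the first human involvement is the specialist reading the Aggregate Test Report.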

 

During the development phase (when the new product features are developed), the test automation suites are enriched to also cover the features that are being developed over time. The main objective of test automation during this phase is to increase the test coverage to include the new functionalities that are being added in the product version under test.

During the formal System Verification Test (SVT) phase, the new test automation test cases that were created during the development phase are merged into the pre-existing test automation suites.

The daily runs of these enriched test automation suites check the stability of the code, provide greater coverage as new tests are added to the regression package, and reduce test time compared to manual testing. The main objective of test automation during this phase is to demonstrate the stability of the product and to prevent potential backward compatibility issues, especially in the base features.

 

At the end of this journey, specific success criteria must be met before the quality of the product is considered good enough to be released on the market: each test automation suite must achieve a success rate higher than the result obtained for the previous release.

 

At the end of the Workload Automation 9.5 Fix Pack 3 development cycle, the success criteria were satisfied with the following values:

  • Dynamic Workload Console suite success rate: 100%
  • Installation suite success rate: 100%
  • REST API suite success rate: 99%
  • Agent suite success rate: 99%
  • Server suite success rate: 99%
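A minimal sketch of such a release gate, with the criterion read as "not lower than the previous release" (the previous-release figures below are invented sample values; the article does not report them):

```python
def meets_release_criteria(current, previous):
    """Release gate sketch: every suite's success rate must be at least
    as high as in the previous release. Hypothetical helper; the team's
    actual tooling is not described in the article.
    """
    return all(current[suite] >= previous.get(suite, 0) for suite in current)

# Success rates (%) from the 9.5 Fix Pack 3 cycle, as listed above.
current = {"Dynamic Workload Console": 100, "Installation": 100,
           "REST API": 99, "Agent": 99, "Server": 99}

# Illustrative sample baseline for the previous release (invented values).
previous_sample = {"Dynamic Workload Console": 99, "Installation": 100,
                   "REST API": 98, "Agent": 99, "Server": 97}

release_ok = meets_release_criteria(current, previous_sample)  # True here
```

If any suite regresses below its previous-release baseline, the gate fails and the release is held back until the regression is analyzed.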

 

Moreover, the test automation framework described in this article allowed the Workload Automation team to find and fix around 240 defects during the 9.5 Fix Pack 3 development cycle.

 

We hope this article is helpful in understanding the ecosystem in which the Workload Automation solution takes shape and evolves to satisfy our customers’ needs.

 

Learn more about Workload Automation here and get in touch with us by writing to HWAinfo@hcl.com

 

AUTHORS:

Serena Girardini

Serena is the Test and UX manager for the Workload Automation product in distributed environments. She joined IBM in 2000 as a Tivoli Workload Scheduler developer and was involved in the product relocation from the San Jose Lab to the Rome Lab during a short-term assignment in San Jose (CA). Over 14 years, Serena gained experience in the Tivoli Workload Scheduler distributed product suite as a developer, customer support engineer, tester and information developer. For a long time she covered the role of L3 fix pack release Test Team Leader and, in that period, she acted as a facilitator during critical situations and upgrade scenarios at customer sites. In her last 4 years at IBM, she became IBM Cloud Resiliency and Chaos Engineering Test Team Leader. She joined HCL in April 2019 as an expert Tester and was recognized as the Test Leader for the product porting to the most important Cloud offerings in the market. She has a Bachelor’s Degree in Mathematics.

Linkedin: https://www.linkedin.com/in/serenagirardini/

 

Valentina Fusco

Valentina joined HCL in July 2019 as a Junior Software Developer, working with the Verification Test team as an Automation tester of the Workload Automation suite in both distributed and cloud-native environments. She has a Master’s Degree in Electronic Engineering.

Linkedin: www.linkedin.com/in/valentinafusco1

 

 

Giorgio Corsetti

Giorgio works as a Performance Test Engineer in the Workload Automation team. In his role, he identifies bottlenecks in the architecture under analysis, assessing how the overall system handles specific loads, and works to increase customer satisfaction by providing feedback about performance improvements, through technical document publications, when new product releases become available. Since April 2018, he has also held the role of Test Architect for the Workload Automation product family. Giorgio has a degree in Physics and is currently based in the HCL Software Rome software development laboratory.

Linkedin: https://www.linkedin.com/in/giorgio-corsetti-8b13224/

 

 
