Let us begin by understanding what Google Cloud Storage is all about before moving on to our GCP Cloud Storage plugin and how it benefits Workload Automation users.

Cloud Storage is a service for storing your objects in Google Cloud. An object is an immutable piece of data consisting of a file of any format. You store objects in containers called buckets.
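The bucket-and-object model can be pictured with a small sketch. This is a hypothetical in-memory stand-in for illustration only, not the real google-cloud-storage client: it mimics the fact that objects are immutable, so "changing" an object actually stores a new generation under the same name, much like GCS object generations.

```python
# Hypothetical in-memory sketch of the Cloud Storage data model
# (illustration only -- not the real google-cloud-storage client).
# Objects are immutable: overwriting a name adds a new generation
# rather than editing the stored bytes in place.

class Bucket:
    def __init__(self, name):
        self.name = name          # bucket names are globally unique in real GCS
        self._objects = {}        # object name -> list of generations (bytes)

    def upload(self, object_name, data):
        """Store a new, immutable generation of the object; return its number."""
        self._objects.setdefault(object_name, []).append(bytes(data))
        return len(self._objects[object_name])

    def download(self, object_name):
        """Return the latest generation of the object."""
        return self._objects[object_name][-1]

photos = Bucket("example-photos")
photos.upload("cat.png", b"v1")
photos.upload("cat.png", b"v2")   # does not mutate v1; adds a second generation
print(photos.download("cat.png"))  # prints b'v2'
```

In the real service the same idea appears as object generations, which is also what the Object Versioning feature builds on.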




After creating a project, users can create Cloud Storage buckets, upload objects to buckets, and download objects from buckets. Users can also grant permissions to make data accessible to specified members, or – for certain use cases such as hosting a website – accessible to everyone on the public internet.

Here is how the Cloud Storage structure can apply to a real-world case:

  • Organization: Your company, called Example Inc., creates a Google Cloud organization called exampleinc.org.
  • Project: Example Inc. is building several applications, and each one is associated with a project. Each project has its own set of Cloud Storage APIs, as well as other resources.
  • Bucket: Each project can contain multiple buckets, which are containers to store your objects. For example, you might create a photos bucket for all the image files your app generates and a separate videos bucket.
  • Object: An individual file stored in a bucket, such as an image in the photos bucket.
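Because bucket names are globally unique, the organization and project do not appear when addressing an object; a bucket plus an object name is enough. A hypothetical helper composing the common gs:// URI form:

```python
# Hypothetical helper: compose the gs:// URI that addresses an object
# inside a bucket. No project or organization segment is needed,
# because bucket names are globally unique.
def gs_uri(bucket: str, object_name: str) -> str:
    return f"gs://{bucket}/{object_name}"

print(gs_uri("example-photos", "2022/06/cat.png"))
# prints gs://example-photos/2022/06/cat.png
```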

Once you upload objects to Cloud Storage, you have fine-grained control over how the data is secured and shared. Here are some ways to secure data inside Cloud Storage:

  • Identity and Access Management
  • Data encryption
  • Authentication
  • Bucket Lock
  • Object Versioning

Now let us move on to the plugin itself, starting with the job definition parameters.

Log in to the Dynamic Workload Console and open the Workload Designer. Create a new job and select the “GCP CloudStorage” job type in the Cloud section.



Establishing connection to the Google Cloud server:

Use this section to connect to Google Cloud.

Service Account – The service account associated with your GCS account. Click the Select button to choose the service account defined in the cloud console.

Note: This attribute is required. The service account is the identity of the service, and the service account’s permissions control which resources the service can access.

Project ID – The project ID is a unique name associated with each project. It is mandatory and unique for each service account.

Test Connection – Click to verify if the connection to the Google Cloud works correctly.
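Before pasting a Project ID into the job definition, it can help to sanity-check its format. The sketch below is a hypothetical pre-check, not part of the plugin; the pattern follows Google's documented project ID rules (6–30 characters, lowercase letters, digits, and hyphens; must start with a letter and cannot end with a hyphen).

```python
import re

# Hypothetical pre-check for a GCP Project ID (not part of the plugin).
# Rules assumed from Google's documentation: 6-30 chars, lowercase
# letters, digits and hyphens only, starts with a letter, does not
# end with a hyphen.
PROJECT_ID_RE = re.compile(r"^[a-z][a-z0-9-]{4,28}[a-z0-9]$")

def is_valid_project_id(project_id: str) -> bool:
    return bool(PROJECT_ID_RE.fullmatch(project_id))

print(is_valid_project_id("example-inc-prod"))  # True
print(is_valid_project_id("BadID"))             # False: uppercase letters
print(is_valid_project_id("abc"))               # False: shorter than 6 chars
```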




In the Action tab, specify the bucket name and the operation you want to perform.

  • Bucket Name – Specify the name of the bucket in which the objects are stored. Click the Select button to choose a bucket name defined in the cloud console.
  • Select Operations – Use this section to either upload or download objects.
      • Upload Object – Click this radio button to upload objects to Cloud Storage.
          • Object Name – Enter the name of the object to be uploaded or the path of the file stored. Click the Select button to choose an object name defined in the cloud console.
          • Source File(s) – Displays the path of the source file. You can use the filter option to streamline your search.
          • If a file already exists – Select the action to perform if the uploaded file already exists in Cloud Storage.
              • Replace – Replaces the existing file.
              • Skip – Skips the upload of the selected file.
      • Download Object – Click this radio button to download objects from Cloud Storage.
          • Object Name – Enter the name of the object to be downloaded. Click the Select button to choose an object name defined in the cloud console.
          • GCP Cloud File(s) – Displays the path of the file in Cloud Storage. You can use the filter option to streamline your search.
          • Delete from GCS – Select this check box to delete the downloaded object from Google Cloud Storage.
          • File Path – Provide the destination location for the downloaded objects. Click the Select button to choose the destination path.
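The “If a file already exists” options above boil down to a simple decision. Here is a hypothetical sketch of that logic, using a plain dict as a stand-in for the destination bucket; the plugin's actual implementation may differ.

```python
# Hypothetical sketch of the plugin's "If a file already exists" choices.
# A plain dict stands in for the destination bucket; this is not the
# plugin's actual code.

def upload_with_policy(bucket: dict, name: str, data: bytes, policy: str) -> str:
    """policy is 'replace' or 'skip', mirroring the plugin's radio buttons."""
    if name in bucket:
        if policy == "skip":
            return "skipped"          # leave the existing object untouched
        if policy == "replace":
            bucket[name] = data       # overwrite the existing object
            return "replaced"
        raise ValueError(f"unknown policy: {policy}")
    bucket[name] = data               # no conflict: a normal upload
    return "uploaded"

dest = {"report.csv": b"old"}
print(upload_with_policy(dest, "report.csv", b"new", "skip"))     # prints skipped
print(upload_with_policy(dest, "report.csv", b"new", "replace"))  # prints replaced
print(upload_with_policy(dest, "logo.png", b"img", "skip"))       # prints uploaded
```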


Submitting your job

It is time to submit your job into the current plan. You can add the job to a job stream that automates your business process flow. Select the action menu in the top-left corner of the job definition panel and click Submit Job into Current Plan. A confirmation message is displayed, and you can switch to the Monitoring view to see what is going on.

Monitor Page

Job Log Details


Workflow Page



Are you curious to try out the GCP CloudStorage plugin? Download the integration from Automation Hub and get started, or drop us a line at santhoshkumar.kumar@hcl.com.
