Let us begin by understanding the Google Cloud Function plugin and how it benefits our workload automation users.

“The GCP Cloud Function plugin helps you invoke GCP Cloud Functions in a synchronous way and also monitor the job.”

Cloud Functions requires no servers to provision, manage, or upgrade. It automatically scales up based on the load and has integrated monitoring, logging, and debugging capabilities. It offers built-in security at the role and per-function level based on the principle of least privilege, as well as networking capabilities for hybrid and multi-cloud scenarios.

Serverless Computing:

  • Serverless computing is an execution model for the cloud.
  • A cloud provider dynamically allocates and then charges the user for only the resources needed to execute a particular piece of code.
  • No server management: provisioning and maintenance are taken care of by the provider.
  • Minimizes cost: Developers are only charged for the resources they use.
  • Serverless architectures are inherently scalable.
  • Quick deployments and updates are possible.

GCP Cloud Functions

The GCP Cloud Function includes the following features:

  • Google Cloud Platform (GCP) Cloud Functions is a function-as-a-service platform.
  • It allows you to develop, run, and manage application functionality without the complexity of the infrastructure associated with developing an app.
  • Functions automatically scale and are highly available and fault-tolerant.
  • Cloud Functions can be written in Node.js, Python, Go, Java, .NET, Ruby, and PHP, and are executed in language-specific runtimes.
  • It has integrated monitoring, logging, and debugging capability.
  • It helps you monitor the solution from a single point of control.
  • There are two distinct types of cloud functions:
    • HTTP functions
    • Event-driven functions
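The two types differ in their signatures. As a rough sketch in plain Python (hypothetical handler names; real deployments use Google's Functions Framework decorators rather than bare callables), an HTTP function receives a request and returns a response, while an event-driven function receives an event payload plus context metadata:

```python
import json

# Illustrative sketch only: plain Python stand-ins for the two
# Cloud Function styles, not deployable function definitions.

def http_handler(request_json):
    # HTTP function: receives a request body, returns a response body.
    name = request_json.get("name", "world")
    return json.dumps({"message": f"Hello, {name}!"})

def event_handler(event, context=None):
    # Event-driven function: receives an event payload (e.g. from
    # Pub/Sub or Cloud Storage) plus metadata; invoked by the platform.
    return f"Processed event id {event.get('id', 'unknown')}"

print(http_handler({"name": "HWA"}))
print(event_handler({"id": "evt-123"}))
```

The plugin invokes such functions synchronously, so the returned value is what the Monitor step can retrieve as the function output.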

Plugin Use Case / Functionalities

  • Invoke – Invoke GCP Cloud Functions in a synchronous way.
  • Monitor – Get the function output state: retrieve the function output object if execution succeeds, or catch errors if the execution fails.

Figure 1 GCP Cloud Function Console

Figure 2 GCP Cloud Function Console

Installation & Prerequisite

You must have access to:

  • GCP Console
  • Service Account Credentials

Now let us walk through the plugin's job definition parameters.

GCP Cloud Function Plugin

Log in to the Dynamic Workload Console and open the Workload Designer. Choose to create a new job and select “GCP Cloud Function Plugin” job type in the Cloud section.

Figure 3 Job Definition

Connection Tab

Establishing connection to the GCP Cloud.

Connection Info

Use this section to connect to Google Cloud.

GCP Default Credentials – Select this option if you want the system to fetch default credentials from the system configuration.

Note: The GOOGLE_APPLICATION_CREDENTIALS environment variable has to be configured for default access.
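For example, on Linux you could export the variable before starting the agent; the key file path below is a placeholder for your own service account key file:

```shell
# Point ADC at a service account key file (placeholder path).
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/hwa-gcp-key.json"
echo "$GOOGLE_APPLICATION_CREDENTIALS"
```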

GCP Server Credentials – Select this option to enter the GCP credentials manually.

Service Account – The service account associated with your GCP Cloud Function account. Click the Select button to choose the service account in the cloud console. Select an item from the list; the selected item is displayed in the Service Account field.

Note: This attribute is required. The service account is the identity of the service, and the service account’s permissions control which resources the service can access.

Project ID – The project ID is a unique identifier associated with each project. It is mandatory and unique for each service account.

Test Connection – Click to verify if the connection to the Google Cloud works correctly.

Figure 4 Connection Tab

GCP Default Credentials

  • Google Cloud Application Default Credentials (ADC) are not credentials. ADC is a strategy to locate Google Cloud Service Account credentials.
  • If the environment variable GOOGLE_APPLICATION_CREDENTIALS is set, ADC will use the filename that the variable points to for service account credentials. This file is a Google Cloud Service Account credentials file in JSON format.
  • Otherwise, ADC looks for a JSON file in a location known to the gcloud command-line tool.
  • On Windows, it is %APPDATA%/gcloud/application_default_credentials.json.
  • On other systems, it is $HOME/.config/gcloud/application_default_credentials.json.
  • The default service account is used for credentials if the application is running on Compute Engine or App Engine.
  • If the previous steps do not find valid credentials, ADC will fail, and an error occurs.

Figure 5 Connection Tab

Action Tab

Use this section to define invocation options for GCP cloud function.

Invoking Information

Region

The region where the GCP cloud function is located (for example, us-east1).

Function Name

Click the Select button to specify the name of the GCP cloud function to be invoked. The selected name appears in the Function Name field.

Function Input Event

Input From File

Select this option if the GCP cloud function input event is stored in a file (for example, a text or JSON file).

File Path

This field is available only if Input From File has been checked. Provide the absolute path to the file that contains the event payload. The event payload must be valid JSON.
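Because the plugin expects the file content to be valid JSON, it can be worth validating the payload file before defining the job. A minimal sketch (the file name and payload are placeholders):

```python
import json
import pathlib

def is_valid_json_payload(path):
    """Return True if the file at `path` contains parseable JSON."""
    try:
        json.loads(pathlib.Path(path).read_text(encoding="utf-8"))
        return True
    except (OSError, ValueError):
        return False

# Example with a placeholder payload file.
pathlib.Path("event.json").write_text('{"orderId": 42}', encoding="utf-8")
print(is_valid_json_payload("event.json"))  # True
```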

Manual Input

Select this option to provide the GCP cloud function input event manually.

Input Body

This field is available only if Manual Input has been checked. Provide the JSON to send as the function input when invoking the GCP cloud function.
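For example, a simple input event for a hypothetical order-processing function might look like:

```json
{
  "orderId": "42",
  "action": "process",
  "notify": true
}
```

The keys shown here are illustrative; the actual structure depends on what your cloud function expects.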

Save Function Output to File

Select the check box to save the function output.

Output File Path

Provide the location to save the function output files.

Figure 6 Action Tab

Figure 7 Action Tab – Application List

Figure 8 Action Tab – File Path

Submitting your job

It is time to submit your job into the current plan. You can add your job to the job stream that automates your business process flow. Select the action menu in the top-left corner of the job definition panel and click Submit Job into Current Plan. A confirmation message is displayed, and you can switch to the Monitoring view to see what is going on.

Figure 9 Job Log

Figure 10 Monitor Job

Are you curious to try out the GCP Cloud Function plugin? Download the integrations from the Automation Hub and get started or drop a line at santhoshkumar.kumar@hcl.com

Learn more about HCL Workload Automation here or drop us a line at HWAinfo@hcl.com.

Automate Anything, Run Anywhere!
