In this blog, we walk you through enabling session affinity for the Dynamic Workload Console (console) deployed in a Google Kubernetes Engine (GKE) cluster that uses an HTTP(S) load balancer network service together with a managed Google Cloud SQL for SQL Server database instance. In a GKE cluster environment, a backend service defines how the HTTP(S) cloud load balancing network service distributes incoming traffic. By default, new connections are distributed using a hash calculated from five pieces of information: the client's IP address, the source port, the load balancer's internal forwarding rule IP address, the destination port, and the protocol. You can modify the traffic distribution method for HTTP(S) traffic by specifying a session affinity option.
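As a sketch of what the blog covers: in GKE, session affinity for an HTTP(S) load balancer backend is typically enabled through a BackendConfig resource. The resource name below is an illustrative placeholder, and client-IP affinity is shown as just one of the available affinity types; the blog's actual settings may differ.

```yaml
# Illustrative BackendConfig enabling client-IP session affinity for the
# console's backend service; the metadata name is a placeholder.
apiVersion: cloud.google.com/v1
kind: BackendConfig
metadata:
  name: console-backendconfig
spec:
  sessionAffinity:
    affinityType: "CLIENT_IP"
```

The Service fronting the console would then reference this BackendConfig through the `cloud.google.com/backend-config` annotation so that the load balancer applies the affinity setting to its backend service.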
As human beings, we have always been fascinated by the unknown, and we need to understand it, interpret it, and draw conclusions from it. Today, however, living in an increasingly hyper-technological, interconnected world where we collect billions upon billions of data points, we struggle because our ability to acquire data exceeds our ability to give meaning to it. Beyond the data itself, visualization becomes crucial for driving root cause analysis, explaining concepts, and extracting useful insights. Visualization can also make data understandable to people who are not data experts.
Zowe and its major components, the Web UI, API Mediation Layer, and CLI, are likely to become the new interface for the next generation of mainframers. The Zowe framework is the bridge that connects modern applications with the mainframe by providing easier interoperability and scalability among products and solutions offered by multiple vendors. Developers, testers, operators, and any other professional in the mainframe realm can easily create their own tools to automate tasks that would usually be done manually or through mainframe-native tools. They can build, modify, and debug z/OS applications even with limited z/OS expertise.
The SAP batch access method enables communication between an external SAP system and Workload Automation and provides a single point of entry for automating the launching of jobs, monitoring the status of jobs, and managing exceptions and recovery. "…What steps do you need to follow to connect your Workload Automation environment to the target SAP system faster?…"
Are you familiar with docker-compose in the Workload Automation (WA) deployment process? It’s about time you started using it to deploy Workload Automation containers. The 3 “S” joint venture of WA and Docker Compose stands by the following slogans: Simplicity? No more than 5 steps. Speed? Just enough time for a coffee break. Security? No worries, we take care of everything.
Using Workload Automation integrated with SAP®, you can create, schedule, and control SAP jobs and monitor your SAP landscape. SAP jobs run on application servers that host work processes of type batch. Critical batch jobs run in specific time frames, on specific application servers. With SAP Basis version 6.10 and later, application servers can be assigned to server groups. With Workload Automation, you can assign a server group to a job and, by leveraging the Job Throttling feature, manage all SAP background processes from several applications on one or more servers in heterogeneous environments. In this way, when a job is launched, the SAP system runs it on an application server that belongs to the specified group, balancing the workload among the various application servers.
Event Rules are an extension of Workload Automation (WA) capabilities that enable events occurring outside the scheduling environment to trigger actions on scheduling objects within WA. An ideal use of this capability is to detect the arrival of a file and then trigger an action to submit a Job Stream containing jobs that process the data in that file. This capability has been available for a while and is widely used. In this article, we explore a hidden feature: the name of the file and other file-related properties are passed as variables to a Variable Table associated with the Job Stream submitted as an action, so that any job within that Job Stream can retrieve those variables and process the data in the file.
Can’t get enough of automating your business processes? We have what you are looking for! The Ansible plug-in is available on Automation Hub; download it to empower your Workload Automation environment.