Oracle Data Integrator provides a fully unified solution for building, deploying, and managing complex data warehouses, or as part of data-centric architectures in an SOA or business intelligence environment. In addition, it combines all the elements of data integration (data movement, data synchronization, data quality, data management, and data services) to ensure that information is timely, accurate, and consistent across complex systems.

Oracle Data Integrator (ODI) features an active integration platform that includes all styles of data integration: data-based, event-based, and service-based. ODI unifies silos of integration by transforming large volumes of data efficiently, processing events in real time through its advanced Changed Data Capture (CDC) framework, and providing data services to the Oracle SOA Suite. It also provides robust data integrity control features, assuring the consistency and correctness of data. With powerful core differentiators (heterogeneous E-LT, Declarative Design, and Knowledge Modules), Oracle Data Integrator meets the performance, flexibility, productivity, modularity, and hot-pluggability requirements of an integration platform.

 

To leverage the benefits of the ODI plugin in Workload Automation, we have classified it into two categories:

  1. Oracle Data Integrator Scenario
  2. Oracle Data Integrator Load Plan

1. Oracle Data Integrator Scenario:

A scenario is the partially generated code (SQL, shell, etc.) for the objects (interfaces, procedures, etc.) contained in a package.

When a component such as an ODI interface or package has been created and tested, you can generate the scenario corresponding to its actual state.

Once generated, the scenario’s code is frozen, and all subsequent modifications of the package and/or data models which contributed to its creation will not affect it.

It is possible to generate scenarios for packages, procedures, interfaces, or variables. Scenarios generated for procedures, interfaces, or variables are single-step scenarios that execute the procedure or interface, or refresh the variable.
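Once generated, a scenario is typically launched through the standard `startscen` tool shipped with the ODI agent. As a hedged sketch (the agent path and the exact flag syntax are assumptions to verify against your ODI version), the command the plugin conceptually issues can be assembled like this:

```python
# Hedged sketch: builds the command line for ODI's standard startscen.sh tool.
# The bin path, the positional log level, and the -SESSION_NAME flag are
# assumptions for illustration; check them against your ODI agent documentation.
def build_startscen_cmd(scenario, version, context, log_level=5,
                        session_name=None, odi_bin="/odi/agent/bin"):
    cmd = [f"{odi_bin}/startscen.sh", scenario, str(version),
           context, str(log_level)]
    if session_name:
        cmd.append(f"-SESSION_NAME={session_name}")
    return cmd

cmd = build_startscen_cmd("LOAD_SALES", "001", "GLOBAL",
                          session_name="nightly")
print(" ".join(cmd))
```

Because the scenario's code is frozen, the same command can be replayed from any scheduler without the source package being present.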

2. Oracle Data Integrator Load Plan:

Oracle Data Integrator is often used for populating very large data warehouses. In these use cases, it is common to have thousands of tables being populated using hundreds of scenarios. The execution of these scenarios must be organized in such a way that the data throughput from the sources to the target is the most efficient within the batch window. Load Plans help the user organize the execution of scenarios into a hierarchy of sequential and parallel steps for these types of use cases.

An ODI Load Plan is an executable object in ODI that can contain a hierarchy of steps that can be executed conditionally, in parallel, or in series. The leaves of this hierarchy are scenarios. Packages, interfaces, variables, and procedures can be added to Load Plans for execution in the form of scenarios.
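The hierarchy idea can be illustrated with a small sketch (this is not the ODI engine, just a model): serial and parallel container steps nest freely, and the leaves name the scenarios to launch. A serial walk of the tree yields the launch order:

```python
# Illustrative sketch of a Load Plan hierarchy; step and scenario names are
# invented for the example. A real engine would fork "parallel" children
# concurrently; here we only flatten the tree into a launch order.
def run_step(step, launched):
    kind, payload = step
    if kind == "scenario":
        launched.append(payload)        # leaf: a scenario to execute
    else:                               # "serial" or "parallel" container
        for child in payload:
            run_step(child, launched)
    return launched

plan = ("serial", [
    ("scenario", "INIT_DIMENSIONS"),
    ("parallel", [("scenario", "LOAD_SALES"),
                  ("scenario", "LOAD_COSTS")]),
    ("scenario", "REFRESH_AGGREGATES"),
])
order = run_step(plan, [])
print(order)
```

In this model the dimension load completes first, the two fact loads can run side by side, and the aggregate refresh waits for both, which is exactly the kind of throughput-aware sequencing Load Plans exist to express.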

ODI Scenario:

Log in to the Dynamic Workload Console and open the Workload Designer. To create a new job, select the “Oracle Data Integrator Scenario” job type in the Cloud section.

 

Establishing a connection to ODI Studio 12c:

In the Connection tab, specify the URL, username, password, and work repository path of the configuration to let Workload Automation interact with ODI, then click Test Connection. A confirmation message is displayed when the connection is established.
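A quick way to see what the connection step needs is to check that all four details are present before attempting Test Connection. The field names in this sketch are illustrative, not the plugin's actual property keys:

```python
# Hedged sketch: the four connection details the plugin asks for, with a
# minimal completeness check. Key names and the sample agent URL are
# assumptions for illustration only.
def validate_connection(cfg):
    required = ("url", "username", "password", "work_repository")
    return [k for k in required if not cfg.get(k)]

cfg = {"url": "http://odi-host:20910/oraclediagent",
       "username": "SUPERVISOR",
       "password": "***",
       "work_repository": "WORKREP1"}
print(validate_connection(cfg))  # [] means nothing is missing
```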

Create the scenario:

In the Action tab, specify the scenario details to create the scenario.

Provide the scenario name, version, context, log level, session name, and synchronous execution details.
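Gathered together, those details form the job's scenario definition. The key names in this sketch are hypothetical (they are not the plugin's real schema), but they show how the Action tab fields map onto one payload:

```python
# Hypothetical sketch: the Action tab fields for a scenario job collected into
# a single JSON-style payload. Key names are assumptions for illustration.
import json

def scenario_job(name, version, context, log_level=5,
                 session=None, synchronous=True):
    return {"scenarioName": name,
            "scenarioVersion": version,
            "context": context,
            "logLevel": log_level,
            "sessionName": session,
            "synchronous": synchronous}

job = scenario_job("LOAD_SALES", "001", "GLOBAL", session="nightly")
print(json.dumps(job, indent=2))
```

Running synchronously lets the Workload Automation job reflect the actual outcome of the ODI session rather than just its submission.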

Submitting your job:

Submit your job into the current plan. You can add the job to a job stream that automates your business process flow. Select the action menu in the top-left corner of the job definition panel and click Submit Job into Current Plan. A confirmation message is displayed, and you can switch to the Monitoring view to see what is going on.

Track/Monitor your Job:

You can also easily monitor the submitted job in Workload Automation by navigating to the “Monitor Workload” page.

 

Select the job and click the Job Log option to view the logs of the Oracle Data Integrator Scenario job. Here, you can see that the scenario has been created successfully.

ODI Load Plan:

Log in to the Dynamic Workload Console and open the Workload Designer. To create a new job, select the “Oracle Data Integrator Load Plan (9.5.0.02)” job type in the Cloud section.

Establishing a connection to ODI Studio 12c:

In the Connection tab, specify the WSDL URL, username, password, and work repository path of the configuration to let Workload Automation interact with ODI, then click Test Connection. A confirmation message is displayed when the connection is established.

Create the Load Plan:

In the Action tab, specify the Load Plan details to create the ODI Load Plan.

Provide the Load Plan name, context code, and log level details.
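Mirroring the scenario case, ODI ships a standard `startloadplan` tool that takes exactly these details. As a hedged sketch (the bin path and argument order are assumptions to verify against your ODI version):

```python
# Hedged sketch: the command line for ODI's standard startloadplan.sh tool,
# built from the Load Plan name, context code, and log level. The bin path
# and the positional log level are assumptions for illustration.
def build_startloadplan_cmd(load_plan, context, log_level=5,
                            odi_bin="/odi/agent/bin"):
    return [f"{odi_bin}/startloadplan.sh", load_plan, context, str(log_level)]

cmd = build_startloadplan_cmd("LP_DWH_NIGHTLY", "GLOBAL")
print(" ".join(cmd))
```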

Submitting your job:

Submit your job into the current plan. You can add the job to a job stream that automates your business process flow. Select the action menu in the top-left corner of the job definition panel and click Submit Job into Current Plan. A confirmation message is displayed, and you can switch to the Monitoring view to see what is going on.

 

 

Track/Monitor your Job:

You can also easily monitor the submitted job in Workload Automation by navigating to the “Monitor Workload” page.

ODI Load Plan workflow details:

Here we can see the Workload Scheduler workflow details.

 

The ODI Load Plan and ODI Scenario plugins in Workload Automation are therefore a best fit for those looking to execute Load Plans and scenarios in ODI Studio.

Are you curious to try out the ODI plugin? Download the integrations from the Automation Hub and get started, or drop us a line at santhoshkumar.kumar@hcl.com.

Author Bios

Rabic Meeran K

Technical Specialist at HCL Technologies

 

Responsible for developing integration plug-ins for Workload Automation. Hands-on with different programming languages and frameworks such as Java, JPA, Spring Boot, microservices, MySQL, Oracle RDBMS, Ruby on Rails, Jenkins, Docker, AWS, C, and C++.

https://www.linkedin.com/in/arka-mukherjee-a6b924b4/

 

Saroj Kumar Pradhan

Senior Developer at HCL Technologies

 

Responsible for developing integration plug-ins for Workload Automation. Hands-on with different programming languages and frameworks such as Java, JPA, Spring Boot, microservices, MySQL, Oracle RDBMS, Ruby on Rails, Jenkins, Docker, AWS, C, and C++.

 

Saket Saurav

Tester (Senior Engineer) at HCL Technologies

 

 

Responsible for testing different plugins for Workload Automation using the Java Unified Test Automation Framework. Hands-on experience with the Java programming language and Web Services, and with databases such as Oracle and SQL Server.

 

 

 

 
