
EPM Automate in Oracle Planning and Budgeting Cloud Service (PBCS)


Author: Tom Blakeley, Performance Architects

EPM Automate is the utility used in Oracle Planning and Budgeting Cloud Service (PBCS) to manage automation routines, including uploading data files; running Oracle Hyperion Financial Data Quality Management Enterprise Edition (FDMEE) data rules; building dimensions; and managing the application. On most projects, we spend a good chunk of time with this utility, setting business users up for success with automation routines that refresh key components on a nightly basis. While batch scripting can get fairly complex quickly, there is a series of basic commands that any administrator can pull together to start their own automation routines. I’ve included some of these here, along with likely use cases, to help folks get started.

Case #1: On a nightly basis, an aggregation business rule runs to calculate the Planning application, and then a map reporting job pushes data from one plan type to another.

Script Steps: Login, Run Business Rule, Run Map Reporting Application, Logout

As you can see in the screenshot below, I’ve created a very basic script that logs into the EPM environment and then launches the business rule, followed by the map reporting job. Nothing fancy here; the script moves to the next step even if the prior step fails.

[Screenshot: EPM Automate script for Case #1]
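If you’re following along without the screenshot, a minimal batch sketch of the same routine looks like this. The credentials, URL, identity domain, and rule and map names are illustrative placeholders, not the values from the screenshot; the password file is one created ahead of time with the encrypt command.

  REM Case #1: nightly aggregation, then a plan type map push (all names illustrative)
  SET EPMURL=https://planning-mydomain.pbcs.us2.oraclecloud.com
  CALL epmautomate login serviceadmin C:\Automation\password.epw %EPMURL% mydomain
  REM Launch the aggregation business rule
  CALL epmautomate runbusinessrule AggregateAll
  REM Push data from one plan type to another
  CALL epmautomate runplantypemap "Push to Reporting" ClearData=false
  CALL epmautomate logout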

Case #2: Each weekend, an automation script needs to run that uploads a new metadata file, runs a cube refresh, and runs a business rule to prep the cube for the next week.

Script Steps: Login, Run File Upload, Run Metadata Import, Run Cube Refresh, Run Business Rule, Logout

As you can see in the screenshot below, I’ve created a script that performs the initial login to the EPM environment and then initiates a file transfer from the on-premise file system up to the cloud inbox. Next, I’ve launched a metadata import using the file I just placed in the inbox, followed by a cube refresh to push my metadata changes down to the underlying Essbase database. Finally, I’ve executed the business rule and logged out.

[Screenshot: EPM Automate script for Case #2]
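Here, too, is a rough batch sketch of the routine; the file, job, and rule names are placeholders of my own.

  REM Case #2: weekend metadata refresh (all names illustrative)
  CALL epmautomate login serviceadmin C:\Automation\password.epw %EPMURL% mydomain
  REM Transfer the metadata file from the on-premise file system to the cloud inbox
  CALL epmautomate uploadfile C:\Automation\Entity.csv
  REM Import metadata using the file just placed in the inbox
  CALL epmautomate importmetadata "Import Entity" Entity.csv
  REM Push the metadata changes down to the underlying Essbase database
  CALL epmautomate refreshcube
  REM Prep the cube for the next week
  CALL epmautomate runbusinessrule PrepNextWeek
  CALL epmautomate logout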

Case #3: Each night we need to run an FDMEE data load and then run an aggregation script on the cube. This refreshes the system with updated financial data and preps it for the next day of planning and reporting.

Script Steps: Login, Run File Upload to the FDMEE Data Load Folder, Run FDMEE Data Rule, Run Business Rule, Logout

In the script below, I’ve first shipped my data file up to the Oracle Cloud instance from my local on-premise environment. Once it’s in the cloud, I’ve processed the data file using an FDMEE data rule called “Monthly Financials.” From there, I’ve launched the cube aggregation and logged out.

[Screenshot: EPM Automate script for Case #3]
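A final batch sketch; only the “Monthly Financials” rule name comes from the post, while the file name, Data Management folder, periods, and load modes are illustrative assumptions.

  REM Case #3: nightly FDMEE load, then aggregation (names and periods illustrative)
  CALL epmautomate login serviceadmin C:\Automation\password.epw %EPMURL% mydomain
  REM Upload the data file to the FDMEE data load folder instead of the default inbox
  CALL epmautomate uploadfile C:\Automation\actuals.csv inbox/MonthlyFinancials
  REM Process the file with the "Monthly Financials" data rule
  CALL epmautomate rundatarule "Monthly Financials" Jan-16 Jan-16 REPLACE STORE_DATA actuals.csv
  REM Aggregate the cube for the next day of planning and reporting
  CALL epmautomate runbusinessrule AggregateAll
  CALL epmautomate logout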

In the above three examples, I’m just using basic commands to move data around and run rules. Typically, we would then want to wrap these commands in error detection and logging components; we might also integrate them with other local routines to provide end-to-end automation.
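As a quick taste of that wrapping, EPM Automate returns a non-zero exit code when a command fails, so a batch script can test %ERRORLEVEL% after each call; the log path and rule name here are illustrative.

  REM Minimal error detection: log the failure and stop instead of plowing ahead
  CALL epmautomate runbusinessrule AggregateAll
  IF %ERRORLEVEL% NEQ 0 (
    ECHO %DATE% %TIME% AggregateAll failed with exit code %ERRORLEVEL% >> C:\Automation\epm.log
    CALL epmautomate logout
    EXIT /B 1
  )

I’ll cover this in more depth in another post, so for now, happy scripting.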

 

