
Preventing Oracle Planning and Budgeting Cloud Service (PBCS) Data Load Errors Before They Happen


Author: Richard Pallotta, Performance Architects

We recently worked with a client to migrate their on-premise Oracle Hyperion Planning application to Oracle Planning and Budgeting Cloud Service (PBCS). Over the years, they had developed a supporting infrastructure of business processes that worked well for them. Our concern was to mitigate any disruptive process changes that the transition could introduce, since these would pose risks to a successful implementation. As a result, we wanted to carry familiar concepts and processes over to the new platform to ease the platform change.

One of the “nice things” I liked about on-premise Essbase data loads was the error rejection logging feature, which created a log file listing any dimension members in the fact data load file that were not in the Essbase outline. I’ve seen many creative ways to incorporate that log into data update processes. The manual data upload feature in PBCS does not offer this capability and simply errors out on the first rejected member. In other words, it displays the offending member but doesn’t commit any of the uploaded data to Essbase, which is probably a good thing. My suspicion is that the fact data is loaded to a temporary relational table and analyzed as a back-end process before a commit is made. In my opinion, though, this inhibits the process flow more than the old Essbase behavior did.

Our new approach leverages the customer’s existing process: it simply compares metadata files downloaded from PBCS against the newest data files and identifies potential dimension member rejects. That way, the PBCS administrator can update the PBCS dimensions before the data is actually loaded.

There are certainly other methods than the one we implemented here: pushing data quality checks (which is essentially what this is) upstream, closer to the source system; using EPM Automate to automate and integrate the entire update process; going old-school with Excel VLOOKUPs or other manual methods; or probably a dozen other hybrid solutions (all of which are a testament to the powerful nature of Oracle’s framework philosophy for PBCS).

I’m a big fan of developing simple processes using a minimum of tools, and this one uses only Windows batch scripts and SQLite. If you’re not familiar with SQLite, it’s a core technology in every Android and iPhone device, and it also manages things like your passwords, browsing history, and other stored artifacts in desktop browsers such as Safari, Chrome, and Firefox. It’s an open-source, cross-platform tool, and if your client or company is okay with using the world’s most widely deployed database, it’s a great solution that requires zero installation: you drop the executable in a folder or directory and it’s good to go.
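For example, after downloading the sqlite3 command-line shell from sqlite.org into your working folder, you can confirm it runs with a one-liner (the folder name here is just a placeholder used throughout the examples below):

    C:\PBCS_Validate> sqlite3.exe --version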

Windows batch scripts manage the entire process, which can be scheduled or run manually by the administrator. SQLite reads the SQL queries, which are stored as text files, and imports the text data files into tables in a database file that I also chose to store in the same folder. The process creates log files and produces the results in a folder structure of your choosing, created off a root folder.
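For reference, here is the kind of layout I mean (the names are illustrative, not our exact implementation):

    C:\PBCS_Validate\
        sqlite3.exe          (the SQLite command-line shell)
        validate.db          (the SQLite database file)
        run_validate.bat     (the batch script that drives the process)
        data\                (the newest fact data files)
        metadata\            (dimension member files downloaded from PBCS)
        sql\                 (the SQL query scripts SQLite reads)
        results\             (kickout reports)
        logs\                (run logs)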

The SQL scripts are text files that SQLite reads and processes. The main script creates the table to hold the kickouts and does the compare; a few summary results are then displayed on the screen as well as written to the results files.
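Here is a minimal sketch of what such a script can look like; the table, column, and file names are assumptions for illustration, and your dimension columns and file layouts will differ:

    -- compare.sql: rebuild the staging tables on every run
    .mode csv
    DROP TABLE IF EXISTS dim_members;
    DROP TABLE IF EXISTS fact_data;
    DROP TABLE IF EXISTS kickouts;
    CREATE TABLE dim_members (member TEXT);
    CREATE TABLE fact_data (entity TEXT, account TEXT, period TEXT, amount REAL);

    -- Import the PBCS metadata download and the newest fact file
    -- (skip or strip header rows first if your files include them)
    .import metadata/Entity.csv dim_members
    .import data/fact_data.csv fact_data

    -- Kickouts: members in the fact file that are not in the PBCS dimension
    CREATE TABLE kickouts AS
    SELECT DISTINCT f.entity AS member
    FROM fact_data f
    LEFT JOIN dim_members d ON f.entity = d.member
    WHERE d.member IS NULL;

    -- Quick summary to the screen
    SELECT COUNT(*) || ' potential member reject(s) found' FROM kickouts;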

Manually running the process is simply done from a command window.
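For example, assuming the hypothetical layout above:

    C:\PBCS_Validate> run_validate.bat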

I created a few prompts along the way so the administrator can monitor the job’s progress.
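A stripped-down sketch of what that batch script might look like (again, the folder and file names are hypothetical):

    @echo off
    REM run_validate.bat: drives the compare and tells the admin what's happening
    cd /d C:\PBCS_Validate
    echo Starting PBCS member validation at %date% %time%
    echo Importing metadata and fact files and running the compare...
    REM Errors go to the log; query results still display on screen
    sqlite3.exe validate.db < sql\compare.sql 2>> logs\run.log
    echo Done. Review results\kickouts.txt for any potential rejects.
    pause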

SQLite can easily handle millions of data rows in just a few seconds. When the process finishes, a summary of the results is displayed on screen, and the detailed kickout list is also written to a text file in the results folder.
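Both outputs can be driven from the tail end of the same SQL script, using SQLite’s .output dot-command to redirect query results to a file and then back to the screen (the file and table names are the same hypothetical ones as above):

    -- Write the detailed kickout list to the results file...
    .output results/kickouts.txt
    SELECT member FROM kickouts ORDER BY member;
    -- ...then switch back to the screen (.output with no argument resets to stdout)
    .output
    SELECT 'Kickout report written to results/kickouts.txt';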

To summarize: this is a handy data quality validation tool that’s portable and easily modified for quick deployment, because it consumes a minimum of resources and can be run manually or with a scheduler. The approach is also cross-platform (substitute shell scripts for the Windows batch scripts) and offers a familiar process for existing on-premise Oracle EPM (Hyperion) customers who currently use manual data validation tools and processes.

Need help figuring this out at your organization?  Contact Performance Architects at sales@performancearchitects.com and we’d be glad to help!

