
EPM Cloud Update: TRCS Stealth release!


And Oracle does another stealth release with the December 2016 EPM Cloud update. FCCS and EPBCS customers noticed a new option this week when creating an application: Tax Reporting!

trcs-stealth-release

After inquiring about this to the Oracle Product Management team, it looks like TRCS (Tax Reporting Cloud Service) joined the EPBCS/FCCS code line and became available to existing customer Test environments after last weekend’s December EPM Cloud update. Existing customers can start playing with it immediately. It should hit Prod environments the night of December 16, 2016.

New customers will see it available at cloud.oracle.com in the new year. I expect that the TRCS pricing will follow the EPBCS/FCCS licensing model, at a retail price of $250/user/month (minimum 10 users), but this is also to be confirmed once the product is formally announced to new customers.


Filed under: EPM Cloud Update Tagged: EPBCS, EPM, EPM Cloud, FCCS, Oracle, Oracle EPM, TRCS

Allocations Always Equal 100


I’ve been writing and speaking about allocations inside of Hyperion Planning and Essbase for several years. Inevitably on a project, I’m asked to write an allocation script that requires a planner to enter percentages that must total 100%. The scripting is easy, but what happens when the entered values don’t equal 100% and you either under- or over-allocate your expenses?

Traditionally I’ve used web forms and data validation to try to enforce compliance, but I kept thinking to myself: there must be another way to get to the same result. There is, and I’m here to share it with you.

Let’s look at an example. Suppose I’m subdividing expenses out to various cost centers or entities by percentage. Say IT services gets moved down to the profit center level: 15% goes to Retail, 50% goes to Wholesale, and another 30% goes to Direct. Is that 100%? No, it’s 95%. So what happened to the missing 5%?

My default response was to set up a web form with a rollup and a validation that showed visually whether you had reached the desired 100%. If you hadn’t, the cell would show up red or yellow.

This works great for a small dataset, but when you have many intersections, it becomes difficult to scroll down to see the validation total.

Instead of assuming that the red cell would alert everyone to the need to correct the total, why not force a calculation to true up the percentages so they always equal 100%? Pure, simplistic genius.

In this example, I have three cost pools that need to receive their share of Admin and Marketing expenses. This is only step one in the process. I’m first dividing my high-level Admin and Marketing expense into three high-level cost centers: Retail, Wholesale, and Direct. I’ll further subdivide the expenses by net sales later, but since a total percentage of net sales may give a misleading figure in the allocation, we wanted the flexibility to assign a percentage to each high-level cost center first. For example, Wholesale may have three times the dollar amount of sales compared to Direct, but utilize far fewer admin and marketing dollars to make those sales happen.

Let’s leverage my form and expand on it. I added an additional member in the Account dimension called AdminMarketingAllocation_TrueUp that is dynamically calculated off its stored sibling.

With a simple member formula, I can ensure that my percentage allocations always equal 100%. See my IF statement below. The basic tenet is that I need to divide each input by the total generated by the inputs. If I’m at a perfect 100%, my values don’t change. But if I’m above or below, my values are adjusted so that they always total 100%.
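The logic of that member formula boils down to dividing each entry by the running total. Here is a sketch of it in Python (the function and pool names are illustrative, not the actual Essbase formula):

```python
# Sketch of the AdminMarketingAllocation_TrueUp logic: divide each entered
# percentage by the total of all entries, so trued-up values sum to 100%.
# This is illustrative Python, not the actual Essbase member formula.

def true_up(entered_pcts):
    """Normalize entered allocation percentages so they total 100%."""
    total = sum(entered_pcts.values())
    if total == 0:
        # No inputs yet; nothing to true up (the real formula would need
        # a #MISSING / zero guard here as well).
        return dict(entered_pcts)
    return {pool: pct / total * 100 for pool, pct in entered_pcts.items()}

# The under-allocated example from the text: 15 + 50 + 30 = 95, not 100.
inputs = {"Retail": 15, "Wholesale": 50, "Direct": 30}
print(true_up(inputs))
```

At a perfect 100%, each value divided by the total leaves the inputs unchanged; any other total scales the entries proportionally.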

The downside to this particular formula is that it will need to be updated if you add additional cost pools. I also did not add a default exception for #MISSING or zero values, but you could layer that in as well if you wanted to establish a standard costing.

Returning to my form, I layered in the additional account, where I can true up the percentages. In this example, I have overallocated by 10%; if the allocation ruleset were run referencing those values, it would generate 10% additional expenses. Instead, the true-up derives the percentage of each value relative to the total. My 50% takes a roughly 5% reduction to be in line with the total, now equaling about 45%. My other values are adjusted in tandem as well.

All that’s left to do is update your Business Rules or calc scripts to reference the new account that you created.

With this simple trick, you can save your user community (and yourself) a lot of headaches in the future, and you never have to worry about over- or under-allocating your expenses again.


The post Allocations Always Equal 100 appeared first on TopDown Consulting Blog.

TRCS First Look: Creating an Application


Following the awesome release of Tax Reporting Cloud Service (TRCS) last weekend to existing EPBCS/FCCS customers, I had to (of course) go in and play with this new cloud technology. Not being a previous Hyperion Tax Provisioning user, I decided to start simple and create a new application to get the ball rolling.

Caveat: as this is the first public release of this technology, I would imagine that the screen shots below will be out of date by the time the product is formally released to new customers in 2017.

First things first…if there is no existing application, you need to create one by selecting Tax Reporting from the startup screen. Unfortunately, no sample application option is provided yet.

trcs-stealth-release

Then, as is the normal protocol for this code line, name your application and put in a (required) description. The application name must be 8 characters or less, as this product has an Essbase back end.

trcs-creating-an-app-01

At this point in time, there are very few options for the initial application details. I see that you can choose your year start and end. In addition, you can determine if you want to enable multi-currency. As there is no documentation available publicly for this product yet, it’s uncertain how multi-currency works here. I would guess that it’s similar to how the EPBCS multi-currency feature works.

trcs-creating-an-app-02

Once the initial details are selected, a review screen (and most importantly, a Back button) are available before you press Create.

trcs-creating-an-app-03

Once you press Create, application creation starts.

trcs-creating-an-app-04

I don’t recall this process taking overly long – probably less than 10 minutes.

trcs-creating-an-app-05

Once the application was created, I was taken to the home screen. A variety of brightly colored cards and clusters were available for my perusal. Naturally, my curiosity led me to selecting the unfamiliar ones.

trcs-creating-an-app-06

The Operations cluster offers the following cards:

trcs-creating-an-app-07

Here is the Tax Provision National cluster:

trcs-creating-an-app-08

The Tax Provision Regional cluster:

trcs-creating-an-app-09

Total Tax took me to a dashboard (empty, as I have no data loaded yet). Yay! Pre-built content!

trcs-creating-an-app-10

The CbCR card takes me to a different dashboard (also empty):

trcs-creating-an-app-11

Approvals looks and feels like the EPBCS/FCCS approvals feature:

trcs-creating-an-app-12

Library is a card that I haven’t seen in many Oracle EPM Cloud products. This is what it displays out of the box:

trcs-creating-an-app-13

The Application cluster has a bunch of similar sounding card options to EPBCS and FCCS:

trcs-creating-an-app-14

And finally, the Tools cluster also looks and feels like EPBCS and FCCS:

trcs-creating-an-app-15


Filed under: EPM Cloud Update Tagged: EPBCS, EPM, EPM Cloud, FCCS, Oracle, Oracle EPM, TRCS

Oracle Profitability and Cost Management Cloud


Oracle announced the release of Profitability & Cost Management Cloud Service (PCMCS). PCMCS is a cloud-based SaaS offering that enables business users to gain insight into hidden profit and cost across key business dimensions—such as products, customers, sales channels, and more—and take action to improve profitability and lower costs. Oracle Profitability and Cost Management Cloud provides actionable insight into allocation-based business processes by seamlessly combining data from the general ledger and other financial systems with data from operational systems. The solution provides the transparency needed to support analysis within today’s complex enterprises.

Rapidly Build Profit and Cost Models
Gain laser focus on profitability and costs across your business. With Oracle’s solutions for profitability and cost management, it’s never been easier—or more powerful.

  • Perform faster, easier, multidimensional analysis and scenario modeling
  • Leverage industry-leading technology to create, maintain, and deploy business models
  • Drive rapid application design with a business rules engine and intuitive user interface
  • Get transparency into cost and revenue allocations via traceability maps

AdvancedEPM is very excited about the intuitive cloud interface that makes it easy to analyze costs and profit. With PCMCS you can rapidly build, maintain, change, and analyze allocations. With useful built-in and ad hoc reporting, business users can model opportunities by changing assumptions.

If your organization currently uses Oracle Hyperion Profitability and Cost Management, and you have built your model using the management ledger style, it is easy to migrate to the cloud solution.

Contact AdvancedEPM Consulting today for more information on getting started, or migrating to, Profitability & Cost Management Cloud Service.

The post Oracle Profitability and Cost Management Cloud appeared first on AdvancedEPM Consultants.

Accelerate Your Ride to the Cloud: Extending ERP with Oracle Profitability & Cost Management Cloud Service (PCMCS) for Standard Cost Rate Development


A common need among manufacturing organizations is improvement in the process of developing annual labor and overhead standards to use as input into standard cost rates for product cost and inventory valuation. In spite of the investments that have been made in ERP solutions, an offline Excel-based exercise is typically required to take historical data from the ERP and determine the updated direct labor rate and overhead rate components of a product standard cost for an upcoming fiscal year. The release of Oracle Profitability and Cost Management Cloud Service (PCMCS) in October 2016 provides a unique opportunity for manufacturers to ease, streamline, and document the process of generating the cost-per-direct-labor-hour or cost-per-machine-hour rates that are requisite in standard costing.

Background

Generally accepted accounting principles (GAAP) allow a manufacturer to value inventory using one of several methods: Last-In, First-Out (LIFO); First-In, First-Out (FIFO); or a weighted average.

Because prices for labor and materials fluctuate throughout a year and inventory is built or drawn, it is difficult to track inventory on an ongoing basis using these methods. Further, from a management perspective, it is more meaningful to separate the effects of price changes and inventory builds/draws from values associated with normal business. Pricing decisions, incentive compensation, and matching expenses to the physical flow of goods would all be adversely impacted by trying to constantly manage to these methods.

A common approach to achieve meaningful inventory and cost of goods sold values is to establish a “standard cost” for every product and then adjust the value of inventory on a separate line at year-end, to bring it to the GAAP basis.

This standard cost requires direct labor, direct material, and an amount representing the “absorption” of certain plant-related overhead costs into the inventory value.

There are two forms of overhead that must be included in the inventory value from a GAAP perspective: 1) Labor overhead and 2) Manufacturing overhead, sometimes called Indirect Overhead.

  1. Labor overhead represents the costs of direct labor resources above and beyond their direct hourly wage rate. This amount includes payroll taxes, retirement and health care benefits, workers’ compensation, life insurance and other fringe benefits.
  2. Manufacturing overhead includes a grouping of costs that are related to the sustainment of the manufacturing process, but are not directly consumed or incurred with each unit of production. Examples of these costs include:
  • Materials handling
  • Equipment Set-up
  • Inspection and Quality Assurance
  • Production Equipment Maintenance and Repair
  • Depreciation on manufacturing equipment and facilities
  • Insurance and property taxes on manufacturing facilities
  • Utilities such as electricity, natural gas, water, and sewer required for operating the manufacturing facilities
  • The factory management team

The most common first step for determining the value of overheads in inventory is to use a predetermined rate that represents a cost charge per direct labor hour or cost per machine hour. From product bills of material and routings, the total number of hours of labor or machine usage for a unit volume of production is known. The overhead cost rate per direct labor hour (or machine hour), multiplied by the number of hours required per unit of production, yields the overhead cost rate per unit. In the example below, the ERP will calculate the cost per work center, but it is reliant on the direct labor and overhead rates to complete this process.
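The rate arithmetic described above can be sketched with illustrative numbers (the cost-pool and hour figures below are assumptions, not taken from the example):

```python
# Overhead absorption arithmetic: a predetermined overhead rate per direct
# labor hour, times the labor hours per unit from the product routing,
# yields the overhead cost per unit. All numbers are illustrative.

overhead_cost_pool = 500_000.0   # annual overhead assigned to a work center
planned_labor_hours = 25_000.0   # annual direct labor hours for that center

overhead_rate = overhead_cost_pool / planned_labor_hours  # $ per labor hour

hours_per_unit = 0.5             # labor hours per unit, from the routing
overhead_per_unit = overhead_rate * hours_per_unit

print(overhead_rate, overhead_per_unit)
```

The hard part, as the text notes, is not this multiplication but deriving the predetermined rate per cost or work center in the first place.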

dp-image-1jpg

The challenge comes when calculating the applicable pre-determined rate for overhead per direct labor hour or machine hour by the applicable cost or work center. PCMCS can assist with automating and updating this process.

A Better Solution: The Ranzal PCMCS Standard Cost Solution

PCMCS provides the ability to quickly and flexibly put the creation of multi-step allocation processes into the hands of business users. It also provides for the management of hierarchies without the need for external dimension management applications as well as standard file templates for data upload.  Further, a series of standard dashboard and report visuals augment the viewing and monitoring of results.  These capabilities allow organizations to quickly load and allocate expenses to applicable overhead cost pools and then merge those cost pools with applicable labor or machine hour values to obtain the relevant overhead rates.

PCMCS allows users to quickly select the cost centers or work centers that are applicable as sources to be included in the overhead rate:

dp-image-2jpg

Users then can easily select the targets for collecting these costs into relevant pools,

dp-image-3

as well as the operational metric to use to assign these overhead costs to their applicable pools.

dp-image-4


Edgewater Ranzal is the leading implementation services provider of Oracle and Hyperion EPM solutions and has extensive experience with Hyperion Profitability and Cost Management (HPCM). Following the release of PCMCS, Ranzal will be announcing a Cloud service offering that leverages the power of the Cloud to provide an accelerated method of producing the required inputs for overhead allocation in standard costing.

More than just Standard Costing

Additionally, while PCMCS provides an excellent way to develop overhead rates for standard costing, it can simultaneously be utilized to determine allocations and costing valuations that leverage other methodologies for product and customer costing and profitability. Much has been written about the potential for inaccuracies if the standard cost basis of overhead allocation in product costing were used universally or exclusively for management analysis. Overhead has become such a large portion of total cost that, in many cases, overhead rates can be three or four times higher than their respective direct labor rates. This suggests a general lack of causality between overhead and direct labor hours, and it has led to the evolution of other methods for costing. Activity-Based Costing is one such example, while simply allocating manufacturing variances to product lines is another.

PCMCS can be used to meet the requirements for both the externally reported methods and the management methods of product costing.

All of the Results in One Place

Determining the method by which overhead should be captured in the cost of different products in inventory is an important process because it moves a large number of dollars from an expense to an asset, usually temporarily but sometimes permanently, and this can impact profitability and share price.

For the purpose of valuing inventory for statutory reporting, the overhead rate method is considered acceptable and is widely used. It is therefore important that organizations find a way to develop and manage these cost valuations in a manner that is well documented, has a transparent methodology, and reduces the amount of time spent on the process. However, it is not the only method that should be used for considering overhead in product and customer costing and profitability analysis. Further, selling, general and administrative expense (SG&A) represents another layer of cost that, while not part of standard inventory cost, should be considered in overall product costs from a management perspective.

To this end, the Edgewater Ranzal PCMCS Standard Cost solution will provide an opportunity to fulfill multiple needs in costing and profitability and will do so in a manner that will be faster and more user-friendly than what has previously been experienced.


Vess + ODI to extract Essbase metadata

Well, apparently it’s Friday Fun Day over here in Seattle. I was going to head up to the mountains for the weekend but plans changed. So how about a little frankendriver Vess update with some ODI goodness thrown in? Vess has some really interesting data integration possibilities that I’ve mentioned before, one of which is being […]

MaxL Grammar Changes

Oh my, it has been a while since my last post .. and I feel that there may be more than one post in addition to this one.

So recently I was working with a Planning application that SPARC'ed me to write this post ;)

Not sure how many know, but in version 11.1.2.4 there was a change made to the MaxL grammar that now allows for the export of data to be made anonymous!

The MaxL export data statement includes grammar you can use to make exported data anonymous, wherein real data is replaced with generated values. This removes the risk of sensitive data disclosure, and can be used in case a model needs to be provided to technical support for reproduction of certain issues.

Here is the 11.1.2.4 Technical reference: link

ASO/BSO keyword:
export database <dbs-name> ... data anonymous

Description: Export data in anonymized format. Anonymization removes the risk of sensitive data disclosure, and can be used in case sample data needs to be provided for technical support. Essbase replaces real data values with 1, for each value in the block.
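As a rough sketch of the behavior described above (illustrative Python, not Oracle’s implementation):

```python
# Sketch of the anonymized-export behavior described above: every real
# data value in a block is replaced with 1, preserving the shape and
# sparsity of the data but not its values. Not Oracle's implementation.

MISSING = None  # stand-in for #MISSING cells

def anonymize_block(block):
    """Replace every non-missing value with 1, leaving #MISSING alone."""
    return [1 if v is not MISSING else MISSING for v in block]

print(anonymize_block([102.5, MISSING, -3.2, 0.0]))
```

The point is that the exported model still reproduces structure-dependent issues for support, without disclosing the real numbers.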

These are the railroad diagrams for ASO and BSO. Enjoy exporting data for analysis, support, performance testing, etc.



QUERYRESULTLIMIT and its journey to-date

This setting started to appear in version 11.1.2.4.007 and has been making an appearance in a few releases, PSUs, or both, for Essbase Server and APS. Let's take a look.

Oracle Essbase - Release 11.1.2.4.000 Patch Set Update (PSU): 11.1.2.4.007
Defects Fixed in this Patch:

21881863 - Running large MDX queries against an aggregate storage database can result in running out of memory. In this case, use this essbase.cfg setting:

Name of setting: QUERYRESULTLIMIT

Syntax: QUERYRESULTLIMIT [appname [dbname]] n

Where n is an integer value specifying the maximum number of query result cells.

See the "QUERYRESULTLIMIT Configuration Setting" topic in the "Documentation Updates in this Patch" section in this Readme.

>>>
Documentation Updates in this Patch

QUERYRESULTLIMIT Configuration Setting

Sets the maximum number of cells returned by an MDX query. This configuration setting applies to block storage, aggregate storage and hybrid aggregation databases.

Syntax

QUERYRESULTLIMIT [appname [dbname]] n
  • appname—Optional. Applies the query result limit to the application specified. If you specify appname, you must also specify a value for n, or Essbase Server ignores QUERYRESULTLIMIT. If you do not specify an application, you cannot specify a database, and the query result limit applies to all applications and databases on the server. If you specify a value for appname and do not specify a value for dbname, the query result limit applies to all databases in the specified application.
  • dbname—Optional. Must be used with appname and n, or the server ignores QUERYRESULTLIMIT. If you specify dbname, appname, and n, the query result limit is applied only to the specified database.
  • n—Integer value of n specifies the number of query result cells that the server allows a query to return. You must specify this parameter or the server ignores QUERYRESULTLIMIT. If you do not specify appname or dbname, the query result limit applies to the entire server.
Description

QUERYRESULTLIMIT specifies the maximum number of result cells that an MDX query can retrieve before Essbase Server terminates that query. You can apply this setting to an entire server, to all the databases in a single application, or to a single database.

If no limit is defined in essbase.cfg, there is no results limit.

When the number of returned cells for a query exceeds the result limit, an error message is returned.

Use QUERYRESULTLIMIT to limit the result volume of MDX queries and prevent a query from freezing when a very large number of result cells are returned.

Examples

QUERYRESULTLIMIT Sample Basic 100000

Sets 100,000 cells as the maximum number of results cells returned in a query to the Basic database for the Sample application.

QUERYRESULTLIMIT 150000

Sets 150,000 cells as the maximum number of cells that a query can return before being terminated. The query result limit applies to all applications and databases on Essbase Server that correspond to the essbase.cfg file containing this setting.

>>>

Oracle Hyperion Provider Services - Release 11.1.2.4.000 Patch Set Update (PSU): 11.1.2.4.008

Defects Fixed in this Patch

22822213 - Provider Services support for Essbase Server update for QUERYRESULTLIMIT configuration setting in the essbase.cfg file.

See the Essbase 11.1.2.3.508_22314799 PSE Readme for more information.

22976584 - Essbase properties service.olap.dataQuery.grid.maxRows and service.olap.dataQuery.grid.maxColumns in the essbase.properties file are deprecated in this release. The results of the grid are now controlled by the QUERYRESULTLIMIT configuration setting in the essbase.cfg file on Essbase Server.

See the "Deprecated Essbase Properties in the essbase.properties File" topic in the "Documentation Updates in this Patch" section in this Readme.

>>>

Documentation Updates in this Patch

Deprecated Essbase Properties in the essbase.properties File

Essbase properties service.olap.dataQuery.grid.maxRows and service.olap.dataQuery.grid.maxColumns in the essbase.properties file are deprecated.

These properties, if defined in essbase.properties, do not have any effect on the grid result.

The results of the grid are controlled by the QUERYRESULTLIMIT configuration setting in the essbase.cfg file on Essbase Server.

With these changes, existing use cases that expect an error for the previous lower row and column limits set in JAPI will not get an error unless the QUERYRESULTLIMIT limit is exceeded.

With these changes, Provider Services JAPI does not apply any limits against a previous version of Essbase. Essbase has to be upgraded to use the QUERYRESULTLIMIT configuration setting.

>>>


Oracle Essbase - Release 11.1.2.4.000 Patch Set Update (PSU): 11.1.2.4.009

Known Issues in this Patch

QUERYRESULTLIMIT configuration setting does not honor all values in this patch.

This version of the QUERYRESULTLIMIT configuration setting documentation replaces the content in the Essbase 11.1.2.4.008 Readme. The default value is 1,000,000 and can be increased to 100,000,000 but no other value will be honored.


Oracle Essbase - Release 11.1.2.4.000 Patch Set Update (PSU): 11.1.2.4.010

Defects Fixed in this Patch:

22953962, 22999617 - QUERYRESULTLIMIT setting has an upper limit of 2^31 and when the limit is set to a greater value it is treated as the value 0.

22863123, 22861985 - When using the QUERYRESULTLIMIT setting, pivoting a member from column to row, or using a particular spreadsheet layout, can result in a terminated query even while staying within the limit.


Patch Set Updates for Hyperion Essbase 11.1.2.4.011 – NONE


Patch Set Updates for Hyperion Essbase 11.1.2.4.012 – NONE


Oracle Hyperion Provider Services - Release 11.1.2.4.000 Patch Set Update (PSU): 11.1.2.4.013

Defects Fixed in this Patch:

23666602, 23149351 - Drill Through reports can exceed the limit of the QueryResultLimit Function and an error message is returned:

Error executing report \'EEGL'\'DT_REports'\'EEPL' in the Essbase Studio, message: Runtime error. Line =671.


Patch Set Update for Oracle Hyperion Essbase 11.1.2.4.014 – NONE

*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
 
AND, with all of that I think that there might be a few other points to note.
  • Default value is 1000000
  • The maximum value of 2^31 is equal to 2147483648 (or 2,147,483,648) that is over 2 billion, why?! Really, WHY?!?! What do you do with that many cells of data in Excel anyway?! Please don't share as I already know the answer, it starts with PI and ends with VOT table, please stop the insanity!
  • QUERYRESULTLIMIT does NOT have an 'unlimited' setting
  • Make sure that your Smart View client is updated to version 11.1.2.5.610 :)
  • 'unpublished bug' - huh?!? BUG 16005347 (MULTIPLE MEMBER SELECTION- ZOOM IN CAUSES "ESSBASE ERROR (1013295)" IN SMARTVIEW [UNPUBLISHED & INTERNAL]) ... but glad to hear it was fixed here with
    • Smart View v11.1.2.5.610, Patch 24711736
 
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*


All Base Dimensions are NOT Required in Smart View



You read the blog post title correctly.

Yes, if you didn't know or just were not aware, all of the base dimensions of an Essbase cube are NOT required on the Smart View grid. Let's take a look ..


Sample Basic: Base Dimensions {Attributes}

Application: Sample
Database: Basic
  • Year
  • Measures
  • Product {Caffeinated, Intro Date, Ounces, Pkg Type}
  • Market {Population}
  • Scenario
  • Caffeinated Attribute
  • Ounces Attribute
  • Pkg Type Attribute
  • Population Attribute
  • Intro Date Attribute


Smart View Grid WITH all Base Dimensions:




Smart View Grid WITHOUT all of the Base Dimensions:


Population; Not Market



Ounces; Not Product



Keep in mind that this tip applies to Essbase/Planning only when attribute dimensions exist. Otherwise, yes, you do need all of the base dimensions on the Smart View reporting grid to make it a valid reporting grid.






Passing 300


It all began one summer

It seems so long ago (2,770 days or 396 weeks or 91 months or 7.6 years – but who’s counting?) that I first put pen to paper – Yes, I did.  Really.  I’ve now moved on to word processors for drafts and am thus so 21st century. – and started this blog.  And why the (re)counting?  Because this little corner of EPM inanity has hit 300 posts.  That’s an average of 39 posts of Stupid Programming Tricks, Compleat Idiot, Stupid Shared Services Tricks, Stupid Planning Tricks, and other sundry bits of EPM frivolity per year.  I pity you for reading this dreck.  Come to think of it, I pity myself for writing it at such a pace but on balance I think I feel worse for you.  

But it is a landmark of sorts and an opportunity to reflect on why this blog continues when so many contemporaneously launched blogs are moribund or nearly so.



So yes, 300 posts and yet some of you are still here.  Why?

Don’t know much about Essbase/PBCS/Planning/FDMEE/etc.

I seem to be forever chasing Oracle’s EPM seemingly ever-expanding products – how do I do X, how did someone else do Y (and how can I “borrow” their approach), why doesn’t that !@#$ing Z work?  Some of my fellow EPM practitioners seem to glide from tool to tool and solution to solution with nary a show of effort (Glenn, Celvin, TimG, TimT, Dino, and Pete I’m looking at each and every one of you.  With envy.).  I assure you that yr. most hmbl. & obt. svt never, ever, ever gets from A to B without a fair amount of pain.  Solving the problem is always fun, staring at it (best of course when in front of other people, the more senior the better) in complete incomprehension not so much.

So are you this?

Or this?

Everything I've Got Belongs To You

There are the greats in this industry – any industry really – and then there are the rest of us.  Is that so bad?  We’re not the smartest guys in the room but at least we get to be in the room.  Yes, I think I just insulted every one of you, Gentle Readers, but my point is that this blog’s primary purpose is to help you and me get from A to B.  Maybe the fact that you read work-related blogs (obv. not just this one), read EPM books, follow EPM geeks on Twitter, and read and post on messageboards means that in fact you’re amongst the smart set.  Surely the smart ones use resources to solve their problems; surely the dumb ones don’t.  See?  I just rescued myself from having exactly zero readers.  Hopefully.

All kidding aside, this blog as it exists today would be pointless without you.  Thank you for putting up with what has been described as an idiosyncratic (read:  long winded with detours into obscurity) approach.  I hope you take the time to click on all of my laboriously-gathered links.  Goal one of this blog:  make you better EPM geeks.  Goal two of this blog:  make you all wish it was 1967 aka peak American popular culture as it’s a giant wasteland after that.  Let’s turn the clock back.  At least you’ll appreciate what your parents or grandparents (or in some cases great-grandparents) grooved to.

I’ve got your number

Google (Blogger and Google Analytics) is funny and by funny I mean inconsistent.

Here’s Blogger’s numbers:

Huzzah!  I’m closing in on a million page hits.

And then there’s Google Analytics:
Not-huzzah because it’s telling me that I’m closing in on half a million page views.  

It’s a riddle

A couple of interesting notes about the above:
  1. People don’t read this blog around Christmas.  Not a huge surprise there.
  2. My readership is going – slowly – down.  Why?

For the first, it’s nice to know that people have lives.

As for the decline (and it is real, alas), I think it is based on two things: the number of posts per year (I hit my high in 2014 with 52 posts, versus 40 the year after – less new content = fewer readers) and competition from other blogs as well as Twitter and other social media.  I haven’t tried to count the number of EPM-related blogs extant today but it surely has to be about 50.  When I started out the number was more like 10, although as noted most of those are dead, dead, dead.  YouTube, Facebook, and Twitter are yet more avenues for those who want to learn.

Or this blog sucks and is getting worse all the time.  You decide.

Why shouldn’t I

I like to think that actually the blog is getting better.  I’ve purposely hit on a combination of series posts such as the Compleat Idiot series on Planning in the cloud, Stupid Programming Tricks for unrelated Essbase, Planning, whatever-they-are tips and tricks, and community outreach posts such as live (sort of) blogging of Kscope, OpenWorld, and now meetups.

You may have noticed that I’ve switched to a longer and more in depth approach in my Compleat Idiot cloud series.  There’s an awful lot to learn about Oracle’s cloud products.  Lots of innovation, yes, but also lots of work learning the tools and then keeping up with them.  I can’t think of how to do this except through this detailed way as so much innovation is coming out of the movement to the cloud.  Love the cloud or loathe it, money is being poured into the products in a way that simply hasn’t existed before.  That means the products change and expand constantly, and that likely means the Compleat Idiot series won’t stop growing either.  That also means my life won’t get a lot better because some of these posts are over 50 pages when written in Word.  Ouch for both you in the reading and me in the writing.

While solutions to problems are what we’re all after, there is more to life and a career than code.  I’ve used this blog as a soapbox to encourage you in the strongest terms to get involved with our little community.  As an example, my involvement with ODTUG has utterly transformed my professional and personal life.  If it happened to me, it can happen to you.  Grasp the ring.  Reach.  Blow your horn.

Where I can, I’ve tried to also impart what little wisdom I’ve picked up in 20+ years of consulting in a 25+ year EPM so-called career.  Sometimes I shake my head at the folly of others when it comes to solutions (hubristically complex), code (ugly, hardcoded, slow, wrong – sometimes all four at once), and even social interactions (Is there anyone more awkward than a geek?  Thought not.) and then realize that I almost certainly did the same thing at one point or another.  Smart people learn from others’ mistakes.  Think of this as a plea to be smart and occasionally listen to me as I’ve made every mistake there is.  

The other bit of advice I’d give you is don’t be afraid to be a contrarian.  That of course doesn’t mean you’re always right, but reflect on why people say what they say.  Is a technical recommendation for the good of the customer or for the benefit of the speaker?  Is product X the solution that everyone follows because a vendor is pushing it, or would some other simpler and cheaper approach work just as well?  

In a word:
 

Try to See It My Way

Have I fulfilled this blog’s mission?  Here’s what I wrote on 10 May 2009:
What about the “hacking” in the name of this blog? Hacking can mean all sorts of bad things and that’s what villains do. Good hackers are more interested in taking an ordinary tool (but so cool) and doing out of the ordinary things in a geek chic way.

To that end, I’m going to try to share with you some of the dumb things I’ve done and how you don’t have to do them, how to make Essbase do things it “can’t” do, and generally make Essbase dance.

Lastly and most importantly, I’ll also share code/techniques/approaches. I welcome your comments (constructive please, I have an average ego and it is bruised when pummeled) and most of all your suggestions for improvements. I’ve never written a piece of code that hasn’t been improved through examination by a fresh set of eyes and as a consultant if I can’t fix where I wrote it, I’ll make it better next time.

And, despite the title of this web site, I won’t limit the scope of my postings to Essbase. I’ll include anything else that touches Essbase, from Planning to Dodeca, to who knows what.

That, for good or ill, is pretty much what this blog is all about.  Through the passage of time I’ve forgotten about “geek chic” and shall henceforth casually drop it into conversation.

All kidding aside, I’ve tried very hard to live up to my vision of education and outreach and I think on balance I’ve managed to do it.

Watch what happens

So where does this blog go from here?  Will there be another 300?  Will I lose my ever-lovin’ mind and actually do this again?  Maybe.

So long as I’m involved in this little industry, I feel I have no choice but to keep learning.  Whether that’s through this blog, speaking at conferences, writing books, or in some other completely-monetarily-uncompensated form, I’ll keep on learning and sharing.  One day, hopefully not too (actually, yes, hopefully given what that entails) long from now, I’ll retire and this blog will come to an end.  I’m not dead yet and I’ve got a lot of livin’ to do so expect more of Cameron in one form or another.

Because of you

So yes, this blog exists because I use it as a mechanism to teach myself but making it public with a readership that rounds down to zero would be pointless.  Thank you for your support, your comments and corrections, and your continued readership.

Call me

Want to see a topic?  Have a question (hopefully) answered?  You can reach me care of this blog or via Twitter or via LinkedIn or reach out to me in person at meetups, Kscope, and Open World.

An undocumented Update in the FCCS Dec-16 release

The other day, I blogged about the changes in the Dec-16 release for FCCS.  As it turns out, there is a bug that was fixed (well, partially fixed) that people might want to know about. It relates to reporting currencies. First, a little history. Back in August, I submitted an SR which turned into a bug.  The issue was with the currency dimension, specifically when you create a reporting currency.

To do that, I went into the currency dimension and simply checked the reporting currency box for the currency I wanted.


The next time the Essbase database was refreshed, I had a new reporting currency. Nifty. The member name for the new reporting currency is CAD_Reporting; no problem. The alias for the member, however, is "CAD", which is the same as the member name of the input currency. If you tried to retrieve data in Smart View with aliases turned on, an error popped up saying an ambiguous member name could not be resolved.  
 
I know what you are saying (I can read your mind): just remove the alias from CAD_Reporting and all will be good.  Nice idea, but unfortunately, we can't edit the aliases for reporting currencies.
 
In the Dec-16 release, although it is not documented, the issue is partially resolved. Now when you retrieve from Smart View you will not get an error about the name. If I retrieve the following sheet (of course, my POV is set to retrieve data properly) I get:


 
Looks good. If I switch to aliases (or just type in CAD_Reporting while aliases are turned on) I get:
 
 
No error, but now which is which? Both say CAD, but the one on the right is really the alias for CAD_Reporting and the one on the left is the member name for the CAD input currency.   So at least we no longer get an error retrieving the data, but it will be a bit confusing for users. They really have to type in "CAD_Reporting" and not the alias CAD, even when they have aliases turned on.  It should not be too bad until Oracle completely fixes it, as the input currencies are not really used much, but expect to get questions about why what appears to be the same intersection provides different results.
 

On-Premise to Cloud FDMEE integration


At a recent engagement, we needed to load Actuals from a Hyperion Financial Management (HFM) application to a new Oracle Planning and Budgeting Cloud Service (PBCS) application. Oracle had recently released a new PSU for FDMEE (11.1.2.4 PSU 200), which introduced some exciting new features, including the ability to add a Cloud application as a target application. Perfect timing for us, since otherwise we would have had to go with a file-based approach.

This Hybrid Implementation Support feature enabled us to load data directly from HFM to PBCS using Data Synchronization, another neat feature introduced in the 11.1.2.4 release.

In addition to this scenario, these two features now enable new data integration possibilities, such as:

  • Other On-Premise Applications to Cloud Data Synchronizations (PBCS, EPBCS, FCCS)
  • Cloud applications to On-Premise Application Data Synchronizations
  • Cloud Data Export to Custom Application (file)
  • On-Premise ERP, Data Warehouse, Database or File to Cloud Applications.
  • Cloud Application to On-Premise ERP writeback

Configuring Cloud support

To enable a secure communication channel with the cloud, the PBCS Secure Sockets Layer (SSL) certificate needs to be added to the on-premise FDMEE WebLogic server.

The configuration steps are detailed in the FDMEE Administration Guide 11.1.2.4.200, Deploying Hybrid EPM Applications section,  found online at: http://docs.oracle.com/cd/E57185_01/nav/datamanagement.htm

Adding a new Cloud Target Application

Once the configuration is done and the FDMEE service is restarted, a new target application option becomes available.

You first enter the connection details to your PBCS pod. We experienced a few snags with this initial connection, mainly because the SSL configuration on the FDMEE server had not been done correctly at first. Another undocumented issue we ran into was that the URL needed a trailing forward slash.
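The trailing-slash quirk is easy to guard against anywhere you handle the pod URL in a script. A minimal sketch (the function name and the example hostname are ours, not part of FDMEE):

```python
def normalize_pbcs_url(url):
    """Ensure the PBCS pod URL ends with the trailing slash FDMEE expects."""
    return url if url.endswith("/") else url + "/"

# A hypothetical pod URL, with and without the slash, normalizes the same way:
print(normalize_pbcs_url("https://planning-mycompany.pbcs.us2.oraclecloud.com"))
print(normalize_pbcs_url("https://planning-mycompany.pbcs.us2.oraclecloud.com/"))
```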

image3

Once these issues were resolved, the connection worked and we were able to successfully add the new target application.

image4

Notice the Deployment Mode!

Setting up Data Synchronization

Once the new target application is created, FDMEE treats it as any other EPM application and the setup is identical to setting up an on-premise application, such as adding import formats, locations, and data load rules.

Import Format

The first thing to configure, as with any new FDMEE data source, is a new Import Format.

image5

Select the Source Type as EPM and select your source on-premise application. Select Target Type as EPM and select the new Target Cloud Application.  Once you click Save you then need to map the Source dimensions to the relevant Target dimensions.  Note that all dimensions of your PBCS application are displayed. However, you only need to map the dimensions that are used in your target Plan Type. You select your target plan type when creating the data load rule.

Location

Now go ahead and add a new Location and select to use the newly created Import Format.

Data Load Rule

Now navigate to the new Location using the POV toolbar, select the Data Load Rule task and create a new data load rule.

image6

This is where you select your Target Plan Type.

As this is not a file based load rule we need to define the source query, in this case, what data we want to pull from our HFM application.

On the Source Options tab, you need to add Source Filters for all the dimensions of your source application. You can use the Select buttons for each dimension to define the member or member function to use, such as selecting all base members under a parent.

Mappings

The last step we need to do is to add maps for all our relevant target dimensions following the usual FDMEE mapping logic as described in a couple of our previous blogs.

Congratulations, you are now ready to start using this Location and Data Load Rule to extract data from your on-premise HFM application and load it to the cloud.

 

 

 

The post On-Premise to Cloud FDMEE integration appeared first on TopDown Consulting Blog.

Replicating On-Premise Essbase Load Rules in Oracle Planning and Budgeting Cloud Service (PBCS) Using Data Management

$
0
0

Authors: Mohan Chanila and Tyler Feddersen, Performance Architects

One of the perceived issues with the release of Oracle Planning and Budgeting Cloud Service (PBCS) in 2014 was the lack of full-use Essbase in the cloud.  With this, the ability to use load rules to modify, edit, and prepare a data file to upload into the cloud was taken away, and the PBCS interface did not provide the ability to tweak and modify a data file. However, “Data Management,” sometimes referred to as “Financial Data Quality Management (FDMEE) for the cloud” is included in your PBCS subscription, and can be used to replicate load rule functionality.

This blog entry highlights three examples where you can easily replicate on-premise Essbase load rule functionality using the power of Data Management in PBCS: merging data fields; prefixing fields in a data file to match a PBCS dimension; and mapping fields in the data file to certain dimension members:

  1. Merging Data Fields

A common task performed in Essbase load rules is to take a non-formatted data file and merge or concatenate two or more of its columns to produce a PBCS field.

In Data Management, this task can be performed in the setup window using the “Import Format” option. The sample window looks like this:

mohan-1

Choose “File” as your source and the EPM application and plan type to load the data into as your target. In order to concatenate or merge multiple columns together, simply click on the “Add” button and then enter an expression that specifies which columns in the data file you want to merge.
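Conceptually, the Import Format concatenation does nothing more than join selected source columns into one target field. A standalone Python sketch of the idea (the column indexes, separator, and sample record are invented for illustration):

```python
def merge_fields(row, columns, separator="_"):
    # Concatenate the chosen source columns into a single target field,
    # mimicking an Import Format concatenation expression.
    return separator.join(row[c] for c in columns)

record = ["4000", "North", "Sales"]
print(merge_fields(record, [0, 2]))  # -> "4000_Sales"
```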

  2. Prefixing Fields in a Data File to Match a PBCS Dimension

A good example of prefixing fields in a data file to match a PBCS dimension includes a case where you need to load accounts, and the source system provides data files with a numeric account code (for example, “40001”) which you need to convert to a PBCS account member that contains a prefix (for example, “AC_40001”).

This can be done using Data Management in the “Workflow” tab under “Data Load Mapping.” In the sample screenshot below, you would simply use the “LIKE” expression to say that all accounts should be prefixed with “AC_” (the actual expression maps source “*” to target “AC_*”).

mohan-2

  3. Mapping Fields in the Data File to Certain Dimension Members

One of the more difficult load rule tasks is mapping fields from a data file into specific dimension members in Essbase. This is rarely done using load rules since it is far easier to format a data file to provide the requisite fields. However, in many situations (such as a smaller organization with limited IT resources), this is something that the EPM specialist may have set up for on-premise applications.

When you move onto PBCS, since load rules are no longer available, this task can now be performed using Data Management.  First, in the setup window, you will need to specify the source as a data file and the target as your EPM application. This is explained in Section 1 above.

After this, navigate to the “Workflow” section and the “Data Load Mapping” window. This time, you can set up a complex mapping table using the source and target mappings you’ve created and replicate them in FDMEE using the various functions available in Data Management. You can choose to do explicit mappings using one-to-one mappings; choose a range using the “BETWEEN” function; or use the “LIKE” operator to map all. Another nice feature in Data Management is that these mappings can be saved and imported in bulk.

Sample screen shot shown below:

mohan-3
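The interplay of these mapping types can be sketched in plain Python. The member names, range, and prefix below are invented for illustration; Data Management evaluates Explicit, Between, In, Multi-Dimension, and Like rules in its own documented order, so treat this only as a conceptual model:

```python
def map_account(source):
    # 1. Explicit one-to-one mappings take priority.
    explicit = {"CASH": "AC_1000"}
    if source in explicit:
        return explicit[source]
    # 2. A BETWEEN-style range rule (string comparison on account codes).
    if "40000" <= source <= "49999":
        return "AC_Revenue"
    # 3. A catch-all LIKE rule: * -> AC_*
    return "AC_" + source

print(map_account("40001"))  # -> "AC_Revenue"
print(map_account("99999"))  # -> "AC_99999"
```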

We can list many more examples of load rule functionality that can be recreated seamlessly in Data Management in PBCS; however, that would take a lot more real estate than is possible in this blog entry.

5 Ways to Ease Cloud Migration

  1. Obtain Strong Executive Sponsorship and Governance

Exuberant confidence from organizational members is essential to any project, regardless of company size. Frequent communication between management and every employee ensures mutual understanding that the project is a strategic priority. By building support at each working level within the company, employees and frontline managers will be more willing to embrace the change, knowing there is community involvement. While employees may think there is nothing wrong with the current processes and question how the change will affect existing procedures, supportive executive members can lead a smooth transition through their confidence in the project.

  2. Find a PROVEN Oracle Partner—AdvancedEPM

Connecting with a partner recommended by Oracle, such as AdvancedEPM Consulting Inc., is critical both for the implementation process and for the time that follows. AdvancedEPM is an Oracle Platinum Partner with rich industry experience and broad technical capabilities. The team, comprised solely of seasoned technical experts, brings knowledge gained through years of using and developing Oracle solutions. Finding a company with such a vast array of project and industry experience ensures a professional, smooth, and quick transition to the cloud.

  3. Take an Iterative, Incremental Approach

Perhaps one of the most obvious benefits of switching to the cloud is not having to install anything on-premise; this gets the system up and running more quickly than any less-agile solution. Clients are encouraged to apply new ERP cloud processes to smaller units of work first, to determine whether the solution meets their principal business requirements. During this time the team should take note of any scenario that does not meet their needs and give that feedback to the implementation partner so they can provide recommendations moving forward. Be careful about seeking too many changes, as doing so might negate the countless advantages of the cloud. Communication between client and partner at each step of the process enables a company to fully embrace the new capabilities and optimal business practices this system provides.

  4. Utilize Standard Best Practices Already in the Application

Before customizing the entire application, be sure to take advantage of already existing features that have proven to be beneficial for a wide array of projects. A modern ERP delivers 80% of common business processes that function across most industries. If your company currently uses an old, on-premise system, you are not experiencing all the benefits that a progressive ERP cloud offers. Waiting to customize will help make your business more efficient and streamline the transition to the cloud. Personalize your application after establishing some early wins from implementing the most up-to-date system on the market.

  5. Engage Users and Process Owners from the Start

Reach out to all users and administrators of the current system, and get them involved from the beginning of the transition. The men and women who use the company’s current EPM/ERP systems are instrumental in making a smooth evolution to the cloud. Engaging in conversation about current system flaws and desired future outcomes is key to success. System operators should be engaged and learning alongside the implementation process, so that when the project goes live, users will be well-adjusted and comfortable with all aspects of the cloud. This will ensure minimal change management and eliminate delays in normal business operations.

Please contact AdvancedEPM today at info@advancedepm.com for more information.

The post 5 Ways to Ease Cloud Migration appeared first on AdvancedEPM Consultants.

Align KPIs to Improve Reporting Processes


Not so long ago, an Oracle finance survey found that more than 50 percent of finance managers in global companies reported a lack of visibility into reporting. Many also admitted difficulty in controlling the quality of financial data across the reporting process.

It’s tempting to throw new technology solutions at these issues. But not so fast… investing in brand new solutions alone will not streamline critical processes or generate meaningful data. However, if you align technology with your company’s KPIs, you will rationalize those processes and find where to make new investments.

When KPIs become TMI

Key performance indicators are just that: KEY. Numerous KPIs across a broad range of processes could get confusing and dilute the power of the data. Nailing down the value of business priorities and drawing a clear connection to KPIs will help drive data clarity. This clarification activity indicates where technology will empower the organization, and where new technology is not necessary.

With aligned KPIs, you can measure more strategically, have more time to model, see the results of potential actions, and respond with greater efficiency. You’ll be better equipped to explain:

  • What happened?
  • Why did we get these results?
  • What will happen if, for example, we acquire a business or increase sales by 10 percent next quarter?

Finance teams and senior management will be able to see the possibilities and plan without incurring risk. An objective advisory service team can work closely with you to establish a process for modeling and assessment. They can also help you find the right technology to plug in along the way if it is necessary. The information you extract and apply to your organization’s processes will be meaningful and drive results.


Drillbridge Update: Officially Announcing Drillbridge Plus

It has been awhile since an official post on Drillbridge, so today I am happy to say that there has been a lot going on with Drillbridge behind the scenes! For those of you not familiar, Drillbridge is an innovative software application that runs as a service and makes it very easy to implement drill-through […]

Installing JunkieFramework in FDMEE


In a previous blog post we announced the release of the JunkieFramework (call it whatever you want). This post will focus on getting the framework installed and show some basic usage.

Save Script

The simplest way to load the framework is by grabbing the source file JunkieFramework.py. Then save this file to your custom scripts folder. On our test server, this was: C:\FDMEE\data\scripts\custom\JunkieFramework.py. Make sure you modify the default Settings object to match your environment.

Import Library

Next, you need to import the framework into an event script for use. Our sample is from a BefImport event script. The code below automatically determines the custom scripts folder and loads the library. Once we import the library, we initialize the framework for use. Here I pass in the fdmContext and fdmAPI while using the default SMTP settings. Note: the framework can be used with custom, import and batch scripts as well.
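The screenshot with the actual script does not survive in this aggregated copy, so here is a rough sketch of the import boilerplate. The SCRIPTSDIR context property and the commented-out initialization call are assumptions on our part (check the framework source for the real entry point), and the fdmContext dictionary, which FDMEE injects at runtime, is stubbed so the snippet stands alone:

```python
import sys

# FDMEE injects fdmContext at runtime; stubbed here so the sketch is self-contained.
fdmContext = {"SCRIPTSDIR": "C:/FDMEE/data/scripts"}

# Determine the custom scripts folder and put it on the Jython path.
customDir = fdmContext["SCRIPTSDIR"] + "/custom"
if customDir not in sys.path:
    sys.path.append(customDir)

# In the real event script you would now import and initialize the framework,
# passing fdmContext and fdmAPI, e.g.:
# import JunkieFramework as jframe
# jframe.initialize(fdmContext, fdmAPI)  # hypothetical entry point
```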

Sweet Success

To see the framework in action you just need to execute a data load rule. This simple script will log the fdmContext properties to the process log and send an email. The good thing here is it's open source, so you can customize it to suit your needs. And hey, if you make changes that others might find useful, feel free to share and send us a pull request. :)

If you are planning on adding your own code to the framework, you may want to add the reload command – reload(jframe) – so that Jython does not use the cached version (thanks, Francisco).

Guest Posts Welcome

I’ve talked with a fair number of you lately about various Oracle/Essbase/Hyperion topics and there are a lot of you out there with great ideas, news, or things to talk about but you don’t have your own blog. I just wanted to reiterate that if you want to write about something and get it out to folks, […]

ODI 12c new features: Dimension and Cubes! Part 4 (Loading using Surrogate Keys)

Hi guys how are you? Today we’ll continue the dimension and cubes series (Part 1, Part2 and Part 3 here) and we’ll see how to load data using Surrogate keys. After all the setting done in the last post, now the only thing left is to create the interfaces and map everything. For the Surrogate […]

A deep dive and demo of the coming Essbase Cloud Service


 

In case you missed it….

Yesterday, there was a Webinar hosted by Kumar Ramaiyer VP of product development to go over the new Essbase Cloud service. Included was a demo of a few use cases, which I will showcase below.

First and foremost, it seems that Essbase will be included in a cloud service called Oracle Analytics Cloud, which will be a Platform-as-a-Service offering that combines Essbase and Data Visualization.

There will be two versions available: the Standard edition, which will include Data Visualization and Essbase, and the Enterprise edition, which will also include BICS.

Historically, Essbase was created to model spreadsheet use cases that empower business users… a self-proclaimed “Excel on Steroids.”  Of course, over time Oracle enhanced Essbase with tighter security, in-memory capabilities, hybrid ASO/BSO, etc. But make no mistake, the Essbase cloud service certainly gets back to the roots of seamlessly extending and integrating Excel.

But as Kumar pointed out, Essbase in a cloud model is not new, as it is actually the back-end of many other cloud offerings available today such as Enterprise Planning Cloud, Planning and Budgeting Cloud, Financial Consolidation and Close, and even Financials Cloud and Project Management Cloud.

The general themes of the offering are:

  • Flexibility: Bring your own data, or start from scratch model
  • Ease of use: Users don’t need a lot of knowledge of Essbase
  • Collaborative: Workflow and change tracking built in
  • Relevant: Hybrid ASO/BSO is the default cube type
  • Fast: Real time benefits. Load data at the leaf level and data is automatically calced and aggregated on the fly.

 

So who is the intended customer for this?

Case 1: Move to the Cloud

For existing Essbase users, Oracle says that people have the ability to move their applications from on-prem to the cloud using LCM exports. For those that do not want to move existing models, the Essbase cloud service could be used in conjunction with on-prem by leaving existing apps on-prem and developing new models using an on-prem/cloud hybrid approach. This approach allows customers to test out applications using the latest versions deployed in the cloud.

 

Case 2: Combination and Consolidation 

Of course, one of the biggest limitations of the Planning and Budgeting Cloud Service (PBCS) is that you can only have one Planning application. This forces users to purchase multiple separate PBCS instances for each application they need. Oracle sees the Essbase Cloud as an opportunity to consolidate data from multiple separate PBCS instances into one large Essbase Cloud application for a single enterprise-wide reporting source.

 

Case 3: Extend Excel

Let’s face it: Excel is great, but there are many limitations:

  • Size limitations
  • No collaboration or versioning
  • Data is segregated with no integrity
  • No real data visualization

So for new Essbase customers that want to extend the capabilities of Excel, this is a good option. Of course proponents of an Excel-only environment like Excel because of the isolation and protection of data – if someone makes a mistake it only affects their spreadsheet and not the enterprise data. However, Oracle included Sandboxing in this offering to address those concerns. Sandboxing allows users to create a private copy of the data to try things out before pushing them into the system.

 

The Analytics Cloud:  It’s not just Essbase

Perhaps the most compelling part of the Analytics Cloud is the included data visualization capabilities that transform Essbase from a multidimensional data store into a full-fledged analytic platform.

Data visualization allows users to discover and research Essbase data in a new refreshing way. It provides insights with rich color that can easily track and monitor progress in a collaborative fashion.

image002

Data can be visualized in the Data Visualization web interface or in Excel just as easily. Through the interfaces, it’s easy to identify things like dimensions, hierarchies, measures and attributes. You can quickly and easily change graph types or look at the data in a table format.

image003

image005

Architecture

As with many Oracle products that have moved into a cloud offering, Essbase was completely redesigned for the platform. Oracle replaced the Essbase agent with a Java-based agent and created a unified middleware layer for all services that conforms to modern standards. REST, the Java API, scripting, R, and Groovy are all supported.

image008

They replaced familiar tools such as Essbase Administration Services and Essbase Studio with new web-based tools. These changes allow Oracle to expose ways to administer Essbase and run scripts without needing to give customers access to the servers. With these tools you can still view database statistics, edit the outline, manage connections and locks, and administer security.

image010

The new Java-based architecture enables a bring-your-own-browser environment as well as full Smart View connectivity and integration with BICS. It also creates a platform designed for higher concurrency, better security, and scalability.

While you can have an unlimited number of dimensions, you do have to choose your “shape.” You can choose from 1, 2, 4, 8, or 16 CPUs with 7.5-15GB of memory per CPU. A CPU in this instance is an OCPU, which is equivalent to a core. The service can be metered or non-metered.

They also simplified security. There are only two basic user role types:

  • System-level roles – to provision administrators and power users
  • Application roles – app managers, DB manager, update, read-only, etc.

Security provisioning is performed through a simple interface.

 

image011

 

Loading data and creating cubes

There are many ways to interface with and load data into the Essbase cloud service:

  • Life Cycle Management import/export
  • The new EssCS command line tool
  • Upload from free form excel
  • Design new in Excel
  • Java and REST APIs
  • Using a gallery template
  • The cloud UI itself.

 

You can use LCM to move cube artifacts from on-prem to the cloud. They can come from 11.1.2.3 or 11.1.2.4 deployments. Cross-cube references such as XREF and partitions are not copied over, and will need to be recreated manually.

LCM imports can be done with the new EssCS command line tool:

image014

The EssCS command line tool can do a lot more. You can use this tool to execute jobs like calcs, clears, and data loads. You can even check the status of jobs. Some commands available out of the box:

  • Login
  • Logout
  • Calc (execute a calc)
  • Dataload
  • Dimbuild
  • Download (download a file from a cloud instance)
  • Listfiles (e.g., list all rul files)
  • Maxl (execute a MaxL script)
  • Upload (upload a file)
  • Version

Again, these new tools are for one reason: to give users the ability to interact with the cloud instance without giving the ability to get on the server. Remember this is a platform as a service… you will not have any access past the UI and these provided tools.
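Since everything funnels through the CLI rather than the server, automation means composing invocations of these commands. A small Python helper that assembles one (the executable name and flag spellings are illustrative, not the tool's real syntax; consult the EssCS help output before scripting against it):

```python
def esscs_command(action, **options):
    # Build an argument list for a hypothetical EssCS command-line invocation.
    cmd = ["esscs", action]
    for flag, value in sorted(options.items()):
        cmd += ["-" + flag, str(value)]
    return cmd

# e.g. run a calc script against a cloud cube (names are made up):
print(esscs_command("calc", application="Sample", db="Basic", script="aggall"))
```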

For new Essbase applications, users can use the Cube Designer, a UI full of templates and Excel modeling tools that help you create cubes. An Excel template acts as a modeling tool that describes the structure that will be used to create the outline. There are pre-built templates for specific business use cases such as price analytics, margin analytics, cash flow, etc.

 

image015

 

Summary of the Demo

Demo 1 – loading/editing data from Excel into a cube.

Case: I am looking at my staff of managers who all have staff and a budget to hire. I want to see them all in one place to see how the overall budget is going.

Log in:

image017

 

Look at the headcount

image002

Drill to detail:

image021

Add another manager called Imram who has a candidate, Michal Jordon. I’m going to use an Excel sheet to do it.

 

image024

Upload to the cloud:

image025

image027

 

See the new data:

image030

 

Demo 2:  Import a spreadsheet and create a new cube from it.

image044

Choose Transform data:

image038

Choose application name:

image039

It automatically shows what it thinks the hierarchies are.

image041

Build the cube. You can watch the job in the job manager

image043

When complete, we can see the new app in the web UI and in Smartview:

image045

Demo 3: Creating a cube using a template:

Open up the Gallery and select a pre-built application workbook.

image047

Fill out the workbook

 

image049

image051

image053

image055

Note: Screen shots taken from public Oracle webinar. A copy of the recorded webinar can be found on Oracle’s web site and blogs.

