Channel: ODTUG Aggregator
Viewing all 1880 articles

PBCS vs. EPBCS: Comparing Oracle's Cloud Planning Applications @usanalytics @orclEPMblogs



If you’re thinking about migrating your Hyperion Planning environment to the cloud, there are several best practices that help you make the move successful. We’ve talked about how a cloud migration lends itself to the opportunity to consolidate and cleanse your environment. But before deciding what needs consolidation, you have to make a big decision: Oracle Enterprise Planning and Budgeting Cloud Service (EPBCS) or Oracle Planning and Budgeting Cloud Service (PBCS)?

Whether you choose PBCS or EPBCS, the overall benefits of an Oracle cloud application are the same: no upfront cost for hardware or software, less IT involvement, and no annual maintenance costs. In this blog post, we’ll take a deeper dive into comparing Oracle’s cloud planning applications.


New EPM cloud users with SSO - Quick Tip @omarshubeilat @orclEPMblogs

Have you ever tried to create an EPM cloud user (PBCS, PCMCS, EPRCS...) and expected a temporary password in the email notification, but it was nowhere to be found? It can be a bit confusing: you're quite certain you've created the user, but at the same time you cannot log in because you haven't set up your password and security questions for the first time.

This is a sample email notification indicating you've been granted access to the EPM instance. As you can see, there is just a direct link to the login page but no temporary password.



 This is a sample email notification with a temporary password.



Well, this is happening because Single Sign-On is configured, which means you must select "Maintain Identity Domain Credentials" when you create the new user if the user is not part of the Identity Provider domain.







That's it, quick and easy! 

Hope this helps.


KScope 18 Speaker Award @devepm @orclEPMblogs

Hey guys, how are you? It has been a while since the last time I wrote anything here… and surprise, surprise, it’s because I’m crazy busy working on a project that was sized small but turned out huge, and the sizing didn’t change… 🙂 Never happened before, heheheh 😉 This is just a small post to tell how […]

ODTUG Board of Directors Nominations Close in 3 Days!

This is your opportunity to nominate the person you believe will best provide leadership and policy development for ODTUG. For more information, please click here. All nominees must be paid ODTUG members in good standing.

OAC-Essbase has Swagger

Oracle Analytics Cloud (OAC) Essbase has swagger. I don't mean the self-confident, arrogant demeanor and walk. Rather, I'm talking about something I've seen multiple people ask for: documentation on the REST API. For those of you who don't know, REST stands for Representational State Transfer and refers to a stateless, cacheable, HTTP-based protocol for executing commands on cloud applications. In OAC it is used for everything from outline updates, to loading data, to stopping and starting applications, to just about everything in OAC.

Rather than provide the historically meaningless written documentation that we have endured for years, the OAC development team decided to provide an interactive, testable set of documentation called Swagger. To get to the documentation, as with other REST calls, you use an HTTP connection in your browser. To connect, use:
https://<your Essbase instance name>/essbase/rest/doc/#

You will most likely be asked to provide your login credentials. 
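As a sketch, the same endpoint can also be reached programmatically with basic authentication; the instance host, user, and password below are placeholders:

```python
import base64
import urllib.request

def swagger_doc_request(host, user, password):
    """Build an authenticated GET request for the OAC Essbase REST
    documentation endpoint. The host value is a placeholder."""
    url = f"https://{host}/essbase/rest/doc/"
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})

# In a browser the same URL renders the interactive Swagger UI;
# urllib.request.urlopen(req) would return the raw page instead.
```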

Once loaded, the interface will look like:



You can then click on any item and get details about it. For example, if we want to find out about the OAC instance, we can select About:



Even better, notice above you can try it out. When I executed it I got:



Give it a try and you will have swagger yourself.

Connecting to Autonomous OAC through MaxL

Just a quick note, since I had to ask to get it to work. You have the capability to download a secure version of MaxL from the OAC Essbase instances. What this means is you can run the MaxL you have on-premises just by changing the server to the cloud URL. If you are migrating from on-prem, this can save you a lot of time as you won't have to write REST calls or EssCLI.

When you connect to a PaaS OAC instance, you can just get the public IP address and use it, but there is no public IP address for Autonomous OAC (OAAC). Instead, you would expect to use the server address like:
https://yourinstance.oraclecloud.com/essbase/, but that does not work. In order to get it to work, you have to add /agent after essbase.

Note, the same thing is true if you want to connect using EssCli.
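A minimal sketch of the resulting URL shape, with the instance host as a placeholder; this URL is what you would then supply to MaxL or EssCLI in place of an on-premises server name:

```python
def autonomous_essbase_url(instance_host):
    """Connection URL for Autonomous OAC Essbase: there is no public IP,
    so MaxL and EssCLI must connect through the agent endpoint. The
    /agent suffix after /essbase is the part that is easy to miss."""
    return f"https://{instance_host}/essbase/agent"
```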

Good luck. Hopefully your life became easier.

PSA: Planning Cloud Sept 2018 Variable Changes @opal_epm @orclEPMblogs

After tearing my hair out for 2 or 3 days and pestering Oracle Support with a number of SR’s, I have finally understood the impact of an important Sept 2018 update. This currently affects only Test instances, but the Sept Prod update arrives this Friday. This concerns how user & substitution variables are created and … Continue reading PSA: Planning Cloud Sept 2018 Variable Changes

PCMCS…Yeah, FDMEE Can Do That! @ranzal @orclEPMblogs


Oracle Profitability and Cost Management Cloud Service and Oracle Financial Data Quality Management Enterprise Edition Working Together Better

Over the last year, we have been fielding, positioning, and aligning more with Oracle’s new Cloud products. Some of the most common questions we are asked are:

  1. Has Edgewater Ranzal done that before?
  2. What “gotchas” have you encountered in your implementations and how have you addressed them?
  3. What unique offerings do you bring?

These are all smart questions to ask your implementation partner because the answers provide insight into their relevant experience.

Has Edgewater Ranzal done that before?

Edgewater Ranzal is an Oracle PCMCS thought leader and collaborates with Oracle as a Platinum partner to enhance PCMCS with continued development. To date, we’ve completed nearly 20 PCMCS (Cloud) implementations, and almost 80 Oracle Hyperion Profitability and Cost Management (HPCM – on premise) implementations spanning multiple continents, time zones, and industries. Our clients gladly provide references for us which is a testament to our success and abilities. Additionally, we frequently have repeat clients and team up with numerous clients to present at various conferences to share their successes.

As a thought leader in the industry and for PCMCS, we sponsor multiple initiatives that deliver implementation accelerators, test the latest product enhancements prior to their release, and work in tandem with Oracle to enhance the capabilities of PCMCS.

Our Product Management team is comprised of several individuals. Specifically for PCMCS, Alecs Mlynarzek is the Product Manager and has published the following blog: The Oracle Profitability and Cost Management Solution: An Introduction and Differentiators.  I am the Product Manager for Data Integration and FDMEE with several published blog posts related to FDMEE.

Now let’s explore some of the data integration challenges one might unexpectedly encounter and the intellectual property (IP) Ranzal offers to mitigate these and other data integration challenges that lurk.

What gotchas have you encountered in your implementations and how do you mitigate them?

We could go into great depth when detailing the PROs for using FDMEE with PCMCS…but it is much more beneficial to instead share some of the other less obvious discoveries made. Note that we work directly and continuously with Oracle to improve the product offering.

  • Extracting data via the FDMEE data-sync is challenging. The data cube size and configuration settings of PCMCS impose threshold limits (5,000,000 records and a 1GB file size), both of which are quite often reached. As a result, we have developed a custom solution for the data-sync routine.
  • Loading large datasets directly into PCMCS via DM (Cloud-based Data Management) can cause performance problems due to the server resources available in the Cloud. Functionality in on-premise FDMEE (scripting, Group-By, etc.) helps reduce the number of records going into the Cloud and therefore provides a performance gain.
  • Patching to the latest FDMEE patch set is crucial. Cloud applications (PCMCS, FCCS, E/PBCS) update monthly. As a result, we need to consistently check/monitor for FDMEE patches. These patches help ensure that canned integrations from Oracle are top-notch.

FDMEE_PCMCS Image 1

  • Executing two or more jobs concurrently via EPMAutomate is quite troublesome due to the workflows needed and how EPMAutomate is designed. As a result, we have invested considerable time into cURL and RESTful routines. We discovered that the login/logout commands are tied to the machine, not the user-process, so any logout from another executing run logs out all sessions.

FDMEE_PCMCS Image 2

  • The use of EPMAutomate is sometimes difficult. It requires a toolset on a PC – “JumpBox” – or on-premise EPM servers. It also requires the use of .BAT files or other scripted means. By using FDMEE, the natural ease of the GUI improves the end-user experience.
  • Loading data in parallel via FDMEE or DM can cause Essbase Load Rule contention due to how the automatic Essbase load rules are generated by the system. Oracle has made every effort to resolve this before the next Cloud release. Stay tuned… this may be resolved in the next maintenance cycle of PCMCS (18.10) and then the on-premise update of patch-set update 230.
  • We all know that folks (mainly consultants) are always looking to work around issues encountered and come up with creative ways to build/deliver new software solutions. But the real question that needs to be asked is: Should we? Since FDMEE has most of the solutions already packaged up, that would be the best tool for the job. The value that FDMEE can bring is scores better than any home-grown solution.

What unique offerings do you bring?

At Edgewater Ranzal, we have started to take some of our on-premise framework and adopt it for PCMCS. Some of the key benefits and highlights we provide are:

  • To combat the complications with loading data via FDMEE because of FDMEE’s inability to execute PCMCS clears out-of-the-box, we have added the functionality into the Ranzal IP catalog and can deploy this consistently for our clients. This is done via the RESTful functionality of PCMCS. Some of the items we have developed using REST are:
    • Import/export mappings
    • Execute data load rules or batch jobs from 3rd party schedulers
    • Refresh metadata in the Cloud
    • Augment EPMAutomate for enhanced flexibility
    • Execute business rules/clear POV commands as part of the FDMEE workflow
    • Execute stored procedures (PL/SQL) against DBaaS (see below)
    • Enhanced validation framework (see below)
  • We have redeveloped our Essbase Enhanced Validate to function with the PCMCS Cloud application. FDMEE on-premise can now validate all the mapped data prior to loading. This is great for making sure data is accurate before loading.

FDMEE_PCMCS Image 3

  • The Edgewater Ranzal toolkit for FDMEE includes the ability to connect to other Cloud offerings for data movements, including DBaaS and OAC.

FDMEE_PCMCS Image 4

Can FDMEE do that…and should FDMEE do that?

Yes, you should use FDMEE to load to PCMCS, and it is out-of-the-box functionality! As you can see, unlike DM (whose feature comparison to FDMEE will be discussed in a later blog and white paper), there are a lot of added benefits. The current release of FDMEE, v11.1.2.4.220, provides product functionality enhancements and greater stability for integrations with most Cloud products. Suffice it to say, having Python scripting available and server-side processing for large files will greatly enhance your performance experience.

FDMEE_PCMCS Image 5

Contact us at info@ranzal.com with questions about this product or its capabilities.


FCCS - Housekeeping Tasks

As more and more customers migrate to or license and implement Oracle FCCS, people are asking what they need to do with regard to housekeeping or care and feeding of the application. There are several tasks that should be done to keep FCCS in good shape. Nothing is hard, some can be automated, and those not currently "automatable" will be. These are listed below in no particular order.


  1. Validate Metadata - within the metadata manager run the process to validate metadata. Issues occur when metadata settings are not correct and the validation process will tell you what is wrong.
  2. Download application snapshots - the application snapshot should be downloaded for backup purposes. This is easy to automate with EPM Automate.
  3. Download and truncate the audit logs. As the audit logs grow, they utilize space. Every now and then (the frequency will vary from customer to customer) these logs should be downloaded and then truncated.
  4. Clear Empty Blocks - run this business rule to remove empty blocks, as they are using space when they're not needed. Running monthly or weekly is about right.
  5. Dense Restructure - this is like defragmenting the application. The statistics will show you whether it is needed: if the value is close to 1, running it is not needed; if it is something like 0.001024, it is definitely needed. Again, checking weekly is about right.
Doing these things should keep FCCS running quickly and smoothly.
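For task 2 above, a sketch of driving the snapshot download with EPM Automate from Python; the user, password file, service URL, and the default "Artifact Snapshot" name are placeholders to adjust for your environment:

```python
import datetime

def fccs_backup_commands(user, password_file, url):
    """EPM Automate steps to download the daily application snapshot for
    off-cloud retention. 'Artifact Snapshot' is the usual default daily
    snapshot name; adjust for your environment. Returns the command
    lines plus a dated local archive name to rename the download to."""
    archive = f"Artifact_Snapshot_{datetime.date.today():%Y%m%d}.zip"
    return [
        f"epmautomate login {user} {password_file} {url}",
        'epmautomate downloadfile "Artifact Snapshot"',
        "epmautomate logout",
    ], archive

# Each command line could then be run with subprocess.run(cmd, shell=True,
# check=True) from a scheduled job.
```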





EPM Cloud Updates – October 2018 – EPBCS, FCCS, PCMCS, ARCS, EPRCS & EDMCS

Oracle releases documentation on new features and fixed issues on the Applications Release Readiness site. From there, you can check the updates for Customer Experience, Human Capital Management, Enterprise Resource Planning, Supply Chain Management, and Enterprise Performance Management. A quick recap of the EPM applications: Planning and Budgeting Cloud, Enterprise Planning Cloud...

Automating data flows between EPM Cloud and OAC – Part 1

In past blogs I have covered in detail, automation in EPM Cloud using the REST API. Recently I have blogged comprehensively on the Essbase REST API in OAC, so I thought I would combine these and go through an example of automating the process of moving data between EPM Cloud and OAC Essbase.

The example will be based on extracting forecast data from PBCS using Data Management, downloading the data and then loading this to an OAC Essbase database. I will provide an option of downloading data directly to OAC from PBCS for those who have a customer managed OAC instance, alternatively for autonomous OAC the data can be downloaded from PBCS to a client/server before loading to Essbase.

I am going to break this into two parts, with the first part covering the setup and manual steps to the process, then the second part gets into the detail of automating the full process with the REST API and scripting.

Before I start I would like to point out this is not the only way to achieve the objective and I am not stating that this is the way it should be done, it is just an example to provide an idea of what is possible.

To start out with I am going to want to extract forecast data from PBCS and here is a sample of the data that will be extracted:


To extract the data, I am going to use Data Management, once the integration has been defined I can add automation to extract the data using the REST API.

As it is EPM Cloud, I will need to extract the data to a file and this can be achieved by creating a custom target application in Data Management.


The dimensions have been created to match those of the OAC Essbase database, I could have included scenario but that is always going to be static so can be handled on the Essbase side.


There are slight differences between the format of the Year in PBCS


to that in the Essbase database.


Aliases could be used but I want to provide an example of how the difference can be handled with period mappings in Data Management.


This will mean any data against, say FY19, in PBCS will be mapped to 2019 in the target output file.

If there are any differences between other members these can be handled in data load mappings in DM.

In the DM data load rule, source filters are created to define the data that will be extracted


In the target options of the file, a fixed filename has been added; this is just to make the process of downloading the file easier. If this were not done, you would need to either capture the process ID from the REST response to generate the filename or read the filename from the jobs REST response. Both methods produce the same outcome but, in this example, I am going for the simpler option.


Before running the integration, I will need to know which start and end period to select.

For the automated process I am going to pick this up from a substitution variable in Essbase, it would be the same concept if the variable is held in PBCS as both have a REST resource available to extract the information.


The data will be extracted for a full year, so based on the above sub var, the start period would be Jan-19 and the end period Dec-19.
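The mapping from the sub var to the start and end periods can be sketched as below; the REST call that fetches the variable itself is omitted, since the exact endpoint would come from the OAC or PBCS documentation:

```python
def forecast_period_range(sub_var_value):
    """Derive the Data Management start and end periods from the Essbase
    substitution variable, e.g. 'FY19' -> ('Jan-19', 'Dec-19'). The
    variable value would be read via the Essbase (or PBCS) REST
    resource for substitution variables before calling this."""
    year = sub_var_value.upper().replace("FY", "")
    return f"Jan-{year}", f"Dec-{year}"
```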


Running the rule will extract the data from PBCS, map and then produce an output file.


The rule ran successfully so the exported file will be available in the inbox/outbox explorer.


If I download the file you can see the format of the exported data.


When I cover the automation in the next part I will provide two options, the first one will download the data file directly to OAC from PBCS and then load the data, the second will download the file from PBCS to a machine running the automation script and then stream load it to Essbase.

As this post is about manually going through the process, I have downloaded the file from PBCS and uploaded it to OAC Essbase.


The file has been uploaded to the Essbase database directory.


Now an Essbase data load rule is required to load the above file.

A new rule was created, and the uploaded data file selected.


The columns in the data file were mapped to the corresponding dimensions.


The data is always loaded to the forecast scenario member and is not contained in the file, so this was added to the data source information.


As I mentioned earlier I could have easily included scenario in the data export file by adding the dimension to the target application in Data Management, it is up to you to decide which method you prefer.

Once created it will be available from the scripts tab and under rules.


To run the rule, head over to jobs in the user interface and select “Load Data”


The application, database, rule and data file can then be selected.


The status of the data load can then be checked.


As this is a hybrid database, there is no need to run a calculation script to aggregate the data; if aggregations or calcs were required, you could simply add them into the process.


A retrieve on the data confirms the process from extracting data from PBCS to OAC Essbase has been successful.


You could apply this process to extracting data from OAC and loading to EPM Cloud, one way to do this could be to run an Essbase data export script, the export file could then be uploaded to EPM Cloud, and a Data Management rule run to map and load to the target application.

We have a process in place, but nobody wants to live in a manual world, so it is time to streamline with automation which I will cover in detail in the next part. Stay tuned!

Labor Budget Increases, Staffing Shortages Loom Large for Healthcare Execs in 2019; Set Expectations Now and Uncover Your Capabilities for an Enterprise-Based Labor Productivity Solution! @ranzal


The two resounding topics on healthcare websites and in related blog posts: (1) increased labor costs and (2) burnout or shortages of clinical staff. The article published in “Healthcare Finance,” Labor Budget Increases, Staffing Shortages Loom Large for Healthcare Executives in 2019, highlights this exact topic.

This isn’t surprising considering access to healthcare for all has increased; therefore, there are more patients to see which, in turn, requires more staff which results in increased labor costs…see where I’m going here? It’s easy to see how this can quickly become a major concern for providers to analyze and keep up with demand.

It becomes evident while working with numerous healthcare clients that not all healthcare companies are treated equally regarding their maturity scale when answering specific labor questions, providing/analyzing data, or even supporting a labor productivity solution. Edgewater Ranzal’s complimentary Healthcare Labor Productivity Assessment Workshop not only helps reset clients’ expectations, but also uncovers clients’ enterprise-based labor productivity solution capabilities.

Our solution utilizes Oracle Cloud or on-premise technology to help clients see an immediate return-on-investment just by analyzing contract agency usage statistics, providing detailed overtime analysis, and offering the ability to compare productivity across national standards that are loaded into the system. Additionally, we help clients align their labor productivity solutions with their planning/budgeting processes to improve budget detail and accuracy.  Comprehensive experience with data integration – often a challenging task for clients – allows us to work with staff to bring all the required data elements together to create a cohesive picture of labor productivity details.

Take a look at our webinar recording of The Key Ingredients to Understanding Labor & Productivity to learn more about our solution to uncover best practices in addressing labor productivity in your organization.  Then contact Edgewater Ranzal’s Healthcare experts to answer specific questions about implementing a solution to help cut labor costs and provide data-rich analytics to your organization.

EPM Lab – ASO Cube Clear 0s

Since the ASO (Aggregate Storage Option) cube is able to aggregate data on the fly, you may leverage the ASO cube to process the aggregation.  There are a few ways to move data from BSO cube to ASO: Data synchronization Data Map Smart Push @XWRITE Groovy rules Once you push data to the ASO side,...

EPM Cloud - Recent additions to EPM Automate and REST API

In the EPM Cloud 18.10 release there were a few additional commands added to the EPM Automate utility, these are also available through the REST API as the utility is built on top of the API.

An annoyance for me with EPM Automate and the REST API has been not being able to rename a snapshot, even though it has always been possible through the web UI.


Not being able to rename outside of the UI made it difficult to automate archiving the daily snapshot in the cloud instance before the next snapshot overwrote the previous one. You could download, rename, and upload, but this overcomplicates what should be a simple rename.

With the 18.10 release it is now possible to rename a snapshot with a new EPM Automate command.

To rename a snapshot, the syntax for the utility is:

epmautomate renamesnapshot <existing snapshot name> <new snapshot name>

Using EPM Automate and a script, it is simple to rename the snapshot. In the following example, the daily snapshot is renamed to include the current date.
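A sketch of that idea in Python rather than a batch script; the snapshot name is the default "Artifact Snapshot", and the command is only executed when run=True (an epmautomate session is assumed to be logged in already):

```python
import datetime
import subprocess

def rename_daily_snapshot(snapshot="Artifact Snapshot", run=False):
    """Rename the daily snapshot to include today's date so the next
    maintenance window does not overwrite it. Assumes an epmautomate
    session is already logged in; set run=True to actually execute."""
    new_name = f"{snapshot} {datetime.date.today():%d-%m-%y}"
    cmd = ["epmautomate", "renamesnapshot", snapshot, new_name]
    if run:
        subprocess.run(cmd, check=True)
    return cmd
```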


This means the snapshot is now archived and the next daily maintenance will not overwrite it.


Please note, though, that there is a retention period for snapshots, which currently stands at 60 days, and a default maximum storage size of 150GB. If this is exceeded, snapshots are removed, oldest first, to bring the size back to 150GB.

The documentation does not yet provide details on how to rename a snapshot using the REST API, but I am sure it will be updated in the near future.

Not to worry, I have worked it out and the format to rename a snapshot using the REST API is:


If the rename is successful, a status of 0 will be returned.


In the UI you will see the snapshot has been renamed.


If the rename was not successful, a status that is not equal to 0 will be returned and an error message will be available in the details parameter.


The functionality will only rename snapshots and does not work on other file types.
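The success and failure convention described above (status 0 on success, the error message in the details parameter otherwise) can be handled generically in a script, for example:

```python
import json

def check_epm_response(body):
    """Interpret an EPM Cloud REST response body: a status of 0 means
    success; anything else is a failure, with the error message
    carried in the 'details' parameter."""
    resp = json.loads(body)
    if resp.get("status") == 0:
        return True, None
    return False, resp.get("details")
```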

It is an easy task to script the renaming of a snapshot using the REST API. In the following example I am going to log into a test instance and rename the daily snapshot, then copy the daily snapshot from the production instance to the test instance. This means the production application is ready to be restored to the test environment if needed, also the test daily snapshot has been archived.


The above section of the script renames the test snapshot, the next section copies the production snapshot to the test instance.

When calling the REST API to copy a snapshot, a URL is returned which allows you to keep checking the status of the copy until it completes.
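A sketch of that polling loop, assuming the status response is JSON with a status field where -1 means the job is still in progress (a common EPM Cloud convention); the opener would need to be replaced with a properly authenticated one:

```python
import json
import time
import urllib.request

def poll_until_done(status_url, opener=urllib.request.urlopen, interval=15):
    """Poll the status URL returned by the copy call until the job
    leaves the in-progress state (-1, by the usual EPM Cloud
    convention) and return the final response body."""
    while True:
        with opener(status_url) as resp:
            body = json.loads(resp.read().decode())
        if body.get("status") != -1:
            return body
        time.sleep(interval)
```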


Now in the test instance, the daily snapshot has been archived and contains a copy of the production snapshot.

 

It is also possible to copy files between EPM Cloud instances using the EPM Automate command “copyfilefrominstance”. This command was introduced in the 18.07 release, and the format for the command is:

epmautomate copyfilefrominstance <source_filename> <username> <password_file> <source_url> <source_domain> <target_filename>

To achieve this using the REST API is very similar to my previous copy snapshot example.

Say I wanted to copy a file from the test instance to the production one and rename the file.


An example script to do this:


The file has been copied to the production instance and renamed.


When the 18.10 monthly readiness document was first published it included details about another EPM Automate command called “executejob”

“executejob, which enables you to run any job type defined in planning, consolidation and close, or tax reporting applications”

This was subsequently removed from the document, but the command does exist in the utility.


The command just looks to bypass having to use different commands to run jobs, so instead of having to use commands such as “refreshcube”,”runbusinessrule” or “runplantypemap” you can just run “executejob” with the correct job type and name.

For example, if I create a new refresh database job and name it “Refresh”


The job type name for database refresh is “CUBE_REFRESH” so to run the refresh job with EPM Automate you could use the following:


The command is really replicating what has already been available in the REST API for running jobs.

The current list of job types is:

RULES
RULESET
PLAN_TYPE_MAP
IMPORT_DATA
EXPORT_DATA
EXPORT_METADATA
IMPORT_METADATA
CUBE_REFRESH
CLEAR_CUBE
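These job types map onto the jobs REST resource that “executejob” wraps; as a sketch, the request body for that resource would carry the job type and the saved job name:

```python
import json

def execute_job_payload(job_type, job_name):
    """Request body for the Planning jobs REST resource that
    'executejob' wraps: jobType is one of the types listed above
    (e.g. CUBE_REFRESH) and jobName is the name of a saved job."""
    return json.dumps({"jobType": job_type, "jobName": job_name})
```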


I am not going to go into detail about the REST API as I have already covered it previously.

The format for the REST API is as follows:


The response will include details of the job and a URL that can be used to keep checking the status.


I was really hoping that the functionality was going to allow any job that is available through the scheduler to be run, for instance “Restructure Cube” or “Administration Mode” but it looks like it is only for jobs that can be created. Hopefully that is one for the future.

In the 18.05 release, a new EPM Automate command appeared called “runDailyMaintenance”, which allows you to run the daily maintenance process without having to wait for the maintenance window. This is useful if new patches are available and you don’t want to wait to apply them. In the 18.10 release, the command gained a new parameter which provides the functionality to skip the next daily maintenance process.

The format for the command is:

epmautomate rundailymaintenance skipNext=true|false

The following example will run the maintenance process and skip the next scheduled one:


I included the -f flag to bypass the prompt:

“Are you sure you want to run daily maintenance (yes/no): no?[Press Enter]”


The REST API documentation does not currently have information on the command but as the EPM Automate utility is built on top of the API, the functionality is available.

The format requires a POST method and the body of the post to include the skipNext parameter.
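A sketch of building that body; since this is not yet documented, whether the real API expects the skipNext value as a string or a boolean is an assumption here, so verify against your instance:

```python
import json

def daily_maintenance_body(skip_next):
    """POST body carrying the skipNext parameter for the daily
    maintenance call. The string encoding of the value is an
    assumption, as the REST documentation does not yet cover it."""
    return json.dumps({"skipNext": "true" if skip_next else "false"})
```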


The response will include a URL to check the status of the maintenance process.


When the process has completed, a status of 0 will be returned.


It is worth pointing out that as part of the maintenance steps, the web application service is restarted so you will not be able to connect to the REST API to check the status while this is happening.

Another piece of functionality which has been available through the REST API for a long time, but not EPM Automate, is the ability to return or set the maintenance window time.

To return the maintenance time, a GET method is required with the following URL format:


The “amwTime” (Automated Maintenance Window Time) is the scheduled hour for the maintenance process, so it will be between 0 and 23.

To update the schedule time a PUT method is required and the URL requires a parameter called “StartTime”


If the update was successful a status of 0 will be returned.

You can then check the maintenance time has been updated.


The following script checks the current maintenance time and updates it to 03:00am
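A sketch of the URL handling such a script needs; the service path below is an assumption on my part, but the behavior matches the description above: a GET returns the amwTime value and a PUT with a StartTime parameter updates it.

```python
def maintenance_time_urls(base_url, new_hour):
    """URLs for reading and updating the maintenance hour. The GET
    returns amwTime (0-23) and the PUT takes a StartTime parameter;
    the service path itself is an assumption, not documented syntax."""
    get_url = f"{base_url}/interop/rest/v1/services/dailymaintenance"
    put_url = f"{get_url}?StartTime={new_hour}"
    return get_url, put_url
```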


I did notice a problem: even though the REST API is setting the time, it is not reflected in the UI.


It looks like a bug to me. Anyway, until next time…

ODC Appreciation Day: Oracle Cloud Customer Connect @opal_epm @orclEPMblogs

It’s that time of year again! What is ODC Appreciation Day? Every year, the Oracle community (led by Tim Hall) picks a day to flash mob blog in appreciation of the Oracle Developer Community (“ODC”) program. Take a look at Tim Hall’s blog post to find out additional details, but the rules of this year’s … Continue reading ODC Appreciation Day: Oracle Cloud Customer Connect

EPM Lab – EPM Connections @_Jun_Zhang_ @orclEPMblogs

If you have multiple EPM cloud subscriptions, you may want to connect them together so that once users have logged into one application, they can easily toggle between subscriptions. EPM connections provide this functionality. By using EPM connections, you will be able to: use the Navigator to move from one application to another, use Clusters and...

FCCS - Blocks

Those who are coming to FCCS from Essbase or Planning know all about blocks. But for those coming from HFM, Enterprise, or MicroControl, blocks are a new concept.

BRIEFLY, blocks are how data is stored in FCCS. Each combination of stored sparse dimension members is a separate block, and all members of a dense dimension are stored in that one block. In FCCS, the Account dimension is the only dense dimension, so all accounts are in the block for a given combination of sparse members.

Blocks get automatically created when loading data, but not by calculations. You write a perfect calc, deploy, consolidate, and get no result. It looks like the calc either didn't run or didn't work. What actually happened is the calc worked, but since there was no block, there was no place to store the result.

So, what to do? On the rule insertion points, there is an option to enable automatic block creation.



On the bottom right of the above screenshot, you'll see an option to auto create blocks. Clicking No will change it to Yes with the following warning.


So now when you consolidate, the calc will run as before but now there will be a place to store the data.

There are other ways to create blocks, but this is the simplest (other than loading data).





Automating Oracle EPM Cloud efficiently @orclEPMblogs


This article acknowledges a tried-and-tested approach for EPM Cloud automation, and challenges this approach in an effort to increase the efficiency of EPM teams and reduce the overall risk for an organization: Today Data Management* and EPMAutomate are the standard for Oracle EPM Cloud integration and automation, and for good reason: EPMAutomate is a free utility (with your EPM Cloud subscription) that ... Read more

The post Automating Oracle EPM Cloud efficiently appeared first on FinTech Innovations.

OBIEE Development: Merging the RPD with Git (Free Open-Source Tool) @usanalytics @orclEPMblogs

ARCS Updates (October 2018): New EPM Automate Utility Version, Considerations for the Academy, and More


The October updates for Oracle's Account Reconciliation Cloud Service (ARCS) have arrived. In this blog post, we’ll outline new features in ARCS, including a new EPM Automate Utility version and considerations for the Academy.

We’ll let you know any time there are updates to ARCS or any other Oracle EPM cloud products. Check the US-Analytics Oracle EPM & BI Blog every month.

The monthly update for Oracle ARCS will occur on Friday, October 19 during your normal daily maintenance window.
