Wednesday, December 4, 2013

First impressions with the Calculation Manager

Having read that Calculation Manager is now a mature product, I set aside some time to evaluate this new development environment for HFM. First, I went through the material I found on the net, then got started right away. I wrote and tested all the typical calculation rules found in most of my client implementations.

Here is some must-read material that I found very useful during this endeavor:
It appears that there are benefits to moving to Calculation Manager, though I am quite concerned about the productivity of rule writing. There is too much clicking around on the visual replacements of both logic and functions.

By contrast, a good text editor which:
1. supports customization to highlight the HFM VBScript syntax,
2. provides custom function and subroutine quick lists,
3. offers bookmarks, and
4. offers IntelliSense
is far more productive for writing old-fashioned classic rules than the Calculation Manager. Still, it may be worth keeping up with the current trends, as is always the case with technology.

One thing I was curious about was whether the process of deploying rules with the Calculation Manager is irreversible, as is the case with metadata. However, it seems that classic and Calculation Manager rules can overwrite each other at any time. In addition, as soon as there are rules implemented with the Calculation Manager, you can extract them just as you would classic rules.

Thursday, November 28, 2013

HFM Dimensions and Metadata

Overview
In this article, we shall first clear up a confusion frequently observed between HFM dimensions and metadata and explore their relationships. We shall close by examining a peculiarity of the Value dimension.

Note that all presented information refers to Oracle Hyperion Financial Management version 11.1.2.2 or earlier.

HFM Dimensions and Metadata
When training people to design or maintain HFM applications, all too often there is confusion between metadata items and dimensions. This is probably because, for most of the application design, we populate members in the Entity, Account and Custom dimensions using our metadata design tools, which seem to be in a one-to-one relationship with the dimensions used and selected in the application.

It may be helpful to separate the two by visualizing metadata as an abstract concept related to application design, and dimensions as the incarnation of that design in a working application. One thing is for sure: there is no one-to-one correspondence between metadata items and dimensions.

In addition, not all dimensions behave the same. The following table summarizes the similarities and the differences in nature and behavior of both metadata items and dimensions pertaining to an HFM application.


A common HFM application uses 12 dimensions. More custom dimensions may be added in newer versions of the software; still, the 12 dimensions are the core. Our first observation, then, is that the number of metadata items does not equal the number of core dimensions (refer to the first column).

Second, there are no dimensions for Currency, ConsolidationMethod, SecurityClass and Alias. We could also have added the AppSettings item found in the classic metadata designer, though it is obvious that this item does not relate to any dimension whatsoever.

Our third observation is that there are also differences in the degree to which we can customize each metadata item. For example, while we can freely add members to almost any metadata item, the ICP, Value and View metadata items (and their corresponding dimensions) limit what we can do with them. We can add members to the ICP dimension only indirectly, by adding Entities and marking them with the IsICP flag.

In addition, there is no way to add members to the Value dimension directly. However, for each new member we add to the Currency metadata item, three new members appear in the Value dimension; e.g. by adding GBP to the currencies list, we get GBP, GBP Adjs and GBP Total in the Value dimension.

Finally, the View metadata item can be customized only to provide the strictly defined members YTD, HYTD, QTD, MTD etc. Here we are restricted both in the keywords we can use and in the number of members, which must equal the number of levels set in the Period dimension.

Next, we observe that not all dimensions support a hierarchical structure, and those that do may not support hierarchical aggregation. Even though we can structure Scenarios in hierarchies, the members of those hierarchies will not roll up to their parents. Even though Entities can form hierarchies that aggregate, the aggregation calculations are performed only when a consolidation is executed, and they are dictated by the consolidation rules logic.

The ICP and Value dimensions form fixed structures that cannot be customized. In addition, the Value dimension has a complex hierarchical aggregation, whereby adjustment members are dynamically added along with their corresponding members, while other members like <Parent Currency> and [Proportion]/[Elimination] are populated by executing translation or consolidation logic.

Value - Is it Really a Dimension?
Finally, I should mention another important peculiarity of HFM dimensionality. Value is not a “real” dimension. It behaves like a normal dimension within a single entity. However, when we roll up to parent entities, it stops behaving like any other dimension. For example, consider a custom dimension member; let's call it Road bikes, part of the Total Products hierarchy in Custom1.


When you input a number in the Road bikes member of the Entity NY, you expect that this number will be part of the total found in the Road bikes member of the parent entity US.

Now consider the Value dimension, part of it is shown below:


When you post an adjustment to the <Entity Curr Adjs> member of the Entity NY, it will not be part of the <Entity Curr Adjs> member of the parent entity US. Instead, the adjustment will be part of the total in the <Entity Currency> member of the parent entity. What you should keep in mind from the above discussion is that the Value dimension behaves as a dimension only within a single entity.

You can find more detail on how the Value dimension works in a fine article here:

http://thanos-onetruth.blogspot.gr/2013/11/value-dimension.html

A Final Word

As mentioned above, the present article refers to versions of HFM earlier than 11.1.2.3. In 11.1.2.3 there are a few changes related to our discussion worth mentioning:
1. A new metadata item has been added, corresponding to cell text.
2. The core dimensions have been reduced to 10, leaving Custom3 and Custom4 out of the core. It would not make sense to leave Custom1 and Custom2 out as well, since they hold the currency members which hold the exchange rates.

Acknowledgements
Special thanks to Thanos Antzoulatos for reviewing and commenting on the above content.

Thursday, October 31, 2013

Best Practices in Account Details Breakdown

Introduction
This topic first appeared as my answer to a question posted in Oracle's OTN Financial Consolidation forum. Since then I have made several improvements to clarify the presented information. The description of the design issue goes like this:

Suppose we have a Sales account which needs to be detailed by the TotalProducts hierarchy in a custom dimension, where TotalProducts is the sum of its children Product1, Product2, Product3 etc. Usually we get the total sales figure as part of the company's General Ledger trial balance. However, in order to have the sales figure both participate in the company's income statement and provide the desired product analysis, we need to wait until the individual product sales figures become available. This prevents us from previewing calculated and consolidated early drafts of our financial statements.

The concept of the Data Stream
The pattern described above is all too common in consolidation applications. Different pieces of data arrive at different points in time, each providing different information or different aspects of the same information. This is roughly what we call data streams. We may identify two or more streams of data in a more formal fashion by analyzing the following characteristics:
  1. We get them from different sources (systems, people, departments etc.)
  2. They become available at different points in time
  3. They are of different levels of detail or use different aspects of analysis
  4. They come in at different frequencies (monthly, quarterly, annually)
  5. They may address different audiences (internal management, external authorities, auditors etc.)
It is imperative that we hold data from the different streams at different locations. We should never overwrite or delete information, even if it is redundant, as in our example Sales account. All information used must be present and auditable at any time. And more often than not, redundant information among different streams must reconcile.

Let's return to our setting. We receive a sales figure from the trial balance stream and several sales figures broken down by product in a subsequent stream. Keeping in mind the above imperative, we can choose from the following solutions, all commonly applied in typical HFM applications:

1st solution: use supplementary accounts
With this method we don't keep a single account for both sales figures. We use the account Sales as the target of the trial balance and the account SupplSales as the target of the product stream. The Custom1TopMember attribute of SupplSales is set to TotalProducts to provide the product analysis. Here is what the product hierarchy in the Custom1 dimension looks like (indented members are children):
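In outline, based on the member names used in this article, the hierarchy is:

```
TotalProducts
    Product1
    Product2
    Product3
```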


In addition to the two accounts Sales and SupplSales, we shall also need a validation account, for instance VSales, to hold the difference between Sales and SupplSales. To calculate VSales we have to write a validation rule like this:
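A minimal sketch of such a rule, assuming SupplSales carries its product detail under the Custom1 member TotalProducts as described above (the exact form depends on your rule file):

```vbscript
Sub Calculate()
    ' VSales = Sales minus the product-level total of SupplSales.
    ' It equals Sales until the product stream is loaded, then zeroes out.
    HS.Exp "A#VSales = A#Sales - A#SupplSales.C1#TotalProducts"
End Sub
```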


While VSales equals Sales when only the trial balance stream has been submitted, as soon as the product stream is also submitted, VSales will zero out. We can also connect VSales to the application's validation accounts to control the process flow.

2nd solution: submit the product stream as a journal entry
In this case we need a single sales account and use a slightly modified product hierarchy in Custom1:
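Based on the member names used in this solution, the modified hierarchy adds a TB_AllProducts child under TotalProducts:

```
TotalProducts
    TB_AllProducts
    Product1
    Product2
    Product3
```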


When loading the trial balance, the total sales figure will be targeted to the TB_AllProducts member. From there the sales figure will roll up to TotalProducts, and the total sales figure will enable us to preview the financial statements before the submissions of all the different streams become available.

At a later time, when the product stream comes in, we shall load it as a balanced journal entry. This journal entry will credit the product sales figures to Product1, Product2 and Product3 and debit the total to TB_AllProducts, effectively countering the previously loaded amount without overwriting it. The result will look like this:


Note that when we select <Entity Curr Total>, we get both the trial balance figure at the summary level and the individual product sales at the detail level. Also note that if the individual product sales submitted do not add up to the summary loaded with the trial balance, a balancing figure will appear in the cell identified by the combination A#Sales.C1#TB_AllProducts.V#<Entity Curr Total>, instead of zero. This data cell can be considered the validation, even though we have provided neither a validation account nor a validation calculation rule. Once more, we can connect the above cell to the application's validation accounts to control the process flow.

3rd solution: use an automatically calculated balancing entry
This is a more advanced technique, whereby you keep a single Sales account and support it with product analysis in a custom dimension as before, with the following changes:


Note that in this case we must also set the AggrWeight attribute of TB_AllProducts to zero, to ensure that there is no double counting when both the trial balance and the product streams are loaded. In order for this to work we also need to write the following calculation rule:
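As a sketch, assuming a hypothetical balancing member named Balancing under TotalProducts (the actual member name is not given here, and the exact form depends on your rule file), the rule could look like:

```vbscript
Sub Calculate()
    ' Hypothetical member names: Balancing fills the gap between the
    ' trial balance total (TB_AllProducts, AggrWeight = 0) and the
    ' product detail loaded so far.
    HS.Exp "A#Sales.C1#Balancing = A#Sales.C1#TB_AllProducts" & _
           " - A#Sales.C1#Product1 - A#Sales.C1#Product2 - A#Sales.C1#Product3"
End Sub
```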


As soon as we load the trial balance stream and execute the calculations, the picture will be like this:


Note that the whole trial balance amount is captured by the calculation and rolls all the way up to the total, while the product sales cells are still blank.

Subsequently, the product stream comes in, we calculate again, and the above picture changes as follows:


Note that in the above example the balancing entry is zero, while the TotalProducts value has not changed since the initial load of the trial balance.

Now, interesting things happen when the product values do not add up to the trial balance total. In that case, the balancing figure will fill the gap between the two totals. We can view the balancing member as the validation between the two streams. As with the previous solutions, we can connect the balancing member to the application's validation accounts to control the process.

Conclusion
We presented three techniques that we can use to reconcile redundant information coming in through different streams. There is no single answer to the question of which is best. One has to examine each individual account and the related data streams that correspond to it, and decide what fits best. I have seen cases where all three patterns were applied in the same application for different accounts and/or streams. You could even come up with a hybrid solution; that is all fine.