
Friday, July 31, 2020

EPM Integration Agent: Configuration tips

EPM Integration Agent was introduced sometime last year, providing the ability to connect to on-premises data sources. Though the tool can connect to third-party REST APIs via Groovy or Jython scripting, I assume the majority of use cases will be connecting to relational databases, usually the warehouses that have been feeding data to various reporting and planning applications in the on-premises world. Unless clients move to a cloud-based ERP, those data warehouses are going to stay for a while, and so is the need to pull data from them.

Oracle provides connectivity for Oracle and MSSQL Server by default, but what about other data sources? Even without scripting, if a data source offers SQL capability, we can essentially pull data from it. I recently connected to a Netezza database via the Integration Agent and thought of sharing some quick tips I learnt during the exercise.

  • Integration Agent Port

The agent requires a port to communicate, and by default it is 9090. Many organisations, especially those dealing in financial services, have their ports blocked. Moreover, 9090 is commonly used by many applications as their default port and might already be in use. Use the telnet command to find out whether the port is already in use; if it is not available, use a different port.
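If you prefer a scripted check over telnet, a minimal Groovy sketch along these lines (host and port below are placeholders) can tell you whether something is already listening on the port:

// Quick check whether a port is already in use on the agent machine
def host = 'localhost'   // placeholder - use the agent host
def port = 9090          // placeholder - use the port you plan to assign
try {
    new Socket(host, port).withCloseable {
        println "Something is already listening on $host:$port - pick a different port"
    }
} catch (IOException e) {
    println "$host:$port appears to be free"
}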

  • Start the agent manually as a process before registering it as a service

The agent now offers the ability to register it as a Windows service. When attempting this, the service gets installed even if the agent could not register itself with the cloud application, so make sure you check the agent logs. I prefer to start the agent manually as a process first, rather than as a service, to confirm the INI file was populated correctly and the agent was able to communicate with the cloud application.

  • Check Proxy Settings

Many organisations have proxy servers set up for external communications. The agent is essentially a Java-based utility and will not be able to communicate via the proxy unless the proxy details are provided in the INI file. With EPM Automate we can provide the proxy details as part of the login command, but with the Integration Agent they must be supplied in the INI file. Many of us tend not to read the documentation for each and every parameter, and that is exactly what I did. The INI file provides the CUSTOM_JAVA_OPTIONS parameter for the proxy details to be used when the agent communicates with the cloud, in the format -Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080
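For illustration, the entry in the agent INI file would look something like the below (host and port are placeholders for your own proxy details):

CUSTOM_JAVA_OPTIONS=-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080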

  • Installation paths for Groovy and Jython

When installing Jython or Groovy on the machine hosting the agent, ensure the installation path does not contain spaces. The default path for Groovy is usually under the Program Files (x86) folder, which has spaces in it; the agent does not seem to like this and fails to register.

  • JDBC drivers for source technology

Some source technologies require JDBC drivers to be explicitly installed on the agent machine. This is in addition to the jar file copied into the agent's lib folder. A technology like Netezza requires its JDBC driver to be installed on the machine in order to connect to the server. Once the JDBC driver is installed, copy the jar file and place it in the lib folder.
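As an illustration only (exact values vary by driver version, so check the vendor documentation), the connection details for a Netezza source typically look something like:

JDBC Driver: org.netezza.Driver
JDBC URL:    jdbc:netezza://<host>:5480/<database>

with the driver jar (commonly nzjdbc.jar) placed in the agent's lib folder.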

Cheers !!!

Friday, July 3, 2020

Groovy Series 7: Submit Groovy generated rules to Essbase

Ever since Groovy functionality was introduced with Planning Cloud, it has evolved over time, offering new capabilities and methods to achieve what was not possible with traditional Calc script rules. I am sometimes asked, "Is Groovy a replacement for Essbase Calc scripts?", and my answer is a big 'NO'. Groovy should not be perceived as a replacement for Calc scripts; rather, consider it a booster that enhances the user experience, provides the flexibility to dynamically generate Calc scripts and improves performance.

Calc scripts are an integral part of how we run calculations on Essbase, whether we generate the code dynamically using Groovy or stick to traditional scripting. Let's look at the different ways to submit these calcs to Essbase and how Groovy has evolved.

  • Return method

This was the oldest and the default method until May 2019, when the Job Factory class was introduced. Justifying its name, this method is used to submit a dynamically generated Calc script to Essbase for execution. A simple example would be to declare a StringBuilder(), capture the Position dimension members from the edited cells, build a script with some logic and submit it to Essbase.

image
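A minimal sketch of this pattern (the dimension and member names here are hypothetical, not the exact script shown above) could look like the following:

// Collect the edited Position members from the grid
def editedPositions = [] as Set
operation.grid.dataCellIterator({ DataCell cell -> cell.edited }).each { cell ->
    editedPositions << cell.getMemberName("Position")
}

// Build a calc script restricted to the edited positions and return it;
// returning the string submits the generated script to Essbase for execution
StringBuilder calcScript = new StringBuilder()
calcScript.append("""
FIX(${cscParams(editedPositions)})
   CALC DIM("Account");
ENDFIX
""")
return calcScript.toString()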

The limitation with this method is that, once called, you cannot perform any further action in the script. If I have a scenario where I need to perform a calculation on edited cells, aggregate data and then perform another action, I would have to break it down into at least two rules: one where I submit the Groovy-generated script, and a second one to perform the aggregation.

  • Job Factory

This class was introduced in the May 2019 update and allows an existing rule, ruleset or template to be submitted as a job. Calculations written in traditional Calc Script can therefore be executed as part of a Groovy rule; all we need to do is create a job for the calculation and submit it. It also allows passing values to runtime prompts (RTPs). This means we can have a Groovy rule that performs certain tasks before submitting a dynamically generated script to Essbase.

image

Considering the same example, I first execute an existing rule as a job, passing the cost centre as the RTP value, and then generate the script using the edited positions and submit it to Essbase. In this example I haven't included the code to check the status of the submitted job, but ideally you would only move ahead once the status is complete.
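As a rough sketch of the idea only (the rule name and RTP below are made up, and the exact jobFactory overloads and job status handling should be verified against the EPM Groovy API documentation for your release):

// Hypothetical rule and RTP names; confirm the job() signature in the EPM Groovy API docs
def jobParams = ["RTP_CostCentre": "CC_1000"]

// Submit the existing calc-script rule as a job, passing the RTP value
Job calcJob = operation.application.jobFactory.job("Aggregate Positions", "Rules", jobParams)

// Ideally, poll the job status here and continue only once it completes,
// then build and submit the dynamically generated script as shown earlier
println "Submitted job, current status: ${calcJob.status}"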

  • executeCalcScript method

This one is the latest addition, allowing a much more flexible way to submit a calculation to Essbase and continue performing further tasks without breaking them into multiple rules. The method belongs to the Cube class, allowing us to execute the dynamically generated code on that specific cube.

A simple use case would be:

  1. Capture the edited cost centre
  2. Perform a calculation on the edited members in Cube1
  3. Push data from Cube1 to Cube2 using a smart push
  4. Perform another calculation on Cube2 for the same edited cost centre

All of this is performed in a single Groovy rule, as sketched below.

image
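Putting the four steps together, a sketch of such a rule (the cube, dimension, member and smart push names below are all hypothetical) could look like this:

// 1. Capture the edited cost centres from the form
def editedCCs = [] as Set
operation.grid.dataCellIterator({ DataCell cell -> cell.edited }).each { cell ->
    editedCCs << cell.getMemberName("CostCentre")
}

Cube cube1 = operation.application.getCube("Cube1")
Cube cube2 = operation.application.getCube("Cube2")

// 2. Run a calculation on the edited cost centres in Cube1
cube1.executeCalcScript("""
FIX(${cscParams(editedCCs)})
   CALC DIM("Account");
ENDFIX
""")

// 3. Push data from Cube1 to Cube2 using a smart push defined on the form
if (operation.grid.hasSmartPush("CC_Push")) {
    operation.grid.getSmartPush("CC_Push").execute()
}

// 4. Run another calculation on the same cost centres in Cube2
cube2.executeCalcScript("""
FIX(${cscParams(editedCCs)})
   "Allocated Cost" = "Base Cost" * "Driver Rate";
ENDFIX
""")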

All these methods offer the flexibility to design a dynamic and efficient solution, with full control over the members to be acted upon, and to achieve the desired performance.

Cheers !!!

Tuesday, June 30, 2020

Groovy Series 6: Beauty of cscParams

While working with Groovy scripts, a common task is to dynamically generate an Essbase calc script capturing the edited members from the grid, which is then submitted to Essbase for execution. The important part is to ensure that these edited members, when passed to the calc script, are written in the required Essbase format, enclosed in quotes.

In the past we have used various Groovy techniques to identify unique members and convert them to Essbase format by concatenating quotes, commas and spaces. What if all of this could be done in one step by enclosing the entire collection in a method that spits out the strings, enclosed in quotes and separated by commas? Yes, that's what cscParams does for us.

image

Let's look at the above script example. In Step 1, I declare an empty list and push the edited accounts from the COA dimension into it. Steps 2 and 3 define two ways of converting this list into a string, which I can use either to dynamically generate a calc script or to pass as an override member list while executing a smart push. Step 4 prints the strings generated by both.

image

Because the grid I edited had both base periods and YearTotal on it, when members are edited at month level YearTotal is also marked as edited, and the edited members get repeated.

In the first print statement, I have to explicitly filter for unique members, and converting them to a string does not enclose them in quotes and leaves square brackets around the members. This is the literal conversion of a list into a string.

The second print statement uses the cscParams function: just by passing it the list of edited members, it filters the unique members, encloses them in quotes and separates them with commas. This is exactly what is required when the result is passed to a dynamically generated calc script or used as the override member list in a smart push.

The third print statement prints the str string built the old way, by concatenating the list members without filtering for unique members.

It's quite obvious how cscParams saves the effort of filtering and concatenating the list, all in a single step. The result can be captured in another string object, as shown in the example above, to avoid rewriting the method call multiple times.
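For reference, a minimal sketch of the pattern described above (assuming a form-based rule and the COA dimension) might look like this:

// Step 1: collect the edited accounts from the COA dimension
def editedAccounts = []
operation.grid.dataCellIterator({ DataCell cell -> cell.edited }).each { cell ->
    editedAccounts << cell.getMemberName("COA")
}

// Step 2: the "old" way - de-duplicate and concatenate quotes and commas by hand
// (unique(false) returns a new list and leaves the original untouched)
String strManual = editedAccounts.unique(false).collect { '"' + it + '"' }.join(', ')

// Step 3: cscParams does the de-duplication, quoting and comma separation in one call
String strCsc = cscParams(editedAccounts)

// Step 4: compare the two outputs in the job console
println "Manual concatenation: " + strManual
println "cscParams output:     " + strCsc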

Cheers !!!

Sunday, June 14, 2020

Groovy Series 5: Interacting with external web services

Though this feature came in last year and I have been a bit late in writing about it, I thought it would still be worth making a record of it to help me (and others, if people are reading) make use of it. With this update to EPBCS (though it is no longer called PBCS or EPBCS, let's stick with the name to signify where Groovy is available), Oracle made its Groovy features more powerful by adding the ability to interact with external web services. What this means is that we can now make HTTP GET or POST requests to external web services. If you would like to know more about these methods, you can check W3Schools. In the Groovy API, we have get(), post(), delete() and put() methods, which cover the majority of tasks.

A very basic example would be the ability to update FX rates in our application in an automated way. This is usually one of the forecast preparation tasks for finance users, who need to ensure FX rates are updated and might be crunching these rates manually to arrive at an average or current value. I will be using https://exchangeratesapi.io/ to fetch the FX rates, but any other service that returns FX rates or data via a web service can be used.

A call to an external web service can be made directly in the business rule or by adding the service as a named connection within EPM cloud connections. A named connection can be added from ‘Connections’ –> ‘Other Web Service Provider’. In this case, I have added the exchange rates API URL and named it FXRates. Since this service does not require a username and password, I left those fields blank.

image

image
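Just to illustrate how the named connection is referenced from a rule (the full script is shown at the end of this post):

// Call the FXRates named connection and print the raw JSON response
if (operation.application.hasConnection("FXRates")) {
    HttpResponse<String> response = operation.application.getConnection("FXRates")
            .get("/latest?base=AUD&symbols=USD,EUR")
            .asString()
    println response.body
}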

Let's look at the usage of the exchange rates API and how requests can be made to receive JSON responses. I prefer to use the Postman app to manually check REST requests, but any REST client can be used.

  1. Sending a request to fetch FX rates for a particular date: https://api.exchangeratesapi.io/2019-06-30

    image
  2. Restricting the rates to specific currency codes and a desired base currency: https://api.exchangeratesapi.io/latest?base=AUD&symbols=USD,EUR

    image
  3. Defining the start and end dates and combining them with the base currency and desired output currencies: https://api.exchangeratesapi.io/history?start_at=2020-05-01&end_at=2020-05-31&base=AUD&symbols=EUR,USD

    image

For my application, I need to pull both average and ending rates, with AUD as the base currency and conversion rates for USD and EUR. The FX rates for May are empty and should be populated with the average and ending rates.

image

On executing the business rule, it asks for a date. This could be any date within the month for which rates should be populated; say we choose May 31, 2020.

image

On completion, the rates get populated for the month.

image

Let's dig into the behind-the-scenes mechanics to achieve this.

  • Section 1 shows the declaration of the RTP used in the business rule; it is the date prompt that appears on execution of the rule. The RTP is defined in Calc Manager as a date type
  • Section 2 imports the classes required to work with JSON and the calculations
  • Section 3 breaks the year, month and day into separate variables

image

  • Section 4 declares a Calendar object. This will be used to generate a date and determine the month and year for the EPM Period and Years members
  • Section 5 determines the financial year based on the month. Since the Australian financial year runs from July to June, if the month is greater than 6 the year should increase by one
  • Section 6 generates the month and day values from the calendar object. We also declare a list ‘Months’ with all 12 month names, which will be used later to derive the month member. It might be possible to return a month name directly using Groovy, but I found this to be the easiest way

image

  • Section 7 captures the first and last days of the month containing the selected date
  • Section 8 checks whether the application has a connection named FXRates and calls the REST service. The lastDayDate variable holds the date in YYYY-MM-DD format, which is combined with the base currency and exchange rate currencies in the request
  • Section 9 captures the response as a Map object. The data captured in the response are the ending rates for the date, with AUD as the base currency and EUR and USD as the reporting currencies. These are printed to the job console

image

image

  • Section 10 makes a second call to the exchange rates API to get the rates from day 1 to the last day of the month, with AUD as the base currency and USD and EUR as the exchange rate currencies
  • Section 11 captures the response as a Map object. The JSON response returns the rates for all the days
  • Section 12 adds up all the rates captured in the response. This will be used to determine the average by dividing the total by the number of days tracked by the counter
  • Section 13 prints the values captured from the response to the job console. The month and financial year derived from the calendar object determine the members of the Period and Years dimensions

image

image

  • Section 14 uses DataGridBuilder to build a write grid by passing the dimension members and data. The average and closing rates are written to the application
  • Section 15 checks the number of accepted and rejected cells as a result of saving data to the cube

image

image

Below is the full code for the logic explained above, ready to copy and paste.

/*RTPS:{Groovy_Cal_Date}*/

import org.json.JSONObject
import groovy.json.JsonSlurper
import java.math.BigDecimal

def yearPart = rtps.Groovy_Cal_Date.toString().substring(0,4)
def monthPart = rtps.Groovy_Cal_Date.toString().substring(4,6)
def dayPart = rtps.Groovy_Cal_Date.toString().substring(6,8)

//println yearPart
//println monthPart
//println dayPart

Calendar calObj = Calendar.instance
//println Math.round(calDate/10000,0).toString()
calObj.set(yearPart.toInteger(), monthPart.toInteger()-1, dayPart.toInteger() )
calObj.set(Calendar.DAY_OF_MONTH, calObj.getActualMaximum(Calendar.DAY_OF_MONTH))
//print calObj

//print "Month:"+ calObj.get(Calendar.MONTH)
int varYear
if(calObj.get(Calendar.MONTH)+1>6)
     varYear = calObj.get(Calendar.YEAR) + 1
else
     varYear = calObj.get(Calendar.YEAR)

int varMonth = calObj.get(Calendar.MONTH) + 1 /*Month in Calendar object starts from 0*/
int varDay = calObj.get(Calendar.DAY_OF_MONTH)

def Months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec']

def endRateRespObject
def lastDayDate = calObj.get(Calendar.YEAR).toString() + '-' + varMonth.toString().padLeft(2,'0') + '-' + calObj.get(Calendar.DAY_OF_MONTH)
def firstDayDate = calObj.get(Calendar.YEAR).toString() + '-' + varMonth.toString().padLeft(2,'0') + '-01'

if(operation.application.hasConnection("FXRates")){
    HttpResponse<String> response =  operation.application.getConnection("FXRates").get("/$lastDayDate?base=AUD&symbols=USD,EUR").asString() 
    //print response.body
    JsonSlurper slurper = new JsonSlurper()
    endRateRespObject = slurper.parseText(response.body.toString()) as Map<String,Map>
    //def ratesObject = slurper.parseText(respObject.rates.toString()) as Map
    //println endRateRespObject.rates.size()
    println "AUD to EUR Ending:" + endRateRespObject.rates.EUR
    println "AUD to USD Ending:" + endRateRespObject.rates.USD
}
double totalEUR, totalUSD
int counter
if(operation.application.hasConnection("FXRates")){
    HttpResponse<String> response2 =  operation.application.getConnection("FXRates").get("/history?start_at=$firstDayDate&end_at=$lastDayDate&base=AUD&symbols=USD,EUR").asString()
   // print response2.body
    def avgRateRespObject = new JsonSlurper().parseText(response2.body) as Map<String,Map>
    avgRateRespObject.rates.size()
    if(avgRateRespObject.rates.size()>0){
            def rateEntries = avgRateRespObject.rates.entrySet()
           
            rateEntries.each { rateEntry->
                def entryValue = rateEntry.value as Map<String,Double>
             totalEUR= totalEUR + entryValue.EUR
             totalUSD= totalUSD + entryValue.USD
             counter=counter+1
         }    
    }else{
            counter=1
    }
}
println "totalEUR:"+ totalEUR
println "totalUSD:"+ totalUSD
println "counter:" + counter
println "AUD to EUR Average:" + totalEUR/counter
println "AUD to USD Average:" + totalUSD/counter
println "Loading data to "+ Months[varMonth-1] + " and " + 'FY'+varYear.toString().substring(2,4)

//Write Grid to push FX rates
DataGridBuilder fxWriteGrid = operation.application.getCube("FINRPT").dataGridBuilder("DD/MM/YYYY")
fxWriteGrid.addPov('E_NA', 'Month', 'Units', 'COA_NA', 'Working', 'Act_DE', Months[varMonth-1], 'FY'+varYear.toString().substring(2,4) )
fxWriteGrid.addColumn('Local_AUD_Average','Local_AUD_Closing')
fxWriteGrid.addRow(['EUR'],[totalEUR/counter,endRateRespObject.rates.EUR])
fxWriteGrid.addRow(['USD'],[totalUSD/counter,endRateRespObject.rates.USD])

DataGridBuilder.Status writeGridStatus = new DataGridBuilder.Status()
fxWriteGrid.build(writeGridStatus).withCloseable { writeGrid ->
     println("Total number of cells accepted:" + writeGridStatus.numAcceptedCells)
     println("Total number of cells rejected:"+ writeGridStatus.numRejectedCells)
     operation.application.getCube("FINRPT").saveGrid(writeGrid)
}

Cheers !!!

Sunday, September 1, 2019

Power BI + Oracle PBCS - Part 3: Hierarchical reports

After I wrote the other two posts and shared them with a few of my colleagues, a lot of questions poured in asking ‘How about this..?’ or ‘Can we do this…?’, and one question asked by our very own Hyperion Jedi (Ahmed Hafez) left me intrigued and looking for answers: ‘Can we have hierarchy-driven reports in Power BI when connecting to PBCS?’ This kept me busy for a few hours, with an outcome whose snapshots look something like the below. The first chart shows the top-level entity roll-up ‘All Entities’, providing an option to drill down to BUs. The same action can be performed on the BUs to drill down to entities.
image
image

Let's look at the steps to achieve the required hierarchical structure. I will not repeat the steps to connect Power BI to PBCS and parse the data; instead I suggest going through Part 1 and Part 2 of this series. In the past posts, we called the PBCS export data slice web service to pull data and parse it. In addition to pulling the data slice, we also need the metadata hierarchy and its parent-child relationships to define the roll-ups. This can be achieved using the ‘Get Member’ endpoint of the REST API.

URL: https://<service-name>/HyperionPlanning/rest/{api_version}/applications/{application}/dimensions/{dimname}/members/{member}

The api_version is V3, dimname is the dimension from which member information needs to be extracted, and member is the member name. Interestingly, this service returns the entire dimension structure if we provide the top member of the dimension as the member name. The response is a nested JSON structure that keeps going one level deeper wherever a member has children.
It starts from the first step on the left, and applying all the steps shown in the second snapshot gives the output as a flattened hierarchy, or in Essbase terms, a generation-format extract. The query was saved with the name ‘Entity Metadata’.
image
image
image

In Power BI, we can convert the columns of a table into a hierarchy by right-clicking one of the columns and choosing the ‘New Hierarchy’ option. This leads to the creation of the hierarchy ‘Entity_Hierarchy’, starting from the Entity dimension name, then the roll-up member All Entities, followed by the BUs and finally the leaf entities.
image
image

As discussed in the earlier posts, we now need to pull our entity data. In this case, we pull EBIT data by entity and year to give us the below data table. We pull only the level-zero entities, as the hierarchy aggregation will be handled by Power BI internally.
Note: I am not a Power BI expert, so I won't claim the described method is the best; there could be more efficient ways to achieve the same.
image

We can now set up a relationship between the two extracts – the entity metadata extract and the EBIT data extract by entity and year. This can be done by merging the two data queries to create a new table. On editing an existing query, we have the option of merging queries as new. We merge the entity metadata query with the EBIT data query to form the new table.
image
image


After merging, the data and year columns from the merged query, along with the hierarchy, are dropped onto the visual to get the desired results.
image

The bar chart in this example becomes a hierarchy-driven chart, giving the option to drill down or drill up and view the results. This can also be managed through the drill-down and drill-up buttons on the chart.
image

Ta-da !!!

It's important to pay attention to the volume of data handled with such integrations. In my opinion, we should pull only the data we need to show on the chart rather than pulling the entire cube. The pull can be broken down into multiple data sources and used together on the chart by establishing relationships between them; pulling large volumes might hurt performance. Moreover, I haven't experimented with ragged hierarchies yet and need to see how they can be handled.
Cheers !!!

Power BI + Oracle PBCS - Part 2: Connect and Visualise

This post is a continuation of Part 1 on the same topic, in which I talked about how PBCS REST APIs can be used to pull data as a JSON response. We used the Postman app to view our results. In this post, we will see how we can make an HTTP POST request and parse the JSON response within Power BI.

Note: I am not a Power BI expert, and the method shared here is just one way of doing this. There might be more efficient ways to achieve the same, where Power BI experts can help.

Power BI offers a lot of built-in connectors and methods to pull data from various sources, and they serve the purpose perfectly. But in certain cases the prebuilt connectors don't fit, and that's when “Blank Query” brings out its magic, helping to connect to anything that can be defined in query terms. Because Power BI's Web connector makes an HTTP GET request by default, we cannot use it to execute our data slice REST API, which is a POST request. So we use a blank query to make Power BI issue an HTTP POST request.
image
image

In the blank query connector, we have the “Advanced Editor” where we can define the query to pull data from PBCS. There is a specific way to define the query and provide the required parameters. A blank editor comes up, as shown in the snapshot below, and we need to define what goes in each section.
image

Taking the same cash flow example, we define the authorization key and the request body that was passed to the POST request. This is where I find the Postman app quite handy: it can generate the Basic authorization key (a Base64 encoding of the username and password) used when making the request, and this key can be pasted into the query instead of typing the credentials directly.
After you receive a successful response in the Postman app, click on the Code option under the Send button. This gives you the option to generate the request code in various languages. Under each option you can find the header content that was sent with the HTTP POST request. In the below example, it shows the payload that was used and the authorization key, indicating that Basic authentication was used.
image
image

The query has two parts: first, the authorization key and content (the request body JSON we submit with the HTTP POST request), and second, the submission of the request using the Web.Contents method. Since the JSON we submit with the HTTP POST request has all member and dimension names wrapped in quotes, in order to pass those quotes within the query text we need to escape them by doubling the quote characters around the member and dimension names.
image

On hitting Done, it returns the JSON response, and I like how easy it is to parse the response into a tabular structure, which I explain in the following steps.
  1. It returns three groups in the response – POV, Columns and Rows. Since the POV and column members were fixed, we will be expanding Rows
  2. Power BI uses Power Query to do this parsing. In case you have not noticed, Power Query is also available in Excel to pull data from different sources, and the same actions can be performed there too. Power Query uses the M language to perform all these steps. The nice thing is that you can view all the steps in the right pane, go back to any step when required and start over. Step 1 above produces list objects with all the records
  3. Convert this into a table using the option in the top-left corner
  4. The resulting table has one column, which needs to be expanded using the button at the top of the column
  5. The expanded columns are further expanded one by one; from the options choose “Extract Values”
  6. This pops up a window to choose a delimiter to separate members from multiple dimensions. For the header (first) column, choose a delimiter; in this case I chose | (pipe), but any of the available options or a custom delimiter can be used. Do not choose any delimiter for the data (second) column
  7. If we look back at the request body, we used the Scenario and Year dimensions on rows, and the response returned members of both dimensions in the rows. Having used the delimiter, we can easily split the members in the following steps
  8. Right-click on the header (first) column and choose Split Column –> By Delimiter, then choose | (pipe) to split the contents into two separate columns. Similarly, the data column is text and should be converted to values to be used on charts: right-click on the data (second) column and choose Change Type –> Decimal Number to convert it to numbers. We can also rename the columns to help identify them when used on charts
  9. All the steps are captured in the editor in the right pane, and we can give this query a relevant name so it can be used as a source. In this case I renamed it from Query2 to Cash Flow Data
  10. Choose the option “Close and apply” to save it as a data source
  11. At this point, it may prompt for validation of credentials; select the edit credentials option and then choose Anonymous in the subsequent window to connect
  12. Similarly, other queries can be created to pull the required data from PBCS and used in different charts to give a pretty dashboard

Since the data is being pulled using web services, every time the Power BI dashboard is refreshed the query sends a request to PBCS to pull real-time data. Moreover, data from various other sources can be viewed together in a single dashboard. This provides the ability to bring Actual and Budget data from all possible sources into one place, making decision-making easier.
Cheers!!!