INTER-POD REVOLUTIONS
Connected Enterprise Solutions in Oracle EPM Cloud
Vatsal Gaonkar
February 2021
AGENDA
Introduction/
Proposition
Terminology Use Case Configurations:
Revolutions
Q&A
INTRODUCTION/ PROPOSITION
Presenter, Implementation Partner & Problem Statement
VATSAL GAONKAR
> Summary
> Product Manager – Enterprise Planning; Oracle ACE
> Over 15 years of Enterprise Performance Management (EPM) experience across on-
premises, cloud, and hybrid deployments
> Design, Development, and Deployment roles across approx. 60 projects over the years
> EPM Cloud Credentials – EPBCS Certified; Design and deployments for Enterprise Planning
and Profitability apps; EPM Cloud automation; EPM Cloud Data Integration
> Speaker at Oracle / Hyperion conferences: HUGMN, DCOUAG, OAUG, ODTUG and
OpenWorld
> Contact:
> Email: vatsal.gaonkar@alithya.com
> twitter: @dieselvat
ALITHYA IS YOUR TRUSTED ADVISOR
2400
employees
3000+
projects
375+ ERP/EPM
Cloud projects
23+
years
NASDAQ
ALYA
225+ Oracle
consultants
7 Oracle
ACEs
1000+
clients
Enterprise Applications Application Services Business Strategy Data & Analytics
Enterprise Resource Planning (ERP/SCM)
Human Capital Management (HCM)
Enterprise Performance Management (EPM)
Customer Relationship Management (CRM/CXM)
Cloud Infrastructure Migration
Cloud Enterprise Services
Digital Applications Development
Legacy Systems Modernization
Quality Assurance
Strategic Consulting
Digital Transformation
Enterprise Architecture
Organizational Performance
Business Intelligence
Data Governance
Modern Data Architecture
Artificial Intelligence
Machine Learning
TRANSFORMATION = BUILDING A CONNECTED ENTERPRISE
CONNECTED BUSINESS PROCESSES
TRADITIONAL
CHALLENGES
Reduce Costs
Seamless Collaboration
Informed Strategic
Decisions
Improved Operational
Performance
Happier
Customers
Improved Margins &
Revenues
One Source of
the Truth
TRANSFORM AND
CONNECT
Operations
Marketing
Sales
Finance
HR IT
Connect all
planning
Close with
confidence
Reinvent
analytics
and
reporting
Outperform with
intelligence
Align
enterprise
data
Hidden Costs &
Overhead from Manual
Processes
Inability to Flex Work
Lack of Transparency
for Decision Making
Time Wasted on Non-
Value Add Activities
Less Time to Focus on
what’s most important
Lots of Data; Little
Information
Siloed Data &
End User ETL
EPM CLOUD MULTI-
APPLICATION
ARCHITECTURES
need ready and seamless integration of
business data and processes
POLL 1
Why and how would you need integration between multiple EPM
Cloud applications?
TERMINOLOGY
EPM Cloud Artifacts for configurations today
EPM CLOUD ARTIFACTS
> EPM Cloud Planning :: Planning
> Calculation Manager Rules :: Calculation Scripts :: BR
> Data Management :: Data Integrations :: DM
> Data Sources/ Targets :: Databases, BSO/ ASO & Plan types
> Groovy Business Rules :: Groovy
> EPM Connections :: Connection
> EPM Automate :: Automate
> REST API :: REST
> JSON
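The Connection, REST, and JSON artifacts above come together whenever a Groovy rule calls Data Management in the other pod. As a hedged sketch (job and rule names are illustrative placeholders, not fixed product names), the body of such a REST call is a small JSON document:

```json
{
  "jobType": "INTEGRATION",
  "jobName": "TrickleFeed_Rule",
  "periodName": "{Jan-20}{Dec-20}",
  "importMode": "REPLACE",
  "exportMode": "REPLACE_DATA",
  "sourceFilters": { "Entity": "\"Sales_US\"" }
}
```

The Groovy examples later in the deck build exactly this shape with the `json()` helper before posting it over an EPM Connection.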
USE CASE
Move Labor model data to Financials
LABOR TO FINANCIALS CONFIGURATION
WFRPT FINRPT
Labor Financials
Department labor
model changes
Report & Analyze by
Jobs and Employees
Department Financial
planning
Department Income
Statement
Trickle Feed
Focused On-Save
Scheduled Push
Extract and Load
INTER-POD REVOLUTIONS: CONFIGURATIONS
Trickle
Feed
Focused On-
Save
Scheduled
Push
Extract and
Load
✓ Calculation
Manager Rules
✓ Data Management
✓ Groovy Business
Rules
✓ EPM Automate &
Scheduler
✓ Every 5 mins for
small sets of data
✓ Right click push
from the form
✓ Data Management
✓ Groovy Business
Rules
✓ On Demand from the
data form
✓ Data
Management
✓ Groovy Business
Rules
✓ EPM Automate &
Scheduler
✓ Every 15 mins for
all data
✓ Groovy Business
Rules
✓ Import, Clear &
Export data slices
✓ On-save
intersections working
like @XWRITE
POLL 2
Do you use, or are you planning to design, a multi-pod EPM Cloud
implementation?
CONFIGURATIONS: REVOLUTIONS
How each type of configuration is implemented
CONNECTIONS BETWEEN INSTANCES
TRICKLE FEED
Copy updated data and run incremental data push
CONFIGURATION/ DESIGN
Planning/
Groovy:
Data copy
and
setEdited
Groovy:
Filtered
Integration
& setEdited
(false)
Connection/
REST:
Call DM Rule
DM:
Run Inter-
pod Data
sync
DM:
Clear Region
for Edited
members
Planning:
Target data
updated
WFRPT
FINRPT
POLL 3
What is your expertise level with Groovy Scripting?
GROOVY (RUN ON SAVE)
/*RTPS: {Entity} {L_Scenario} {L_Year} */
def varYear = rtps.L_Year.toString()
def varScenario = rtps.L_Scenario.toString()
def curEntity = rtps.Entity.toString()
// Capture Jobs and Employees whose data changed
List JobList = []
List EmployeeList = []
operation.grid.dataCellIterator.each { DataCell cell ->
    if(cell.edited) {
        JobList << cell.getMemberName("Job")
        EmployeeList << cell.getMemberName("Employee")
    }
}
// Generate the calc script to calculate for edited Jobs and Employees
List povmbrs = operation.grid.pov
String curVersion = povmbrs.find { it.dimName == 'Version' }.essbaseMbrName
String FixString = """ "${JobList.join('", "')}", "${EmployeeList.join('", "')}", "$varYear", "$curVersion", "$curEntity" """
String calcScript = """FIX($FixString)
FIX("$varScenario")
%Template(name:="Calc Workforce Expenses",application:="EPBCS",plantype:="OEP_WFP",dtps:=())
ENDFIX
DATACOPY "$varScenario" TO "TrickleFeed";
ENDFIX"""
println("The following calc script was run: \n$calcScript")
Cube wfmaincube = operation.application.getCube("OEP_WFP")
wfmaincube.executeCalcScript(calcScript.toString())
GROOVY (RUN ON SAVE)
Cube wfmaincube = operation.application.getCube("OEP_WFP")
// Define grid
DataGridDefinitionBuilder wfmaincubegridbuilder = wfmaincube.dataGridDefinitionBuilder()
// Suppression in grid definition
wfmaincubegridbuilder.setSuppressMissingBlocks(true)
wfmaincubegridbuilder.setSuppressMissingRows(true)
// Create POV, rows and columns for the grid
wfmaincubegridbuilder.addPov(['Version','Scenario','Property'],[['OEP_Working'],['TrickleFeed'],['OWP_Expense Amount']])
wfmaincubegridbuilder.addColumn(['Years','Period'],[['&OEPPlanYr'],['ILvl0Descendants("YearTotal")']])
wfmaincubegridbuilder.addRow(['Job','Employee','Entity','Account'],[['ILvl0Descendants("OWP_Total Jobs")'],['ILvl0Descendants("OWP_Total Employees")'],[curEntity],['ILvl0Descendants("Salary_Accounts")','ILvl0Descendants("Hour_Stats")']])
// Build the grid using the definition
DataGridDefinition wfmaincubegriddefinition = wfmaincubegridbuilder.build()
// Load the grid
DataGrid wfmaincubeGrid = wfmaincube.loadGrid(wfmaincubegriddefinition, false)
// Iterate through the Entity, Employee and Job intersections and mark them as edited
wfmaincubeGrid.dataCellIterator().each {
    it.setEdited(true)
}
GROOVY (RUN EVERY 5 MINS THROUGH AUTOMATION)
// Collect the entities of edited cells
List entities = []
wfdataloadcubeGrid.dataCellIterator.each { DataCell cell ->
    if(cell.edited) {
        entities << cell.getMemberName("Entity")
    }
}
String entitiesDM = '"' + entities.join('", "') + '"'
// Run data integration with a source-filter override
HttpResponse<String> jsonResponse = operation.application.getConnection("WFRPT_FINRPT_DM").post()
    .header("Content-Type", "application/json")
    .body(json(["jobType" : "INTEGRATION", "jobName" : "TrickleFeed_Rule",
        "periodName" : "{$DataRuleStartPeriod}{$DataRuleEndPeriod}",
        "importMode" : "REPLACE", "exportMode" : "REPLACE_DATA",
        "sourceFilters" : ["Entity" : entitiesDM]])).asString()
// Mark cells unedited in the TrickleFeed scenario
DataGrid wfUneditcubeGrid = wfmaincube.loadGrid(wfUneditcubegriddefinition, false)
wfUneditcubeGrid.dataCellIterator().each {
    it.setEdited(false)
}
CONSIDERATIONS
> Automate: Server or Object Store for Schedule/ Automation
> Automate: EPM Automate commands understanding
> TrickleFeed Scenario member
> EPM Connections for REST API
> For contention issues: Back-up with 15 min job
> DM: Clear Region
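The Automate considerations above assume a jump server (or Object Store) that runs a scheduled script. A minimal Windows batch sketch, assuming EPM Automate is installed locally and the credentials, service URL, and rule name below are placeholders to replace:

```
REM Hypothetical wrapper, scheduled every 5 minutes (e.g., via Task Scheduler)
call epmautomate login %EPM_USER% %EPM_PASSWORD% https://example-epm.oraclecloud.com
call epmautomate runbusinessrule "TrickleFeed_Push" "plantype=OEP_WFP"
call epmautomate logout
```

In practice the wrapper should also capture the exit code of each command so a failed login or rule run surfaces in the scheduler's logs.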
FOCUSED ON-SAVE
Either on-save from a form OR Right click Action Menu
CONFIGURATION/ DESIGN
Planning:
Save on
Form OR
Right Click
Action Menu
Groovy:
Uses data
form context
to call DM
Connection/
REST:
Call DM Rule
DM:
Run Inter-
pod Data
sync
DM:
Clear Region
at Target
Planning:
Target data
updated
WFRPT
FINRPT
GROOVY (RUN ON SAVE OR FROM ACTION MENU)
// Get all entities to push on save or on right-click
/*RTPS: {Entity}*/
Cube wfmaincube = operation.application.getCube("OEP_WFP")
Dimension EntityDim = operation.application.getDimension("Entity", wfmaincube)
def EntityList = EntityDim.getEvaluatedMembers("@Relative(\"${rtps.Entity}\",0)".toString(), wfmaincube)
String entitiesDM = '"' + EntityList.join('", "') + '"'
// Get start and end periods for the data load rule from the scenario
def StartYear = operation.application.getDimension('Scenario').getMember('Budget').toMap()["Start Year"].toString().substring(2)
def EndYear = operation.application.getDimension('Scenario').getMember('Budget').toMap()["End Year"].toString().substring(2)
def StartPeriod = operation.application.getDimension('Scenario').getMember('Budget').toMap()["Start Period"].toString()
def EndPeriod = operation.application.getDimension('Scenario').getMember('Budget').toMap()["End Period"].toString()
String DataRuleStartPeriod = "$StartPeriod-$StartYear"
String DataRuleEndPeriod = "$EndPeriod-$EndYear"
// Use the Data Management REST API connection
HttpResponse<String> jsonResponse = operation.application.getConnection("WFRPT_FINRPT_DM").post()
    .header("Content-Type", "application/json")
    .body(json(["jobType" : "INTEGRATION", "jobName" : "Focused_OnSave",
        "periodName" : "{$DataRuleStartPeriod}{$DataRuleEndPeriod}",
        "importMode" : "REPLACE", "exportMode" : "REPLACE_DATA",
        "sourceFilters" : ["Entity" : entitiesDM]])).asString()
CONSIDERATIONS
> DM: Clear Region
> DM: Multiple invocations of the same rule are not allowed
> EPM Connections for REST API
> Planning: Make it an admin function
SCHEDULED PUSH
Jump Server scheduled push from source to target
CONFIGURATION/ DESIGN
Automate:
runBusiness
rule
Groovy:
Provide
Start and
End Periods
Connection/
REST:
Call DM Rule
DM:
Run Inter-
pod Data
sync
DM:
Clear Region
at Target
(fully
defined)
Planning:
All data
updated
WFRPT
FINRPT
EPM AUTOMATE (RUN ON SCHEDULE)
REM Run Business Rule
call epmautomate runbusinessrule "WFRPT_FINRPT_Interpod" "plantype=OEP_WFP" "RTP_StartPeriod=Jan-20" "RTP_EndPeriod=Dec-20"
REM Run Ruleset
call epmautomate runRuleset "WFRPT_FINRPT_Interpod_Ruleset" "RTP_StartPeriod=Jan-20" "RTP_EndPeriod=Dec-20"
GROOVY (RUN FROM EPM AUTOMATE COMMAND)
// Get start and end periods for the data load rule from run-time prompts
/* RTPS: {RTP_StartPeriod} {RTP_EndPeriod} */
String DataRuleStartPeriod = rtps.RTP_StartPeriod.getEnteredValue()
String DataRuleEndPeriod = rtps.RTP_EndPeriod.getEnteredValue()
// Use the Data Management REST API connection
HttpResponse<String> jsonResponse = operation.application.getConnection("WFRPT_FINRPT_DM").post()
    .header("Content-Type", "application/json")
    .body(json(["jobType" : "DATARULE", "jobName" : "Scheduled_Push",
        "startPeriod" : "$DataRuleStartPeriod", "endPeriod" : "$DataRuleEndPeriod",
        "importMode" : "REPLACE", "exportMode" : "REPLACE_DATA"])).asString()
CONSIDERATIONS
> Automate: Server or Object Store for Schedule/ Automation
> Automate: EPM Automate commands understanding
> EPM Connections
> DM: Fully defined Clear Region or Business Rule
> Planning: Data delay at target, but clean cut of data at all times
JSON EXTRACT AND LOAD
Export, Clear and Import data slices on save
CONFIGURATION/ DESIGN
Planning:
Save on
Form
Groovy/
JSON:
Source data
slice Export
grid
Connection/
REST:
Call Planning
Export data
slice
Groovy/
JSON:
Create
target data
slice clear
Connection/
REST:
Call Planning
to Clear
data slice
Groovy/
JSON:
Import grid
using Export
grid
WFRPT
FINRPT
Connection/
REST:
Call Planning
to Import
data slice
Planning:
Specific
intersection
updated
GROOVY (RUN ON SAVE)
// Export data slice from the source pod
Connection sourceWfR = operation.application.getConnection("WFRPT_Connection")
HttpResponse<String> jsonResponse = sourceWfR.post("/rest/v3/applications/WRKFORCE/plantypes/WFRPT/exportdataslice")
    .header("Content-Type", "application/json")
    .body("""
    {"exportPlanningData": false,
     "gridDefinition": {"suppressMissingBlocks": true, "suppressMissingRows": true, "suppressMissingColumns": false,
      "pov": {
        "dimensions": ["Entity","Scenario","Version","Years"],
        "members": [ ["Sales_US"],["Budget"],["OEP_Working"],["FY20"] ]
      },
      "columns": [{
        "dimensions": ["Period"],
        "members": [[ "ILvl0Descendants(YearTotal)" ]]
      }],
      "rows": [{
        "dimensions": ["Property","Jobs","Employee","Account"],
        "members": [ ["OWP_Expense Amount"],["OWP_Total Jobs"],["OWP_Existing Employees"],["ILvl0Descendants(Salary Accounts)"] ]
      }]}}""").asString()
// Convert response to a Map for building the import payload
def exportresponse = new JsonSlurper().parseText(jsonResponse.body) as Map
GROOVY (RUN ON SAVE)
// exportresponse for the export data slice call
{
  "pov": [ "Sales_US", "Budget", "OEP_Working", "FY20" ],
  "columns": [
    ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
  ],
  "rows": [
    {
      "headers": [
        "OWP_Expense Amount",
        "OWP_Total Jobs",
        "OWP_Existing Employees",
        "Base_Salary"
      ],
      "data": [
        "", "", "10000", "10000", "10000", "10000", "10000", "10000", "10000", "10000", "10000", "10000"
      ]
    }
  ]
}
GROOVY (RUN ON SAVE)
// Transform exportresponse: adjust headers before the import
exportresponse.rows.each {
    // Add the load member
    Collections.addAll(((List)it['headers']), "Load")
    // Remove additional dimension members
    ((List)it['headers']).removeAll(["OWP_Total Jobs","OWP_Existing Employees"])
    // Map workforce accounts to financials accounts
    Collections.replaceAll(((List)it['headers']), "OWP_Expense Amount", "Labor_Data")
    // Replace missing data with #Missing
    Collections.replaceAll(((List)it['data']), "", "#Missing")
}
GROOVY (RUN ON SAVE)
// Clear data slice at the target pod
Connection targetFinR = operation.application.getConnection("FINRPT_Connection")
jsonResponse = targetFinR.post("/rest/v3/applications/FINPLAN/plantypes/FINRPT/cleardataslice")
    .header("Content-Type", "application/json")
    .body("""
    { "clearEssbaseData": true, "clearPlanningData": false,
      "gridDefinition": {"suppressMissingBlocks": true,
       "pov": {
         "dimensions": ["Entity","Scenario","Version","Years","DataView"],
         "members": [["Sales_US"],["Budget"],["OEP_Working"],["FY20"],["Labor_Data"]]
       },
       "columns": [{
         "dimensions": ["TimePeriods"],
         "members": [["ILvl0Descendants(YearTotal)"]]
       }],
       "rows": [{
         "dimensions": ["Account"],
         "members": [["ILvl0Descendants(Salary Accounts)"]]
       }]}}""").asString()
GROOVY (RUN ON SAVE)
// Build the import payload
Map importDataToTargetMap = new HashMap()
importDataToTargetMap['aggregateEssbaseData'] = false
importDataToTargetMap['cellNotesOption'] = 'Overwrite'
importDataToTargetMap['dateFormat'] = "DD/MM/YYYY"
importDataToTargetMap['dataGrid'] = exportresponse
// Import data slice into the target pod
jsonResponse = targetFinR.post("/rest/v3/applications/FINPLAN/plantypes/FINRPT/importdataslice")
    .header("Content-Type", "application/json")
    .body(json(importDataToTargetMap)).asString()
def response = new JsonSlurper().parseText(jsonResponse.body) as Map
String msg = ""
if(response["numRejectedCells"] != 0){
    println "${response['numAcceptedCells']} cells were loaded."
    msg = "The process completed successfully but had ${response['numRejectedCells']} rejected cells. The following were identified: \n${response['rejectedCells']}"
    println msg
    throwVetoException(msg)
}
else {
    println "${response['numAcceptedCells']} cells were loaded."
}
CONSIDERATIONS
> EPM Connections
> Planning: Understanding of Export, Clear and Import data slices
> Json: Understanding of Json structure
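To make "understanding of JSON structure" concrete: the import payload simply wraps the transformed export grid under a dataGrid key alongside the import options. Schematically (member names taken from the example above, periods abbreviated):

```json
{
  "aggregateEssbaseData": false,
  "cellNotesOption": "Overwrite",
  "dateFormat": "DD/MM/YYYY",
  "dataGrid": {
    "pov": ["Sales_US", "Budget", "OEP_Working", "FY20"],
    "columns": [["Jan", "Feb", "Mar"]],
    "rows": [
      { "headers": ["Labor_Data", "Base_Salary", "Load"],
        "data": ["#Missing", "10000", "10000"] }
    ]
  }
}
```

Because the export and import grids share this shape, the Groovy transform only has to edit headers and data in place before posting the grid to the target.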
Q&A
https://www.alithya.com/en/technology-partners/oracle-solutions
infosolutions@alithya.com
@Alithya
@Alithya_Inc
https://www.linkedin.com/company/alithyaoraclepractice