1) How do we manage the number of records in error stacks?
2) Identify the owners of mapping and filter tables
3) Unit and currency conversion tested?
4) Data release mechanism tested?
5) Yearly process chains or mechanisms (such as balance brought forward) tested?
6) Any master data issues, and is there an escalation process?
7) Clean-up of obsolete objects done?
Monday, July 25, 2011
Monday, July 18, 2011
Intercompany Elimination - how BI helps to reduce manual workload
When preparing a consolidated balance sheet, a routine manual finance task is to deduct the intercompany items between a parent and its subsidiary. This is done either through manual adjustments (manual journal entries) or through dummy account postings.
However, there is a feature in BI that allows this deduction to be done automatically, using the elimination feature of the key figure InfoObject (see the SAP reference). The elimination is done per characteristic pair, or via a start routine in the transformation for a more complex approach such as elimination at parent-child level (intercompany elimination); a minimal sketch follows the business rules below. The elimination figures are calculated in a separate flow and consolidated in the financial MultiProvider for reporting. In a global environment, the different regions must have standard and consistent master data that is referenced during the elimination, such as product (SKU), selling business unit and buying business unit.
The business rules behind intercompany elimination:
1) IC sales - done as soon as data is available in BI (dynamic) for the following accounts:
- Internal Net Turnover
- Bought in Goods
- Primary Supply Chain Cost
2) Intracompany vs. intercompany elimination - depending on whether the selling and buying business units belong to the same level of the market hierarchy (e.g. same End Market, Cluster, Zone, Area or Region):
- Intracompany Elimination
- Intercompany Elimination
3) Profit in stock (involves IC margin and sales volume) - done monthly, e.g.:
- the 6-month rolling IC margin in June is based on Jan-Jun IC Margin / Jan-Jun IC Sales volume
- the 6-month rolling IC margin in July is based on Feb-Jul IC Margin / Feb-Jul IC Sales volume
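To make the routine approach above concrete, here is a minimal sketch of an elimination end routine in ABAP. The field names (SELL_BU, BUY_BU, AMOUNT) and the matching rule are illustrative assumptions, not the actual data model; a real implementation would also key on product (SKU) and restrict itself to the accounts listed under rule 1.

* Hedged sketch only - field names are hypothetical. _ty_s_TG_1 and
* _ty_t_TG_1 are the types BW generates for the target structure.
METHOD end_routine.
  DATA: lt_elim TYPE _ty_t_TG_1,
        ls_rec  TYPE _ty_s_TG_1.

  LOOP AT RESULT_PACKAGE INTO ls_rec.
    " A record qualifies when both trading partners are known and
    " differ, i.e. it is an intercompany transaction
    IF ls_rec-sell_bu IS NOT INITIAL
       AND ls_rec-buy_bu IS NOT INITIAL
       AND ls_rec-sell_bu <> ls_rec-buy_bu.
      " Mirror record with the sign flipped, so the pair nets to
      " zero when the flows meet in the financial MultiProvider
      ls_rec-amount = ls_rec-amount * -1.
      APPEND ls_rec TO lt_elim.
    ENDIF.
  ENDLOOP.

  " Collected separately so the loop we read from is not extended
  APPEND LINES OF lt_elim TO RESULT_PACKAGE.
ENDMETHOD.

In practice the offsetting records would be posted into the separate elimination flow rather than appended in place, but the pairing check and the sign flip are the essence of the technique.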
Friday, July 15, 2011
The Framework of a BI BAU Organization

Change Management Framework - to assess BI-related changes and impacts from current development and from changes in feed systems, and to identify the escalation party
Business Release Framework - to ensure the changes land successfully within the business release timeline, that the infrastructure is ready and that no impacts result from the transports
Quality Assurance Framework - to ensure no or minimal risks or defects land in the system
Monday, July 11, 2011
Future of SAP BI?
OK, we all know BO is going to replace SAP BEx (though not in the near future), but will HANA wipe out the data warehouse? This is hardware evolution versus data warehousing technology. HANA 1.0 is more like the BWA, but going forward SAP's roadmap is to replace the BW database with HANA, with main memory used to store a copy of the database. The BI application server still runs on top of HANA, in addition to a separate application server for ECC.
In-memory goes further, eliminating the need for a separate OLTP system: BI analytical capabilities are incorporated directly into the ECC system, allowing all analytics to run on operational (ECC) data. But hey, it's still a long way to go.

Wednesday, June 22, 2011
The BI Delivery E2E Bird's-Eye View

A successful BI delivery in any organization is an end-to-end process that involves many parties and the bridging of gaps between them. A common oversight in most organizations is the lack of business involvement (yellow box) and of bridges between the different parties (red arrows), which results in a long-term loss of credibility with users and leads to the failure of the BI delivery.
Without business involvement from the very start, coupled with a tight delivery timeline, the project is eventually pushed to deliver a solution based on technical go-live, as there is a problem securing the business users' involvement at the last milestone of the project. A BI solution that is delivered into BAU without business go-live is at high risk of catching its bugs and issues later, when the business starts to engage and reveals tons of issues with the solution. This eventually leads to a loss of credibility for the BAU organization when it fails to deliver the major fixes required by the business, because those fixes are not ready to be taken on by the Support team and the project team left when the project was declared technically live. In situations like this, is it politically correct to say the BAU organization is the 'victim' of the whole process? Which party is going to fund the cost if the project team is required to stay back and fix the issues? That is the reason a solution fit to land in BAU mode has to be both technically and business live. The BAU organization should have the authority to raise a red flag to prevent a solution from landing in BAU without proper sign-off from both the technical and the business side.
The other common oversight is the knowledge transfer and skills capability from the Project to the Support team to maintain the solution (blue arrow). When these are two different teams or companies, there is often a knowledge-transfer gap that prevents the Support team from taking on major bug fixes or enhancements, leading to long turnaround times and, in the end, to the business losing confidence in the BAU organization's ability to ensure 'business runs as usual'.
Thursday, June 2, 2011
Transport Watchout!
During the import of BI transports, watch out for the following:
1) Double-check whether function groups or shared user exits are intentionally meant to be transported
2) Old transports overwriting newer ones (especially in a multi-solution development environment; see the sketch below)
3) Orphan transports left in non-production systems
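One way to catch the overwrite risk before an import is to compare object lists across requests. Below is a rough sketch, assuming read access to E071 (the standard table of object entries per transport request); the report name is made up.

* Rough sketch, not a finished tool: list other transport requests
* that carry the same objects as the request about to be imported.
REPORT z_transport_overlap.

PARAMETERS p_trkorr TYPE trkorr OBLIGATORY.

DATA: lt_obj     TYPE STANDARD TABLE OF e071,
      lt_overlap TYPE STANDARD TABLE OF e071,
      ls_overlap TYPE e071.

* Objects contained in the request to be imported
SELECT * FROM e071 INTO TABLE lt_obj
  WHERE trkorr = p_trkorr.

IF lt_obj IS NOT INITIAL.
* Any other request touching one of the same objects can overwrite
* it, or be overwritten, depending on the import order
  SELECT * FROM e071 INTO TABLE lt_overlap
    FOR ALL ENTRIES IN lt_obj
    WHERE pgmid    = lt_obj-pgmid
      AND object   = lt_obj-object
      AND obj_name = lt_obj-obj_name
      AND trkorr  <> p_trkorr.
ENDIF.

LOOP AT lt_overlap INTO ls_overlap.
  WRITE: / ls_overlap-trkorr, ls_overlap-pgmid,
           ls_overlap-object, ls_overlap-obj_name.
ENDLOOP.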
Tuesday, May 31, 2011
Good Practice in Transformation Rules
It is discouraged to use a field routine to translate fields to their mappings or to read attributes from master data when the source data is huge, because a field routine is executed once for every single row in the main loop, instead of once per complete data package on an internal table as in the start or end routine. The best practice in a transformation is to:
- Filter data in the start routine and perform the business mapping rules in the end routine. In the end routine you usually work with the result package, which is also where error stack records are captured for re-transformation.
- Declare global variables and cache lookup data in a global internal table in the start routine (see the sketch after this list).
- Use constants to pass a derived value from one field to the next.
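A minimal sketch of that pattern, assuming a lookup of the material group from 0MATERIAL's attribute table; the table and field names are taken from standard content but should be adjusted to the actual model.

* Global declaration part of the routine class: a package-level cache
TYPES: BEGIN OF gty_s_mat,
         material   TYPE /bi0/oimaterial,
         matl_group TYPE /bi0/oimatl_group,
       END OF gty_s_mat.
DATA gt_mat TYPE SORTED TABLE OF gty_s_mat
            WITH UNIQUE KEY material.

* Start routine: filter first, then one SELECT for the whole package
METHOD start_routine.
  " Drop out-of-scope rows before anything else touches them
  DELETE SOURCE_PACKAGE WHERE material IS INITIAL.

  IF SOURCE_PACKAGE IS NOT INITIAL.
    SELECT material matl_group
      FROM /bi0/pmaterial
      INTO TABLE gt_mat
      FOR ALL ENTRIES IN SOURCE_PACKAGE
      WHERE material = SOURCE_PACKAGE-material
        AND objvers  = 'A'.
  ENDIF.
ENDMETHOD.

* End routine: the per-row work is a cheap table read, no DB access
METHOD end_routine.
  FIELD-SYMBOLS <ls_result> TYPE _ty_s_TG_1.
  DATA ls_mat TYPE gty_s_mat.

  LOOP AT RESULT_PACKAGE ASSIGNING <ls_result>.
    READ TABLE gt_mat INTO ls_mat
         WITH TABLE KEY material = <ls_result>-material.
    IF sy-subrc = 0.
      <ls_result>-matl_group = ls_mat-matl_group.
    ENDIF.
  ENDLOOP.
ENDMETHOD.

The single SELECT per package replaces what a field routine would issue as one database read per row, which is exactly the overhead warned about above.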
Infoobjects Stories
The InfoObject is the smallest of the SAP BW building blocks, and it stores master data. It therefore plays a very important part in ensuring a single version of the truth and in representing the correct business definition. It is also crucial to get an InfoObject's attributes and compounding key right: if designed wrongly, there may be massive rework, or an obsolete and a new InfoObject used in different solutions despite sharing the same business definition. That is why the creation and modification of InfoObjects should be controlled by tight governance and approved by a board of experts who are well versed in the BI landscape and the risks of introducing changes to existing InfoObjects.
Compounding a master data InfoObject to the source system is a must in a multiple-source-system environment and in multiple regional landscapes that consolidate master data at the global level. The 1:1 or 1:N relationship between an InfoObject and its attributes has to be properly defined, as it cannot cater for N:N relationships in drill-down reporting: for example, multiple end markets may get their goods from multiple factories, so it would be impossible to capture the drill-down values for end market and factory in a single view.
Thursday, May 19, 2011
BI BAU Role
In most big organizations, the employees retained need to add value, and they are classified as BAU resources; some companies call this Core and Non-Core. In a BI division, one of the Core or BAU roles is the Technical Design Authority (TDA). This role covers a wide range of responsibilities, comprising a Build Analyst and a Cutover Analyst role. In a nutshell, a Build Analyst is a person who QAs the build and design against the technical blueprint or proposed fix, while a Cutover Analyst is responsible for coordinating transports and activities before and during the business release cutover. The other task, the bread and butter of a TDA, is to perform impact analysis on the changes going into the Production box. This often includes cross-application impacts and shared objects in BI, which can be analyzed from the metadata repository. Cross-application changes often require the involvement of the master data team for master data cleanliness and mapping rules, and ERP/APO approval for any DataSource changes in the feed system. Other activities under the care of the TDA are process chain scheduling and data reload requirements, which require impact analysis before those tasks are executed by the Support team. A senior TDA with massive project and hands-on experience should be able to review the build and design of the solution proposed by the project team and can approve or deny a particular design for implementation. This means he or she needs a detailed understanding of the proposed solution and of the system architecture, to mitigate the risk of a flawed design or a solution that does not work as expected, requires extensive maintenance or does not adhere to the organization's standards.
In any BAU role, the abilities to gauge a scenario or issue by its level of urgency and importance, the effort needed to solve it, the risk exposed, the impact, and the correct escalation route are important to avoid redundant discussions, long turnaround times and too many processes with gaps in between, which eventually deviate from the real objective: to ensure business runs as usual. In short, this role requires significant knowledge of the company's BI processes and usage.
Sunday, May 15, 2011
Checklist for Build and Design Review
During the walk-through by developers of changes to the BI environment, there are a number of criteria and standards the development work must meet before it can be promoted to the BI production landscape. This is important to avoid 'bad design' and overlooked design issues that can impact other objects in a complex BI environment. These include:
Review build list objects:
- Check against functional design
- LSA data flow review
Reporting:
- InfoCube dimension build
- Correct use of the InfoProvider InfoObject as a constant where MultiProviders are used
- MultiProvider build against standards
- Query build against standards (CKFs/RKFs)
- Web template design - efficient use
Scheduling:
- Process chain walk-through & review
- Housekeeping process chains maintained
- Correct use of master data process chains
- Cross-functional object scheduling
ABAP code:
- CR reference + business release in code description
Performance checklist:
- Start routine table caching
- ABAP written efficiently
- Full loads / deltas used correctly
- DTP package sizes
- Web template size
- Use of error stack
- Reactivation of data flows
In the detailed design review phase, these are the items that need to be QA'd and considered before the build can commence:
- Business release milestone
- Assess cross-functional impacts
- Data volumes understood and included in the SSD
- Conformance to the company's design and naming convention standards
- LSA principles followed
- Process chains
- Supportability of the solution, i.e. ease of support
- Impact on the current system / solution
- Authorizations - roles maintained, role design, central authorization adherence, analysis authorizations
- User numbers
- Portal impact - portal engagement questionnaire filled out
- Scheduling - scheduling impact understood; PSA / change log deletions considered
- Future-proofing - ability to handle steady-state volumes
Wednesday, May 11, 2011
Modelling Methods (Slowly changing dimension)
1) Dependent attribute as a time-dependent navigational attribute (the key date can be set in the BEx query)
2) Dependent attribute as a node of an external time-dependent hierarchy
3) Dependent attribute as a node of an external version-dependent hierarchy
4) Put the dependent attribute of your characteristic as a characteristic in the same dimension
5) Time-dependent master data with an intermediate DSO set to overwrite (reprocessing approach - reload all for back-dated master data, and select the data to be reprocessed for forward-dated transactional data dates)
Finance Terms for the IT Geek
General Ledger
The central task of G/L accounting is to provide a comprehensive picture for external accounting and accounts. Recording all business transactions (primary postings as well as settlements from internal accounting) in a system that is fully integrated with all the other operational areas of a company ensures that the accounting data is always complete and accurate.
CO-PA
1. Updates flow to FI and CO from different processes within SAP.
2. Value fields are the building blocks of CO-PA.
3. Different types of updates flow from FI to CO:
- during the Supply Chain process
- during the Material Ledger process
- from Project Systems and Cost Centre Accounting
CCA
You use Cost Center Accounting for controlling purposes within your organization, making the costs incurred by the organization transparent.
Costs are assigned to their Function so you can determine where costs are incurred within the organization. From these cost centers you can then use different methods to assign the activities and costs to the relevant products, services, and market segments.
WBS
The Project System module in SAP R/3 holds vital information and guarantees constant monitoring of all aspects of a Project. A clear, unambiguous project structure is the basis for successful project planning, monitoring, and control.
You structure your project per the following points of view:
- By structures, using a work breakdown structure (WBS)
- By process, using individual activities (work packages)
Project Systems data provides information about Brand Expenditure.
Branded Trade Expenditure that is not related to one specific brand can be captured on cost centers and needs to be allocated to branded WBS elements based on standard allocation keys, set once a year. If it relates to only one brand, capture it on a WBS element.
Statistical key figures
In SAP, statistical key figures can be created to enable automatic allocation methods that cycle costs from cost centers to WBS elements. This functionality can be used, for example, for key account contracts.
By setting a statistical key figure once a year, the key account costs captured on cost centers can be cycled automatically to branded WBS elements, based on a set of allocation rules.
The statistical key figure functionality will decrease the manual allocation postings currently done by several end markets.