Tuesday, May 31, 2011

Good Practice in Transformation Rules

Avoid using field routines to translate fields to their mappings or to read attributes from master data when the source data volume is huge. A field routine is executed once per row inside the main loop, whereas a start or end routine processes the complete data package as an internal table in a single call. The best practices in a transformation are to:

  • Filter data in the start routine and apply business mapping rules in the end routine. In the end routine you typically work with the result package, which also lets you capture error stack records and support re-transformation.
  • Declare global variables and buffer lookup data in global internal tables in the start routine (see the sketch below).
  • Use constants to pass derived values from one field to the next.
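To illustrate the first two points, here is a minimal sketch of a BW 7.x start routine that filters the source package and then buffers master data into a global internal table, so that later routines only perform internal reads instead of one SELECT per row. The field names (SALESORG, MATERIAL) and the use of /BI0/PMATERIAL are assumptions for the example, not taken from a specific flow.

* Global declaration part of the transformation:
TYPES: BEGIN OF gty_s_mat,
         material  TYPE /bi0/oimaterial,
         matl_type TYPE /bi0/oimatl_type,
       END OF gty_s_mat.
DATA: gt_mat TYPE HASHED TABLE OF gty_s_mat
              WITH UNIQUE KEY material.

* Inside METHOD start_routine:
* 1) Drop records that are never reported from this flow.
DELETE SOURCE_PACKAGE WHERE salesorg = '9999'.

* 2) Buffer master data once per data package instead of once per row.
IF SOURCE_PACKAGE IS NOT INITIAL.
  SELECT material matl_type
    FROM /bi0/pmaterial
    INTO TABLE gt_mat
    FOR ALL ENTRIES IN SOURCE_PACKAGE
    WHERE material = SOURCE_PACKAGE-material
      AND objvers  = 'A'.
ENDIF.

A field or end routine can then do a READ TABLE gt_mat ... WITH TABLE KEY material = ... per record, which is an internal table access rather than a database call.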

InfoObject Stories

An InfoObject is the smallest of the SAP BW building blocks, and it stores master data. It therefore plays a very important part in ensuring a single version of the truth and representing the correct business definition. It is also crucial to design an InfoObject's attributes and compounding key correctly; a wrong design can lead to massive rework, or to obsolete and new InfoObjects being used in different solutions while sharing the same business definition. That is why the creation and modification of InfoObjects should be controlled by tight governance and approved by a board of experts who are well versed in the BI landscape and the risks of introducing changes to existing InfoObjects.

Compounding a master data InfoObject to the source system is a must in a multiple-source-system environment, and in multiple regional landscapes that consolidate master data at the global level. The 1:1 or 1:N relationship between an InfoObject and its attributes has to be properly defined, because the attribute model cannot cater for N:N relationships in drill-down reporting; for example, multiple end markets may get their goods from multiple factories, so it would be impossible to capture the drill-down values for end market and factory in a single attribute view.

Thursday, May 19, 2011

BI BAU Role

In most big organizations, the employees retained are expected to add value, and they are classified as BAU resources. Some companies put it as Core and Non-Core. In a BI division, one of the Core or BAU roles is the Technical Design Authority (TDA). This role covers a wide range of responsibilities, comprising a Build Analyst and a Cutover Analyst role. In a nutshell, a build analyst QAs the build and design against the technical blueprint or proposed fix, while a cutover analyst coordinates transports and activities before and during the business release cutover. The other task, which is the bread and butter of a TDA, is to perform impact analysis on the changes going into the Production box. This often includes cross-application impacts and shared objects in BI, which can be analyzed from the metadata repository. Cross-application changes often require the involvement of the master data team for master data cleanliness and mapping rules, and ERP/APO approval for any datasource changes in the feeding system. Other activities under the care of a TDA are process chain scheduling and data reload requirements, which require impact analysis before the Support team executes those tasks. A senior TDA with extensive project and hands-on experience should be able to review the build and design of the solution proposed by the project team, and can approve or reject a particular design for implementation. This means he/she needs a detailed understanding of the proposed solution and the system architecture in order to mitigate the risk of a flawed design: a solution that does not work as expected, requires extensive maintenance, or does not adhere to the organization's standards.

In any BAU role, the ability to gauge a scenario or issue by its urgency and importance, the effort needed to solve it, the risk exposed, the impact, and the correct escalation route is important. It avoids redundant discussions, long turnaround times, and the introduction of too many processes with gaps in between that eventually deviate from the real objective: to ensure business runs as usual. In short, this role requires significant knowledge of the company's BI processes and usage.

Sunday, May 15, 2011

Checklist for Build and Design Review

During the developers' walk-through of changes to the BI environment, there are a number of criteria and standards that development work must meet before it can be promoted to the BI production landscape. This is important to avoid 'bad design' and overlooked design issues that can impact other objects in a complex BI environment. These include:

Review build list objects:
Check against functional design

LSA data flow review

Reporting
  • Infocube dimension build
  • Correct use of the InfoProvider InfoObject as a constant where MultiProviders are used
  • Multiprovider build against standards
  • Query build against standards (CKFs/RKFs)
  • Web template design - efficient use

Scheduling

  • Process chain walk-through & review
  • Housekeeping process chains maintained
  • Correct use of master data process chain
  • Cross-functional object scheduling

ABAP code

CR reference + business release in code description
Performance checklist:
  • Start routine table caching
  • ABAP written efficiently
  • Full loads / deltas used correctly
  • DTP package sizes
  • Web template size

Use of error stack


Reactivation of data flows

In the detailed design review phase, these are the items that need to be QA'd and considered before the build can commence:
  • Business release milestone
  • Assess cross functional impacts
  • Data volumes understood, included in SSD
  • Conformance to a company design and naming convention standards
  • LSA principles followed
  • Process chains
  • Supportability of solution, i.e. ease of support
  • Impact on current system / solution
  • Authorizations - roles maintained, role design, central authorization adherence, analysis authorizations
  • User numbers
  • Portal impact - portal engagement questionnaire filled out
  • Scheduling - scheduling impact understood, PSA / change log deletions considered
  • Future proof - ability to handle steady-state volumes

Wednesday, May 11, 2011

Modelling Methods (Slowly Changing Dimensions)

1) Dependent attribute as a time-dependent navigational attribute (the key date can be set in the BEx query)
2) Dependent attribute as a node of an external time-dependent hierarchy
3) Dependent attribute as a node of an external version-dependent hierarchy
4) Dependent attribute of your characteristic modelled as a characteristic in the same dimension
5) Time-dependent master data with an intermediate DSO set to overwrite (reprocessing approach: reload everything for back-dated master data, and select the data to be reprocessed for forward-dated transactional data dates)
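Where method 1 or 5 is used, a lookup routine sometimes has to reproduce the key-date behaviour itself. Time-dependent attributes live in the InfoObject's Q table with DATEFROM/DATETO validity intervals; here is a minimal sketch, assuming REGION has been made a time-dependent attribute of 0CUSTOMER and using a hypothetical customer key:

* Read the attribute value that was valid on a given key date.
DATA: lv_keydate TYPE sy-datum VALUE '20110515',
      lv_region  TYPE /bi0/oiregion.

SELECT SINGLE region
  FROM /bi0/qcustomer           "Q table = time-dependent attributes
  INTO lv_region
  WHERE customer  = 'C100042'   "hypothetical customer key
    AND objvers   = 'A'
    AND datefrom <= lv_keydate
    AND dateto   >= lv_keydate.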

Finance Terms for the IT Geek

General Ledger
The central task of G/L accounting is to provide a comprehensive picture for external accounting and accounts. Recording all business transactions (primary postings as well as settlements from internal accounting) in a system that is fully integrated with all the other operational areas of a company ensures that the accounting data is always complete and accurate.


CO-PA
1. The updates that flow to FI and CO from different processes within SAP.
2. Value fields are the building blocks of CO-PA.
3. Different types of updates that flow from FI to CO:
  • during the Supply Chain process
  • during the Material Ledger process
  • from Project Systems and Cost Centre Accounting

CCA
You use Cost Center Accounting for controlling purposes within your organization, making the costs incurred by the organization transparent.

Costs are assigned by function so you can determine where costs are incurred within the organization. From these cost centers you can then use different methods to assign the activities and costs to the relevant products, services, and market segments.

WBS
The Project System module in SAP R/3 holds vital information and guarantees constant monitoring of all aspects of a Project. A clear, unambiguous project structure is the basis for successful project planning, monitoring, and control.

You structure your project per the following points of view:
  • By structures, using a work breakdown structure (WBS)
  • By process, using individual activities (work packages)

Project System data is used to provide information about brand expenditure.


Branded trade expenditure that is not related to one specific brand can be captured on cost centers and needs to be allocated to branded WBS elements based on standard allocation keys set once a year. If it relates to only one brand, capture it directly on a WBS element.

Statistical key figures
In SAP, statistical key figures can be created to enable automatic allocation methods that cycle costs from cost centers to WBS elements. This functionality can be used, for example, for key account contracts.

By setting a statistical key figure once a year, the key account costs captured on cost centers can be cycled automatically, based on a set of allocation rules, to branded WBS elements.
The statistical key figure functionality will decrease the manual allocation postings currently done by several end markets.


Tuesday, May 10, 2011

Error Stack Handling

The ownership of the error stack in BAU mode, whether it belongs to Support or to business users, is bound to be a debatable item for any BI solution. Before the developer switches the error handling mode in the DTP, consider the difference between 'real' records that are supposed to fall into the error stack and records that should have been filtered out before the transformation level. Support also has a tendency to switch on the error stack in cases where a lot of data that should have been filtered out causes the process chain to fail.

'Real' records that are supposed to be monitored in the error stack are:
  • data related to mapping logic
  • data that depends on master data attributes for derivation

The above points to the fact that error stack management should fall to the master data team, business users, or an SME, because business logic is mandatory to correct those records.

Unnecessary data that should be filtered out (usually in the start routine) is normally data that is not used by that module. For example, records from sales organizations that do not need to be reported can be filtered out before they enter the transformation level. The sketch below contrasts the two cases.
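A field routine can make the distinction explicit: resolve the mapping from a buffered table, and flag only genuine mapping gaps so that, with error handling active in the DTP, the record lands in the error stack rather than being silently dropped. This is a sketch only; gt_map, the message class ZBW, and the field names are illustrative assumptions.

* Field routine sketch: gt_map is a global mapping table assumed to be
* filled in the start routine; MONITOR and the exception class come
* from the generated routine frame.
DATA: ls_map     TYPE gty_s_map,
      ls_monitor TYPE rstmonitor.

READ TABLE gt_map INTO ls_map
     WITH TABLE KEY matl_group = SOURCE_FIELDS-matl_group.
IF sy-subrc = 0.
  RESULT = ls_map-target_value.
ELSE.
  " Genuine mapping gap: raise an error message so the record can be
  " corrected and re-transformed from the error stack.
  ls_monitor-msgty = 'E'.
  ls_monitor-msgid = 'ZBW'.                 "placeholder message class
  ls_monitor-msgno = '001'.
  ls_monitor-msgv1 = SOURCE_FIELDS-matl_group.
  APPEND ls_monitor TO MONITOR.
  RAISE EXCEPTION TYPE cx_rsrout_skip_record.
ENDIF.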

Sunday, May 8, 2011

When 0FI_GL_6 and When 0FI_GL_4

The 0FI_GL_4 extractor works in a different way: init and delta are based on the system date, so only one delta is possible each day. This applies to older SAP PI releases up to 2004.1; there is an OSS note to enable a minute-based delta. Please refer to OSS note 991429.

You can obtain detailed information on the 0FI_GL_4 extractor here.

In BI we extract two types of General Ledger information:
  • Opening balances (once a year, starting only on 1.1.2011), taken from 0FI_GL_4.
  • Line item postings (daily), taken from 0FI_GL_6.

If you have ECC version 5 or above, you can opt to use 0FI_GL_10 for transaction figures (balance carry-forward) and 0FI_GL_14 for line items. If you have the EhP3 upgrade on your ECC, there is a new option to use 0FI_GL_20 for transaction figures and 0FI_GL_40 for line items. The downside of 0FI_GL_20 is that this datasource has an after-image delta, and therefore a DSO is required to enable delta. 0FI_GL_40 is not delta-enabled by default because it was created specifically for the RemoteCube.


Outsourcing, BAU and CAB

In most big organizations, resources are often categorized as 'Value Add' and 'Commodities'. 'Commodities' groups are often outsourced, and this includes developers and support staff. 'Value Add' groups form the BAU organization and often comprise management and gatekeepers who contribute to impact analysis and to decision making on actions or escalation points when issues or changes arise. They tend to form a lean organization structure that participates in advisory, coordination, and management work. In order to have a standard, agreed escalation point, either to a project as an enhancement or to Support as a bug/break fix, the Value Add group's decisions need to be governed by a set of rules called the 'CAB exemption list'.

As the structures and solutions of organizations differ, the way this exemption list evolves differs too. But the baseline of arriving at a standard, agreed ownership of issues and a clearly defined process has to be based on the content of the list. The list is an ongoing effort to record the ownership of issues by object, together with the different types of scenarios that could trigger a change to that object. For example, for a DTP, a missing DTP is categorized as CAB1 because it impacts the daily process chain, while a new DTP filtering rule is CAB2 because it may be an enhancement to cater for new business rules. Any new scenario needs to be discussed in the CAB meeting to agree on which category it falls into, either CAB1 or CAB2. In a nutshell, CAB1 is the list of responsibilities held for bug and break fixes, and CAB2 is the list of objects that involve enhancements to the existing design. Small, easy enhancements, such as adding values to a filter condition, can be addressed by the Support team, whereas multiple or big enhancements can be grouped into a mini project that falls to a project team.

It is still debatable whether the SME falls under the 'Commodities' or the 'Value Add' group. This questions the reliability of outsourcing the entire development work to vendors, with only indirect BAU participation during build review sessions. Being the SME of a BI solution requires a thorough understanding of the technical and functional requirements of that solution, and SMEs are the best people to perform impact analysis on a proposed change. Most BI impact assessments are technical, since BI is the receiving system for business changes in the feeding systems (R/3, external systems, or APO); as such, the technical impact assessment can often be done with the help of the BI metadata repository and a where-used lookup. There are other, exceptional cases in which changes in R/3 or a feeding system have to be impact-analyzed at the BI level because the datasource or content is 'modified' at the source system level (please refer to my other post on this). Project teams are often the experts on their solutions, because they designed them, and the gatekeeper will not have the necessary detailed insight into the design and build except from the blueprint and walk-through sessions. The gatekeeper may not be involved in Support issues either, unless the support team is not performing according to its job scope. Support, on the other hand, is the group that understands the ins and outs of the flaws of the design, as they are the front line for any issues logged by business users.

It is also common to overlook role feasibility when a BAU organization that aims to be a lean IT services organization streamlines all 'subjects' (especially in BI, which consists of a range of different solutions for different reporting purposes) into one single SME role, and then further extends that role into a 'Value Add' gatekeeper role held responsible for any changes that land in the system.

On top of all that, the next successful outsourcing strategy will revolve around the concept of a 'long-term partnership' rather than a mere vendor-client relationship, as it takes a close-knit working relationship to form a successful IT organization in a corporate company.

The thing about APO-BI Integration

The major issues with data reconciliation between APO and BI are:
1) In APO, planning does not use the source system, but in BI the materials are compounded to the source system, because multiple regional R/3 boxes feed data to one regional SAP BI instance, which in turn consolidates with the other regional BI instances into the global BI instance. There are incidents where the same SKU exists in two regional R/3 sources but refers to different materials.
2) APO does not plan in the base UOM or the ISO UOM; it always plans in the planning unit of measure (PUM). Hence, when data is sent to BI, the conversion must go from the PUM to the base UOM and then to the ISO or CORE UOM, as sketched below.
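As an illustration of that conversion chain, the factors between an alternative UOM and the base UOM are held per material in table MARM (UMREZ/UMREN): the quantity in the base UOM equals the quantity in the alternative UOM times UMREZ / UMREN. A minimal sketch, assuming MARM (or a replicated copy) is readable where the routine runs and that the source field names below exist:

* Convert the planned quantity from the PUM to the base unit of measure.
DATA: lv_umrez    TYPE marm-umrez,
      lv_umren    TYPE marm-umren,
      lv_qty_base TYPE p DECIMALS 3.

SELECT SINGLE umrez umren
  FROM marm
  INTO (lv_umrez, lv_umren)
  WHERE matnr = SOURCE_FIELDS-material   "assumed source fields
    AND meinh = SOURCE_FIELDS-pum_unit.

IF sy-subrc = 0 AND lv_umren <> 0.
  lv_qty_base = SOURCE_FIELDS-quantity * lv_umrez / lv_umren.
ENDIF.
* A second step would map the base UOM to the ISO/CORE UOM, e.g. via
* the ISO code in table T006.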

Saturday, May 7, 2011

Common Bugs, Breaks and Enhancements

Whether in the phase of a project transitioning into BAU or already in BAU mode, there are common major bugs, breaks, and enhancements in SAP BI that are needed alongside the standardization of master data (which runs in parallel with ERP convergence). There is always a long back-and-forth discussion on who is the responsible party for those changes, and which category the desired fix falls into: a project defect (bug), a break (resulting from changes to other things), or an enhancement. This is important, as it determines which party will fund the fix.

Here is a list of common bugs, breaks and enhancements in a highly integrated SAP BI platform:
Bugs
  • Web template functionality
  • Inconsistent coding logic applied in routines implementing the same business rules

Breaks
  • Changes in shared user exit
  • Changes in the shared infoobjects

Enhancement
  • Adding navigational attribute
  • New flow / enhanced datasource to pick up new fields