Tuesday, December 20, 2011

Outsourcing BW and BI – When Does it Make Sense?

Stumbled upon this article today discussing the outsourcing reality that is hitting BI.

Monday, August 29, 2011

Early positioning of organizational change management during your implementation process

In reality, supporting a BI solution in BAU mode can be challenging, not only because documentation is missing or knowledge transition was insufficient, but because many important points were known only to the project consultants and could not be fully transferred or documented by the end of the project milestone. Often the issue is only discovered when a report shows inaccurate data or a process chain fails. This can happen when a working solution is not robust enough to handle different types of future changes and scenarios. Hence it is important for a professional project consultant to document and highlight areas of common maintenance, known bugs and any customized/hardcoded methods applied during the implementation. To ensure a smooth handover, this documentation has to be reviewed and updated periodically during the implementation phase, not only near completion as is common project practice. This ensures a complete understanding of the solution and a proactive working partnership between the project team and the clients/support. Some of the crucial points that can be covered in this area:

  • customized ABAP programs to look up mapping values or select period filtering
  • deletion selections in the process chain
  • process chain failure root causes
  • possible enhancement areas in the transformation logic (any hardcoded values that can be changed or added in future)
  • impact of changing a web template or query object on the other items in the same template
  • shared datasources and infoobjects and their dependencies
  • any customized objects sitting on the ERP/feed system
  • which master data attributes need to be maintained when a new assignment is added
  • roles and authorization matrix (usually liaise with the S&A and Portal teams)

Thursday, August 18, 2011

Bye to CO-PA BW backend and welcome CO-PA HANA?



Read the full article here.

Production Support Project TCODE

The consultant is required to have access to the following transactions in R/3:

  1. ST22 
  2. SM37 
  3. SM58 
  4. SM51 
  5. RSA7 
  6. RSA3
  7. RSA6
  8. SM13
  9. SE16
  10. RSO2

Depending on needs:

  1. SP01
  2. DB02
  3. SM14
  4. SUIM
  5. SM01

Authorizations for the following transactions are required in BW:

  1. RSA1 
  2. SM37 
  3. ST22 
  4. ST04 
  5. SE38 
  6. SE37 
  7. SM12 
  8. RSKC 
  9. SM51 
  10. RSRV 
  11. RSMO


Wednesday, August 17, 2011

Query Design Tips for Performance

  • Use filters - use as many as possible to reduce the amount of data that needs to be read from the source
  • Use 0INFOPROV in the query restriction if the data model is a multiprovider containing data segregated by the same definition for each infocube
  • Avoid using conditions and exceptions
  • Use free characteristics - use as few as possible
  • Use restricted key figures with care - they generate more complex SQL
  • Use more than one structure with care
  • Characteristics/navigational attributes are more efficient than hierarchies
  • Avoid complex queries - consider RRI to offer an analysis path rather than defining queries that show everything in the infoprovider
  • Check the Use Selection of Structure Elements option
*While filters are evaluated by the database, conditions and exceptions are usually evaluated by the application server, resulting in a much larger volume of data being transferred between the two servers.
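The difference the footnote describes can be shown with a minimal, plain-Python sketch (the data and row counts are made up, and the lists simply stand in for the database and application server sides): a filter restricts the rows that ever leave the source, whereas a condition is evaluated only after the full result set has been fetched.

```python
# Hypothetical illustration: filter pushed to the "database" vs condition
# evaluated on the "application server" after the data has been transferred.
rows = [{"region": "EMEA", "rev": r} for r in range(100)] + \
       [{"region": "APAC", "rev": r} for r in range(100)]

# Filter: only matching rows leave the source.
filtered = [r for r in rows if r["region"] == "EMEA"]   # 100 rows transferred

# Condition (e.g. "revenue > 95"): all rows are fetched first,
# then the condition is evaluated on the application side.
fetched = rows                                           # 200 rows transferred
conditioned = [r for r in fetched if r["rev"] > 95]

print(len(filtered), len(fetched), len(conditioned))     # 100 200 8
```

The condition still returns only 8 rows to the user, but all 200 had to travel between the servers first, which is why the tips above say to prefer filters.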

Tuesday, August 16, 2011

Some throw in for BI Whitepaper titles

1) IT Globalization impacts to BI and ERP
2) ERP convergence impacts to existing BI landscapes
3) Global template architecture and its feasibility in long run and local change management
4) The handshake relationship between BI and its feed partners - APO, SRM, CRM, ERP
5) Master Data Management gearing up towards ERP convergence
6) Reaping benefits from Consolidated Financial Reporting
7) How BI plays its role in bridging the gap between fragmented business entities and a Global Enterprise Model

Monday, August 15, 2011

Dimension attribute, navigational attribute or display attribute?

When modeling the infocube, the decision to include an infoobject in the dimension itself, or as a navigational or display attribute, is influenced by
1) slowly changing dimension/historical data view
2) cleanliness of the master data

Dimension attribute values are stored in the dimension table itself, which has the advantage that data reflects the historical truth. The disadvantage is that the data in the infocube has to be reloaded if the master data is unclean or the value assignment of the infoobject changes, which happens quite frequently in a global alignment environment. If the impact of truncating and reloading the infocube is too high, a new infoobject that refers to the same business object may be introduced to replace the one with unclean master data, and the old one labelled 'no longer in use'.

Navigational attribute values are not stored in the dimension table but in the attribute table of the characteristic used in the infocube. Changes to the attribute value assignment of the infoobject do not require realignment of the infocube. They may, however, require realignment of any aggregate containing the navigational attribute. If the infoobject and its attribute values are used in a hierarchy, the hierarchy may need to be dropped before the attribute value can be changed.

If the value is stored as a display attribute, changes to the attribute value of the infoobject will not impact the data in the infocube, as the display attribute is not stored as an SID in the dimension table. A display attribute does not support drill-down reporting and can only be displayed in the report.

Whether it is a navigational or a display attribute, the report will always refer to the current value in the master data, so modelling the infocube with this approach does not support the historical truth.
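The historical-truth vs current-truth distinction above can be sketched in a few lines of illustrative Python (this is not SAP code; the material, region and quantities are invented). One variant reads the attribute from master data at query time, the other froze the value into the fact row at load time:

```python
# Master data: material -> sales region (modelled as an attribute).
master_data = {"MAT1": "EMEA"}

# Fact rows loaded while MAT1 was still assigned to EMEA.
# Variant A stores only the material (attribute read at query time);
# variant B also stored the region at load time (dimension approach).
facts_attr_based = [{"material": "MAT1", "qty": 10}]
facts_dim_based = [{"material": "MAT1", "region": "EMEA", "qty": 10}]

# Master data realignment: MAT1 moves to APAC.
master_data["MAT1"] = "APAC"

def report_attr(facts):
    # Attribute looked up in master data at query time -> current truth
    return [(master_data[f["material"]], f["qty"]) for f in facts]

def report_dim(facts):
    # Region frozen into the fact row at load time -> historical truth
    return [(f["region"], f["qty"]) for f in facts]

print(report_attr(facts_attr_based))  # [('APAC', 10)]
print(report_dim(facts_dim_based))    # [('EMEA', 10)]
```

After the realignment, the attribute-based report shows the new assignment for old postings, while the dimension-based report still shows where the quantity was originally booked, which is exactly the trade-off discussed above.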

Monday, July 25, 2011

Common questions posted during handover

1) How do we manage the number of records in the error stacks?
2) Who owns the mapping and filter tables?
3) Have unit and currency conversion tests been done?
4) Has the data release mechanism been tested?
5) Have yearly process chains or mechanisms (such as balance brought forward) been tested?
6) Are there any master data issues, and what is the escalation process?
7) Who cleans up obsolete objects?

Monday, July 18, 2011

Intercompany Elimination - how BI helps to reduce manual workload

When preparing or combining a consolidated balance sheet, a routine manual finance task is required to deduct the intercompany items between a parent and its subsidiary. This can be done either through manual adjustment (manual entries) or through dummy account postings.

However, there is a feature in BI that allows this deduction to be done automatically, using the elimination feature of the key figure infoobject (SAP reference). The elimination is done per characteristic pair, or via a start routine in the transformation for a more complex approach such as elimination at parent-child level (intercompany elimination). The elimination figures are calculated in a separate flow and consolidated in the financial multiprovider for reporting. In a global environment, the different regions must have standard and consistent master data that is referred to during the elimination, such as product (SKU), selling business unit and buying business unit.

The business rules behind intercompany elimination:
1) IC sales - done as soon as data is available in BI (dynamic) for the following accounts:
  • Internal Net Turnover
  • Bought in Goods
  • Primary Supply Chain Cost
2) IC margin - done monthly
3) Profit in stock (involves IC margin and sales volume) - done monthly. E.g.:
  • 6 month rolling IC margin in June will be based on Jan-June IC Margin / Jan-June IC Sales volume
  • 6 month rolling IC margin in July will be based on Feb-July IC Margin / Feb-July IC Sales volume
4) There are 2 types of elimination:
  • Intracompany Elimination
BI will eliminate when the BU/Entity has an identical corresponding Partner BU/Partner Entity
  • Intercompany Elimination
BI will eliminate when the BU/Entity and the corresponding Partner BU/Partner Entity belong to the same level of the market hierarchy (e.g. same End Market, Cluster, Zone, Area or Region)

Friday, July 15, 2011

The Framework of a BI BAU Organization


Change Management Framework - to assess BI-related changes and impacts from current development and from changes in the feed systems, and to identify the escalation party
Business Release Framework - to ensure the changes land successfully within the BR timeline, the infrastructure is ready and no impacts result from the transports
Quality Assurance Framework - to ensure no, or minimal, risks or defects land in the system

Monday, July 11, 2011

Future of SAP BI?

Ok, we all know BO is gonna replace SAP BEx (though not in the near future), but will HANA wipe out the data warehouse? This is hardware evolution versus data warehousing technology. HANA 1.0 is more like the BWA, but going forward SAP's roadmap is to replace the BW database with HANA, using main memory as a cache that stores a copy of the database. The BI application server will still be used on top of HANA, in addition to a separate application server for ECC.

In-memory goes further, eliminating the need for a separate OLTP system: BI analytical capabilities are incorporated directly into the ECC system, allowing all analytics to run on operational (ECC) data. But hey, it's still a long way to go.


Wednesday, June 22, 2011

The BI Delivery E2E Bird Eye View



A successful BI delivery in any organization is an end-to-end process which involves many parties and the bridging of gaps between them. A common oversight in most organizations is the lack of involvement of the business (yellow box) and of the bridges between the different parties (red arrows), which results in a long-term loss of credibility with users and leads to the failure of BI delivery.

Without business involvement from the start, coupled with a tight delivery timeline, the project is eventually pushed to deliver a solution based on technical go-live, as there is a problem securing the business users' involvement at the last milestone of the project. A BI solution that is delivered into BAU without business go-live is at high risk: the bugs and issues only surface later, when the business starts to be engaged and reveals tons of issues with the solution. This eventually leads to a loss of credibility for the BAU organization when it fails to deliver the major fixes required by the business, because those fixes are not ready to be taken on by the Support team and the project team left when the project was declared technically live. In a situation like this, is it politically correct to say the BAU organization is the 'victim' of the whole process? Which party is going to fund the cost if the project team is required to stay back to fix the issues? That is the reason a solution fit to land in BAU mode has to be both technically and business live. The BAU organization should have the authority to raise a red flag to prevent a solution from landing in BAU without proper sign-off from both technical and business.

The other common oversight is the knowledge transfer and skills capability from the Project to the Support team to maintain the solution (blue arrow). If these are two different teams or companies, there will often be a knowledge transfer gap when the Support team takes on major bug fixes or enhancements, leading to long turnaround times that ultimately result in the business losing confidence in the BAU organization's ability to ensure 'business as usual'.

Thursday, June 2, 2011

Transport Watchout!

During the import of BI transports, look out for the following:
1) Double-check whether function groups or shared user exits are intentionally included in the transport
2) Old transports overwriting newer ones (especially in a multiple-solution development environment)
3) Orphan transports left in non-production systems

Tuesday, May 31, 2011

Good Practice in Transformation Rules

It is discouraged to use a field routine to translate fields to their mappings or to read attributes from master data if the source data is huge. This is because the data is then processed one row at a time as the routine runs in the main loop, instead of as a complete data package in an internal table in the start or end routine. The best practices in a transformation are to:

  • Filter data in the start routine and perform business mapping rules in the end routine. In the end routine you usually work with the result package to capture error stack records and for re-transformation.
  • Declare global variables and capture data in a global internal table in the start routine.
  • Use a constant to pass a derived value from one field to the next.
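The pattern behind these bullets can be sketched in illustrative Python (real transformations are ABAP routines; the master data table, sales org values and field names here are all made up): filter in a "start routine", cache master data once, then map the whole package in an "end routine" instead of doing one lookup per row.

```python
# Assumed attribute table, cached once instead of read per row.
master_data = {"MAT1": "FOODS", "MAT2": "DRINKS"}

def start_routine(source_package):
    # Drop rows that should never reach the transformation (cheaper than
    # letting them fail later or land in the error stack).
    return [row for row in source_package if row["sales_org"] != "Z999"]

def end_routine(result_package, md_cache):
    # One cached lookup table serves the whole package.
    for row in result_package:
        row["category"] = md_cache.get(row["material"], "#")
    return result_package

package = [
    {"material": "MAT1", "sales_org": "1000", "qty": 5},
    {"material": "MAT2", "sales_org": "Z999", "qty": 3},  # filtered out
    {"material": "MAT9", "sales_org": "1000", "qty": 1},  # no master data -> '#'
]

result = end_routine(start_routine(package), master_data)
print([(r["material"], r["category"]) for r in result])
```

The equivalent ABAP would select the master data into a global internal table in the start routine and READ from it in the end routine, which is the package-level processing the post recommends.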

Infoobjects Stories

An infoobject is the smallest of the SAP BW building blocks and it stores master data, so it plays a very important part in ensuring a single version of the truth and in representing the correct business definition. It is also crucial to design an infoobject's attributes and compounding key correctly: if designed wrongly, there may be massive re-work, or obsolete and new infoobjects used in different solutions that share the same business definition. That is why the creation and modification of infoobjects should be controlled by tight governance and approved by a board of experts who are well versed in the BI landscape and the risks of introducing changes to existing infoobjects.

Compounding a master data infoobject to the source system is a must in a multiple-source-system environment and in multiple regional landscapes that consolidate master data at global level. The 1:1 or 1:N relationship between an infoobject and its attributes has to be properly defined, as it won't cater for an N:N relationship in drill-down reporting; e.g. multiple end markets may get their goods from multiple factories, so it would be impossible to capture the drill-down values for end market and factory in a single view.
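Why compounding to the source system matters can be shown with a tiny illustrative sketch (the system names, SKU code and descriptions are invented): the same SKU code in two R/3 boxes can refer to two different materials, so the master data key must be the pair (source system, material), never the material code alone.

```python
# Master data keyed by the compound key (source_system, material).
material_master = {
    ("R3_EU", "SKU100"): "Widget, EU variant",
    ("R3_AP", "SKU100"): "Widget, AP variant",  # same code, different material
}

def describe(source_system, sku):
    # Without the source system in the key, these two entries would
    # collide and one material would silently overwrite the other.
    return material_master[(source_system, sku)]

print(describe("R3_EU", "SKU100"))
print(describe("R3_AP", "SKU100"))
```

A non-compounded key would hold only one of the two descriptions, which is exactly the consolidation problem compounding prevents.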

Thursday, May 19, 2011

BI BAU Role

In most big organizations, the employees retained are expected to add value and are classified as BAU resources; some companies call them Core and Non-Core. In a BI division, one of the Core or BAU roles is the Technical Design Authority. This role covers a wide range of responsibilities, comprising a Build Analyst and a Cutover Analyst role. In a nutshell, a build analyst QAs the build and design against the technical blueprint or proposed fix, while a cutover analyst is responsible for the coordination of transports and activities before and during the business release cutover. The other task, the bread and butter of a TDA, is to perform impact analysis on the changes going into the production box. This often includes cross-application impacts and shared objects in BI, which can be analyzed from the metadata repository. Cross-application changes often require the involvement of the master data team for master data cleanliness and mapping rules, and ERP/APO approval for any datasource changes in the feed system. Other activities under the care of a TDA are process chain scheduling and data reload requirements, which require impact analysis before those tasks are executed by the Support team. A senior TDA with massive project and hands-on experience should be able to review the build and design of the solution proposed by the project team and can approve or reject a particular design. This means he/she needs a detailed understanding of the proposed solution and the system architecture in order to mitigate the risk of a flawed design or a solution that does not work as expected, requires extensive maintenance or does not adhere to the organization's standards.

In any BAU role, the ability to gauge a scenario or issue in terms of urgency and importance, the effort needed to solve it, the risk exposed, the impact and the correct escalation route is important to avoid redundant discussions, long turnaround times and too many processes with gaps in between, which eventually deviate from the real objective: to ensure the business runs as usual. In short, this role requires significant knowledge of the company's BI processes and usage.

Sunday, May 15, 2011

Checklist for Build and Design Review

During the walk-through by developers of changes in the BI environment, there are a number of criteria and standards the development work must meet before it can be promoted to the BI production landscape. This is important to avoid 'bad design' and overlooked design issues that can impact other objects in a complex BI environment. These include:

Review build list objects:
Check against functional design

LSA data flow review

Reporting
  • Infocube dimension build
  • Correct use of infoprovider infoobject as constant where multiproviders used
  • Multiprovider build against standards
  • Query build against standards (CKFs/RKFs)
  • Web template design - efficient use

Scheduling

  • Process chain walk-through & review
  • Housekeeping process chains maintained
  • Correct use of master data process chain
  • Cross-functional object scheduling

ABAP code

CR reference + business release in code description
Performance checklist:
  • Start routine table caching
  • ABAP written efficiently
  • Full loads / deltas used correctly
  • DTP package sizes
  • Web template size

Use of error stack


Reactivation of data flows

In the detailed design review phase, these are the items that need to be QA'd and considered before the build can commence:
  • Business release milestone
  • Assess cross functional impacts
  • Data volumes understood, included in SSD
  • Conformance to a company design and naming convention standards
  • LSA principles followed
  • Process chains
  • Supportability of solution ie. Ease of support
  • Impact on current system / solution
  • Authorizations -roles maintained, role design, central authorization adherence, analysis authorization
  • User numbers
  • Portal impact-portal engagement questionnaire filled out
  • Scheduling - Scheduling impact understood/PSA / Change log deletions considered
  • Future proof - ability to handle steady-state volumes

Wednesday, May 11, 2011

Modelling Methods (Slowly changing dimension)

1) Dependent attribute as a time-dependent navigational attribute (key date can be set in the BEx query)
2) Dependent attribute as a node of an external time-dependent hierarchy
3) Dependent attribute as a node of an external version-dependent hierarchy
4) Put the dependent attribute of your characteristic as a characteristic in the same dimension
5) Time-dependent master data with an intermediate DSO set to overwrite (reprocessing approach: reload all for back-dated master data and select the data to be reprocessed for forward-dated transactional data dates)
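Method 1 above can be sketched in illustrative Python (not SAP code; the profit center values and validity dates are invented): a time-dependent attribute is stored with validity intervals, and the key date set in the query picks which assignment applies.

```python
from datetime import date

# (valid_from, valid_to, value) intervals for one characteristic value,
# mimicking a time-dependent attribute table.
time_dep_attr = [
    (date(2010, 1, 1), date(2010, 12, 31), "Profit Ctr A"),
    (date(2011, 1, 1), date(9999, 12, 31), "Profit Ctr B"),
]

def attribute_on(key_date):
    # The query key date selects the interval that was valid on that date.
    for valid_from, valid_to, value in time_dep_attr:
        if valid_from <= key_date <= valid_to:
            return value
    return None

print(attribute_on(date(2010, 6, 15)))  # Profit Ctr A
print(attribute_on(date(2011, 6, 15)))  # Profit Ctr B
```

Running the same report with two different key dates therefore reproduces either the historical or the current assignment without touching the transactional data.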

Finance term for IT Geek

General Ledger
The central task of G/L accounting is to provide a comprehensive picture for external accounting and accounts. Recording all business transactions (primary postings as well as settlements from internal accounting) in a system that is fully integrated with all the other operational areas of a company ensures that the accounting data is always complete and accurate.


CO-PA
1. The updates that flow to FI and CO from different processes within SAP.
2. Value fields are building blocks of COPA.
3. Different types of updates that flow from FI to CO:
  • during the Supply Chain process
  • during the Material Ledger process
  • from Project Systems and Cost Centre Accounting

CCA
You use Cost Center Accounting for controlling purposes within your organization, making the costs incurred by the organization transparent.

Costs are assigned to their Function so you can determine where costs are incurred within the organization. From these cost centers you can then use different methods to assign the activities and costs to the relevant products, services, and market segments.

WBS
The Project System module in SAP R/3 holds vital information and guarantees constant monitoring of all aspects of a Project. A clear, unambiguous project structure is the basis for successful project planning, monitoring, and control.

You structure your project per the following points of view:
  • By structures, using a work breakdown structure (WBS)
  • By process, using individual activities (work packages)

Project Systems data provides information about Brand Expenditure.


Branded Trade Expenditure which is not related to one specific brand can be captured on cost centers and needs to be allocated to branded WBS elements based on standard allocation keys set once a year. If it relates to only one brand, capture it on the WBS element.

Statistical key figures
In SAP, statistical key figures can be created to enable automatic allocation methods that cycle costs from cost centers to WBS elements. This statistical key figure functionality can be used, for example, for key account contracts.

By setting a statistical key figure once a year, the key account costs captured on cost centers can be cycled automatically to branded WBS elements, based on a set of allocation rules.
The statistical key figure functionality will decrease the manual allocation postings currently done by several end markets.
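The allocation cycle described above is, at its core, a proportional split. A minimal sketch in Python (the WBS element names, key values and cost amount are all hypothetical) makes the arithmetic explicit:

```python
# Assumed statistical key figures, set once a year (WBS element -> key).
skf = {"WBS_BRAND_A": 60.0, "WBS_BRAND_B": 40.0}

def allocate(cost_center_total, keys):
    # Cycle the cost center total to WBS elements in proportion to
    # their statistical key figures.
    total_key = sum(keys.values())
    return {wbs: cost_center_total * k / total_key for wbs, k in keys.items()}

print(allocate(1000.0, skf))  # {'WBS_BRAND_A': 600.0, 'WBS_BRAND_B': 400.0}
```

In SAP the cycle definition and postings are configuration rather than code, but this is the calculation the automatic allocation performs in place of the manual postings.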


Tuesday, May 10, 2011

Error Stack Handling

The ownership of the error stack in BAU mode, whether Support or business users, is bound to be a debatable item for any BI solution. Before the developer switches on the error handling mode in the DTP, consider which are the 'real' records that are supposed to fall into the error stack and which should have been filtered out before the transformation level. Support also has a tendency to switch on the error stack in cases where a lot of data that should have been filtered out causes the process chain to fail.

'Real' records that are supposed to be monitored in the error stack are:
  • data related to mapping logic
  • data dependent on a master data attribute for derivation

The above points to the fact that error stack management should fall to the master data team, business users or SME, as business logic is mandatory to correct those records.

Unnecessary data that should be filtered out (usually in the start routine) is typically data not used by that module. For example, records from certain sales organizations that do not need to be reported can be filtered out before they enter the transformation level.
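The separation argued for above can be sketched in illustrative Python (the in-scope sales orgs, mapping table and field names are invented): out-of-scope rows are filtered up front, while rows that genuinely need business correction land in the error stack.

```python
IN_SCOPE_ORGS = {"1000", "2000"}      # assumed reporting scope
mapping_table = {"MAT1": "CAT_A"}     # assumed mapping maintained by the SME

def process(package):
    # "Start routine" filter: out-of-scope data never reaches the mapping.
    filtered = [r for r in package if r["sales_org"] in IN_SCOPE_ORGS]
    loaded, error_stack = [], []
    for row in filtered:
        if row["material"] in mapping_table:
            row["category"] = mapping_table[row["material"]]
            loaded.append(row)
        else:
            error_stack.append(row)   # 'real' error: needs SME/business action
    return loaded, error_stack

package = [
    {"material": "MAT1", "sales_org": "1000"},
    {"material": "MAT2", "sales_org": "1000"},  # missing mapping -> error stack
    {"material": "MAT1", "sales_org": "9000"},  # out of scope -> filtered
]
loaded, errors = process(package)
print(len(loaded), len(errors))  # 1 1
```

Only the record that truly requires a business decision ends up in the error stack; the out-of-scope record never fails the load at all, which is the behaviour the post argues for.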

Sunday, May 8, 2011

When 0FI_GL_6 and when 0FI_GL_4?

The 0FI_GL_4 extractor works in a different way: init and delta are based on the system date, so only one delta is possible per day. This applies to older SAP PI releases up to 2004.1; there is an OSS note to enable minute-based delta, please refer to OSS Note 991429.

You can obtain detailed information on the 0FI_GL_4 extractor here.

In BI we extract two types of General Ledger information:
  • Opening balances (once a year, starting on 1.1.2011), taken from 0FI_GL_4
  • Line item postings (daily), taken from 0FI_GL_6

If you have ECC version 5 or above, you can opt to use 0FI_GL_10 for transaction figures (balance carry forward) and 0FI_GL_14 for line items. If you have the EhP3 upgrade on your ECC, there is a new option to use 0FI_GL_20 for transaction figures and 0FI_GL_40 for line items. The downside of 0FI_GL_20 is that this datasource delivers after-image deltas, so a DSO is required to enable delta. 0FI_GL_40 is by default not delta enabled because it is specifically created for the RemoteCube.
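Why an after-image delta forces a DSO into the flow can be shown with a tiny illustrative sketch (the document numbers and amounts are invented): each after image carries the full new state of a record, so it must overwrite by key in a DSO active table rather than be added up as an InfoCube would do.

```python
dso = {}  # stands in for a DSO active table keyed by (doc, item)

def apply_after_images(images):
    for img in images:
        # Overwrite semantics: the latest image replaces the old state.
        dso[(img["doc"], img["item"])] = img["amount"]

apply_after_images([{"doc": "100", "item": 1, "amount": 50.0}])
apply_after_images([{"doc": "100", "item": 1, "amount": 75.0}])  # doc changed

print(dso[("100", 1)])  # 75.0 -- additive handling would wrongly give 125.0
```

An InfoCube can only add key figures, so loading these two images directly would double-count the document; the overwrite step in the DSO is what turns after images into a usable delta.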


Outsourcing, BAU and CAB

In most big organizations, resources are often categorized as 'Value Add' and 'Commodities'. The 'Commodities' groups are often outsourced; this includes developers and support. The 'Value Add' group is the BAU organization and often comprises management and gatekeepers who contribute to impact analysis and decision making on actions or escalation points when an issue or change arises. They tend to form a lean organization structure that participates in advisory, coordination and management work. In order to have a standard, agreed escalation point, either to the project as an enhancement or to support as a bug/break fix, the Value Add group's decisions need to be governed by a set of rules called the 'CAB exemption list'.

As structures and solutions differ between organizations, the way this exemption list evolves differs too, but the baseline for arriving at standard, agreed ownership of issues and a clearly defined process has to be the content of the list. The list is an ongoing effort to record the ownership of issues per object, together with the different scenarios that could trigger a change to that object. E.g. for a DTP, a missing DTP is categorized as CAB1 as it impacts the daily process chain, while a new DTP filtering rule is CAB2 as it may be an enhancement to cater for new business rules. Any new scenario needs to be discussed in the CAB meeting to agree which category it falls into, CAB1 or CAB2. In a nutshell, CAB1 is the list of responsibilities for bug and break fixes, and CAB2 is the list of objects that involve enhancements to the existing design. Small or easy enhancements, such as adding values to a filtering condition, can be addressed by the Support team, whereas multiple or big enhancements can be grouped into a mini project that falls to the project side.

It is still debatable whether the SME falls under the 'Commodities' or the 'Value Add' group. This questions the reliability of outsourcing the entire development work to vendors, with only indirect participation of BAU during the build review sessions. An SME for a BI solution needs a thorough understanding of the technical and functional requirements of that solution, and SMEs are the best people to perform impact analysis on a proposed solution. Most BI impact assessments are technical, since BI is the receiving system of business changes in the feed systems (R/3, external systems or APO), so the technical impact assessment can often be done with the help of the BI metadata repository and a where-used lookup. There are exceptional cases in which changes in R/3 or a feed system have to be impact-analyzed at BI level because the datasource or content is 'modified' at source system level (please refer to my other post on this). The projects are often the experts on the solutions because they designed them, and the gatekeeper will not have the necessary detailed insight into the design and build beyond blueprints and walk-through sessions. The gatekeeper may not be involved in support issues either, unless the support team is not performing according to its job scope. Support, on the other hand, is the group that understands the ins and outs of the design's flaws, as they are the front line for any issues logged by business users.

It is also a common oversight of role feasibility when a BAU organization aiming to be a lean IT services organization starts to streamline all 'subjects' (especially in BI, which consists of a range of different solutions for different reporting purposes) into one single SME role, and further extends that role into a 'Value Add' gatekeeper role held responsible for any changes that land in the system.

On top of all that, the next successful outsourcing strategy will evolve around the concept of a 'long term partnership' rather than a mere vendor-client relationship, as it takes a close-knit working relationship to form a successful IT organization within a corporate company.

The thing about APO-BI Integration

The major issues with data reconciliation between APO and BI are:
1) In APO, planning does not use the source system, but in BI the materials are compounded to the source system, because multiple R/3 regional boxes feed data to one regional SAP BI instance, which in turn consolidates with the other regional BI instances into the global BI instance. There are incidents where the same SKU exists in two regional R/3 sources but refers to different materials.
2) In APO, planning is not done in the base UOM or ISO UOM; it always plans in PUM. Hence when data is sent to BI, the conversion must go from PUM to the base UOM and then to the ISO or CORE UOM.
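The conversion chain in point 2 is just two multiplications applied in sequence. A minimal sketch (all conversion factors here are made up; in practice they come from the material's unit-of-measure master data) makes the PUM-to-ISO path explicit:

```python
# Assumed conversion factors for one material.
factors = {
    "PUM_to_base": 12.0,   # e.g. 1 carton (PUM) = 12 pieces (base UOM)
    "base_to_iso": 0.001,  # e.g. 1 piece = 0.001 thousand pieces (ISO)
}

def pum_to_iso(qty_pum):
    # Chain the conversions: PUM -> base UOM -> ISO/CORE UOM.
    qty_base = qty_pum * factors["PUM_to_base"]
    return qty_base * factors["base_to_iso"]

print(pum_to_iso(500.0))  # 500 cartons -> 6000 pieces -> 6.0 thousand pieces
```

If either factor is missing or inconsistent between the APO and BI material master, the reconciliation between the two systems breaks, which is exactly the issue described above.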

Saturday, May 7, 2011

Common Bugs, Breaks and Enhancements

Whether in the phase of transitioning a project into BAU or already in BAU mode, there are common major bugs, breaks and enhancements in SAP BI that come along with the standardization of master data (which runs in parallel with ERP convergence). There is always a long back-and-forth discussion on which party is responsible for those changes and which category the desired fix falls into: a project defect (bug), a break (resulting from changes to other things) or an enhancement. This is important as it determines who will fund the fix.

Here is a list of common bugs, breaks and enhancements in a highly integrated SAP BI platform:
Bugs
  • Web template functionality
  • Inconsistent coding logic applies in routines with the same business rules

Breaks
  • Changes in shared user exit
  • Changes in the shared infoobjects

Enhancement
  • Adding navigational attribute
  • New flow/enhanced datasource to pick new fields

Wednesday, March 9, 2011

Integrated system and impacts

BI in a change management environment has its challenges in terms of dependency on other source systems such as ERP and APO. Generally, changes in ERP, APO or any feed system have an impact on BI, as BI depends on the source systems for data. Changes in a feed system can be driven by new business requirements (functional) or be technical. Examples of these are:

New CO-PA analysis object
This impacts CO-PA extraction, as the generic extractor for cost-based CO-PA is value-field specific; any new value field requires a new datasource to be generated and a new initialization.

Realignment of CO-PA
CO-PA realignment is done on the ERP side when there is a need to move SD-related data in CO-PA from profit center A to profit center B. Any configuration done at operating concern level that creates additional entries in table CE4XXXX will require reinitialization of the CO-PA datasource on the BI side. The date of the re-init has to get the business users' approval, as to when the new change in R/3 is to be reflected in BI.

Any ERP realignment from a project or enhancement CR, or any activity on transaction KEND in the production system via a firefighter ID, will impact the CO-PA delta extraction in BI.

Scenario:
Finance
1) The CO-PA datasource is used in F17 reports for data above gross margin and sales volume.
2) Realignment occurred twice in the Production server of region A, on 05.01.2011 and 09.12.11, to move SD-related data in CO-PA from the BAHEXP to the MEBIXP profit center and from ZAFIXP to BSAIXP.
3) This caused the delta extraction from the CO-PA datasource (1_CO_PA500AM01_1) in S+ to fail on 06.01.2011 and 24.12.2010, because delta updates are no longer possible when the data is inconsistent between the OLTP and BI. This is a known SAP scenario, as stated in OSS Note 400576.
4) Realignment in ERP requires reinitialization and reload of the BI CO-PA datasource. The date of reinitialization also depends on whether the changes affect historical data.

* Please refer to the attachments for evidence of the delta update failure in BI due to realignment:
Note 400576
KEB2 printscreen

Marketing
1) Marketing extracts billing data from ERP system region A on a daily basis, and no data has been retrieved for end-market-specific business rules since the beginning of November 2010.
2) The interim solution from BI is to ask the end markets to enter the data manually until the sales volume flow is switched from SD Billing to COPA.

COPA additional value field
An example: VAT is calculated in the CO-PA report in ERP system region A but not in the BI report. To get the VAT amount into BI, it needs to be captured separately in another value field.

Hence a new extractor for that particular controlling area needs to be generated to capture the two new VAT value fields:
VVU88 VAT on Bulk Discount
VVK89 VAT Amount
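A simple way to picture the regeneration trigger: compare the required value fields against what the generated datasource's extract structure currently contains. The two field names come from the post; everything else in this sketch (function name, the dummy extract structure) is an assumption for illustration.

```python
# Hypothetical mapping of the two new CO-PA value fields to their
# descriptions; field names are from the post.
NEW_VALUE_FIELDS = {
    "VVU88": "VAT on Bulk Discount",
    "VVK89": "VAT Amount",
}

def fields_missing_from_extractor(extract_structure: set[str]) -> list[str]:
    """Return the new value fields not yet in the generated datasource's
    extract structure -- if non-empty, the extractor must be regenerated."""
    return sorted(f for f in NEW_VALUE_FIELDS if f not in extract_structure)

# The current structure has VVU88 but not VVK89 (example data):
print(fields_missing_from_extractor({"VV010", "VVU88"}))  # -> ['VVK89']
```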

Material Classification characteristics
The attributes of a material classification characteristic, such as length and unit of measure, have to be consistent across the R/3 instances so that the structure generated in Development can be used in Test, Regression and Production when it is imported into the other landscapes during cutover. If they are not consistent, extraction from that material classification will fail. E.g.: the UOM of Material A is maintained in the Regression server as GM but is empty in the Development box.
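The cross-instance consistency check described above can be sketched as a comparison of characteristic metadata per system. This is an illustrative sketch only; the system names and metadata shape are assumptions, not an SAP structure.

```python
def inconsistent_characteristics(systems: dict[str, dict[str, dict]]) -> list[str]:
    """Compare classification characteristic metadata (length, UOM, ...)
    across R/3 instances; return characteristics whose definitions differ
    or are missing somewhere, which would break the generated structure."""
    issues = []
    all_chars = {c for meta in systems.values() for c in meta}
    for char in sorted(all_chars):
        defs = {name: meta.get(char) for name, meta in systems.items()}
        # More than one distinct definition (including None = missing) is a mismatch.
        if len({repr(d) for d in defs.values()}) > 1:
            issues.append(char)
    return issues

# Material A's UOM is GM in Regression but missing in Development:
systems = {
    "DEV": {"LENGTH": {"len": 10}},
    "REG": {"LENGTH": {"len": 10}, "UOM": {"val": "GM"}},
}
print(inconsistent_characteristics(systems))  # -> ['UOM']
```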

APO datasources
APO datasources such as MALO and MALORE that have been modified on the APO side need to be replicated to BI, and the change needs to be coordinated within the Business Release calendar so that it does not impact the daily data load.

Market migration
Market migration to another regional ERP box has an impact on the data BI is extracting, as new business rules have to be defined in BI to pull data for the migrated markets from the other system.

Changes or standardization of master data
Decommissioning any material characteristic, or any of its values, that is used in BI has an impact on the master data in BI. Depending on whether the attribute was modeled in the cube as navigational or display, the effort can involve a reload of data and missing historical values.

Logical system name conversion
Changes to a logical system name have to be strictly coordinated with the dependent systems such as BI and EBI APO, so that the datasources point to the correct logical system and have the correct technical names. It is very important that this is also reflected in table RSLOGSYSMAP, so that the logical system mapping is applied correctly to the DTPs in the process chains when transports are imported into the Test, Regression and Production systems.
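The conversion behaves conceptually like a lookup-and-rewrite of source-system references at import time. Below is a minimal sketch of that idea; the logical system names and the object shape are invented for illustration and are not the real BW mapping mechanism.

```python
# Illustrative RSLOGSYSMAP-style conversion: when a transport is imported
# into the next system, source-system references in objects such as DTPs
# are remapped from the origin system's logical system name.
LOGSYS_MAP = {  # original logsys -> target logsys (example entries)
    "ERPDEV100": "ERPQAS200",
    "APODEV100": "APOQAS200",
}

def convert_logsys(objects: list[dict]) -> list[dict]:
    """Rewrite the logical-system reference of each transported object;
    unmapped names are left untouched (and should be flagged separately)."""
    return [{**o, "logsys": LOGSYS_MAP.get(o["logsys"], o["logsys"])}
            for o in objects]

dtps = [{"name": "DTP_COPA", "logsys": "ERPDEV100"}]
print(convert_logsys(dtps))  # -> [{'name': 'DTP_COPA', 'logsys': 'ERPQAS200'}]
```

A missing mapping entry is exactly the failure mode the post warns about: the imported DTP would silently keep pointing at the old logical system.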

Master data

An example is a change of area and zone for a business unit. This is triggered by the business and channeled to the MDM team. There is a need to bridge MDM to BI Support, as the downstream impact of the master data change requires alignment of the hierarchies and InfoProvider master data.

Monday, February 28, 2011

BI Reality

The reality is:
1) The adoption level and usage of the reports drive the BI business and the demand for services from vendors and shared services.
2) The business pays for enhancements and new reports.
3) Data accuracy is most important (followed by the availability of reports), and a lot of build defects or missing/incorrect business rules are only discovered at a later stage. In a multiple-SAP-R/3 environment, data accuracy and the feasibility of consolidating data from multiple systems rely on data standards and business rules. The data standards are usually adopted in the source system and extracted to BI, but in some cases there is a requirement to standardize the master data in the BI layer because of the complexity of getting it done in R/3, e.g. to measure the consolidated sales of the same product that is named differently in different systems.

Wednesday, February 23, 2011

Global and Regional Authorization Concept

Concept:
- Global AA role (A)
- Global role (B)
- Global composite role (A+B+E)

- Regional AA role (C)
- Regional role (D)
- Regional composite role (C+D)(E)

Authorizations can be inserted into roles, which determine what type of content is available to specific user groups.

Authorization Objects
Authorization objects enable you to define complex authorizations by grouping up to 10 authorization fields in an AND relationship to check whether a user is allowed to perform a certain action. To pass an authorization check for an object, the user must satisfy the check for each field in the object.
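The AND relationship can be shown in a few lines: the check passes only if every field in the object passes. This is a conceptual sketch, not the SAP authorization kernel; the field names below are illustrative.

```python
def authorized(user_auths: dict[str, set[str]],
               auth_object: dict[str, str]) -> bool:
    """An authorization object groups fields in an AND relationship:
    the user must hold the required value for every field."""
    return all(required in user_auths.get(field, set())
               for field, required in auth_object.items())

# Illustrative object: display (ACTVT 03) for company code 1000.
rs_object = {"ACTVT": "03", "COMPCODE": "1000"}
user = {"ACTVT": {"03"}, "COMPCODE": {"1000", "2000"}}
print(authorized(user, rs_object))            # -> True
print(authorized({"ACTVT": {"03"}}, rs_object))  # -> False (one field missing)
```

One missing field fails the whole object, which is exactly what the AND relationship means in practice.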

Use transaction SU21 to maintain the authorization objects. The major BW objects start with RS.

Analysis Authorization (AA)
AA defines the semantic data slices a user is allowed to see in reporting, e.g. all data belonging to the company codes resolved by variable xxx through a user exit at query runtime. The InfoObjects have to be flagged as authorization relevant.


Demand IT in BI

  • Ensures the reports are utilized
  • Bridges the end users and the BAU organization to obtain funding for valid/required changes, especially master data changes and standardization
  • Ensures business users' availability for UAT

Business Release in BI

Changes to be promoted into the Production systems of a big organization's IT landscape, which includes several regional instances, have to adhere to the business release timeline.

This is to ensure that impacts are assessed and that minimal risk is introduced into the production environment. There is also a need to cross-check dependent changes, to ensure the correct sequence of importing the changes into all the different systems is followed.

This is very important to:
  • Prevent the newest changes from being overwritten by an old transport (especially orphan transports and transports not in the build list)
  • Ensure datasources are imported first and replicated afterwards, e.g. shared datasources like 2LIS_02_SCL (Purchase Order History), which is shared between SRM and BI (Procurement solution)
  • Respect the data loading and transport sequence (top-down dependency and cross-solution dependency)
  • Avoid inactive objects being discovered only after the development system has been opened for new changes, when a re-transport is impossible
  • Ensure shared datasources and dataflows are not impacted, e.g. OTIF (sales forecast) and COPA (sales volume)
  • Ensure the conversion of logical system names is done in parallel (in the same BR) for all the target systems, like BI and APO, that feed on the same ERP system when that feed system changes its logical system name
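The import-sequence concerns above are a dependency-ordering problem, which can be sketched as a topological sort over transports. The transport names and the edges are assumptions chosen to mirror the bullets, e.g. a shared datasource must be imported and replicated before the dataflow and process chain that depend on it.

```python
from graphlib import TopologicalSorter

# Each key depends on the transports in its value set (example edges):
deps = {
    "TR_DATAFLOW": {"TR_DATASOURCE"},     # replicate the datasource first
    "TR_PROCESSCHAIN": {"TR_DATAFLOW"},   # the chain references the DTPs
}
import_order = list(TopologicalSorter(deps).static_order())
print(import_order)  # -> ['TR_DATASOURCE', 'TR_DATAFLOW', 'TR_PROCESSCHAIN']
```

A cycle in the dependencies raises an error here, which in business-release terms would mean the change set cannot be imported in any safe order and must be split.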

Change Management in BI

The key to success for a global platform is strong governance. To achieve strong governance in a big organization, a robust and effective process has to be in place. This includes a deep understanding of the business and technical changes that impact BI. One rule of thumb in BI change management is that changes in the target system won't impact the feed system, but changes in the source system might impact the target system. For example: a CO-PA realignment is done in R/3 and changes historical master data, but BI must reinitialize its delta. Or: new characteristics are created under a new class for a new/existing material group, and in order to report on the new attributes, BI needs to regenerate its material classification datasource in R/3.

A further criterion for successful change management is the ability to understand the root cause of issues both technically and functionally, and to assess the risk and effort level accurately. Knowing how to identify and fill the gaps between in-source responsibility and outsource capability is equally important to drive an efficient BAU process and avoid redundant processes.

The objective of a change process is to provide a standard and a control for the changes that land in the system, in order to mitigate the risk of defects and impacts. But the control must be flexible enough to handle different scenarios, ranging across project mode, project-to-BAU transition, shadow support, post go-live, warranty-fix mode, technical go-live, business go-live, etc. During a warranty period or post go-live, for example, it is impractical to demand a CR for every fix, as it is not uncommon to spot a handful of bugs and breaks in that period. In some cases, when the project and the Demand IT team do not agree on the project release timeline, there will be a release for the technical go-live, and only once business users have adopted the BI reports does the project move to the business go-live stage. Such a scenario is very complex to handle in BAU management and project resource management, especially when the technical warranty period has lapsed but more issues are encountered during business go-live. As such, the change management process should be flexible enough to apply different levels of control at different stages.

Friday, February 18, 2011

BI Readiness in the Global Arena

  • The adoption level of the regional business users in utilizing the BI reports as a reliable source for decision making
  • A BI roadmap that ensures strategic implementation, maintainability and governance adhering to a tactical operational model
  • Standard business rules, which might impact the data mapping and conversion factors, agreed by all regional stakeholders
  • Readiness of standard master data
  • Established interdependency between BI and the ERP/APO/source systems
  • Existence of Demand IT and a body to govern the functional changes
  • Existence of an Information Office to bridge the business users, Demand IT, the Solution Delivery and the Solution Center
  • A solid and efficient business release management process
  • Strong governance and efficient change management process
  • Good partnership of BAU organization with project and support team