Friday, April 9, 2010

UOM

Often, in a complex environment with multiple regional SAP instances that does not have a standard UOM defined (or is in the midst of defining one), BI is required to 'define' the standard UOM, which is governed by an existing or new data standardization team. The standard UOM is specific to the company's business rules and does not necessarily refer to the international ISO code. In order to convert the transactional UOM to the standard UOM, the logic is derived from the R/3 unit-of-measure conversion tables, such as the material units of measure (MARM, extracted to BI as 0MAT_UNIT) and T006. The standard UOM and base UOM also need to be captured in the material group or material master data so that there is a base and a target UOM that can be referred to for the conversion.
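As a rough illustration, a conversion routine in the transformation could read the material's conversion factors and derive the base quantity. The sketch below assumes the R/3 table MARM (0MAT_UNIT in BI); the variables iv_matnr, iv_uom_trans and iv_qty_trans are made-up names for the incoming record fields.

* Sketch: convert a transactional quantity into the base UOM using the
* material's conversion factors (MARM in R/3 / 0MAT_UNIT in BI).
DATA: lv_umrez    TYPE marm-umrez,        " numerator  (alternative -> base)
      lv_umren    TYPE marm-umren,        " denominator
      lv_qty_base TYPE p DECIMALS 3.

SELECT SINGLE umrez umren
  FROM marm
  INTO (lv_umrez, lv_umren)
  WHERE matnr = iv_matnr                  " material of the transaction record
    AND meinh = iv_uom_trans.             " transactional (alternative) UOM

IF sy-subrc = 0 AND lv_umren <> 0.
* 1 alternative unit corresponds to UMREZ / UMREN base units
  lv_qty_base = iv_qty_trans * lv_umrez / lv_umren.
ELSE.
* no conversion factor maintained: assume the quantity is already in base UOM
  lv_qty_base = iv_qty_trans.
ENDIF.
* a second step would then convert from the base UOM to the company's
* standard UOM (e.g. via T006 or a factor held on the material group)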

Wednesday, April 7, 2010

Step 1,2,3 & Validation check + message class

The enhancement RSR00001 (BW: Enhancements for Global Variables in Reporting) is called up several times during execution of the report. Here, the parameter I_STEP specifies when the enhancement is called.

I_STEP = 1
Call takes place directly before variable entry. Can be used to pre-populate selection variables.

I_STEP = 2
Call takes place directly after variable entry. This step is only started up when the same variable is not input ready and could not be filled at I_STEP=1.

I_STEP = 3
In this call, you can check the values of the variables. Triggering an exception (RAISE) causes the variable screen to appear once more. Afterwards, I_STEP=2 is also called again.

I_STEP = 0
The enhancement is not called from the variable screen. The call can come from the authorization check or from the Monitor. This is where you want to put the mod for populating the authorization object.
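A skeleton for this exit (the customer include ZXRSRU01 behind EXIT_SAPLRRS0_001; the variable name ZCALDAY is just an example) would look roughly like this:

* ZXRSRU01 - skeleton of the I_STEP handling in enhancement RSR00001
DATA: l_s_range TYPE rsr_s_rangesid.

CASE i_step.
  WHEN 1.                                  " before the variable screen
    IF i_vnam = 'ZCALDAY'.                 " pre-populate an example variable
      l_s_range-sign = 'I'.
      l_s_range-opt  = 'EQ'.
      l_s_range-low  = sy-datum.
      APPEND l_s_range TO e_t_range.
    ENDIF.

  WHEN 2.                                  " after the variable screen
*   derive values for variables that are not ready for input

  WHEN 3.                                  " validate the entered values
*   e.g. send a message from your message class (RRMS_MESSAGE_HANDLING)
*   and RAISE an exception to bring the variable screen up again

  WHEN 0.                                  " call outside the variable screen
*   e.g. fill values used by the authorization check or the monitor
ENDCASE.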

Full text available here.

Global Exchange Rate Program

The exchange rate is shared by financial reports from different projects, such as Management Information (MI) and Product Costing. Thus ensuring that both take and define the same rate is crucial. Rounding adjustments and the inverse flag are some of the items that should be factored in when designing the reports.

In a global environment, the exchange rate is retrieved from a single point, e.g. from ft.com.
The exchange rate is usually downloaded to R/3 and then transferred globally to BI. Alternatively, BI can have its own exchange rate program that downloads the latest rates for Budget, CO-PA, CO-PC, etc.
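A simplified sketch of such a BI-side loader is shown below; the file layout and field mapping are assumptions, and in a real program the posting would go through the standard exchange rate interface rather than a direct table update.

* Sketch: map downloaded rates into TCURR-style entries for upload.
TYPES: BEGIN OF ty_rate,
         kurst TYPE tcurr-kurst,           " rate type (e.g. 'M' or a budget type)
         fcurr TYPE tcurr-fcurr,           " from currency
         tcurr TYPE tcurr-tcurr,           " to currency
         datum TYPE d,                     " rate date from the file
         ukurs TYPE tcurr-ukurs,           " exchange rate
       END OF ty_rate.

DATA: lt_file      TYPE STANDARD TABLE OF ty_rate,
      ls_file      TYPE ty_rate,
      lt_tcurr     TYPE STANDARD TABLE OF tcurr,
      ls_tcurr     TYPE tcurr,
      lv_date_n(8) TYPE n.

* ... lt_file filled from the downloaded rate file ...

LOOP AT lt_file INTO ls_file.
  CLEAR ls_tcurr.
  ls_tcurr-kurst = ls_file-kurst.
  ls_tcurr-fcurr = ls_file-fcurr.
  ls_tcurr-tcurr = ls_file-tcurr.
  ls_tcurr-ukurs = ls_file-ukurs.
  lv_date_n      = ls_file-datum+0(8).     " YYYYMMDD
* TCURR keys the date in inverted form (99999999 - YYYYMMDD)
  ls_tcurr-gdatu = 99999999 - lv_date_n.
  APPEND ls_tcurr TO lt_tcurr.
ENDLOOP.
* post via the standard exchange rate interface, then transfer the rates
* to BI with the source system's "Transfer Exchange Rates" function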

Global Authorization

In a global and regional BI system environment, it is crucial to have business access segregation through a set of controlled and standardized roles and analysis authorizations. Hence the BI developer/gatekeeper and the GRC team have to work closely to ensure the roles are used correctly and that new menu roles and analysis authorization (AA) objects are introduced whenever a new set of reports is developed. The Portal team is also involved in creating the menu links in the portal.

One approach is to introduce a centralized Authorization DSO in which the users and their report access privileges are maintained; the access check is then executed through CMOD whenever a report is run. The check identifies the type of BI report/solution and the analysis authorization object the report is based on. The regional Authorization DSO is also replicated to the global centralized DSO, which ensures that users have similar report access at the regional and global levels. Standard forms for users to request new roles have to be in place first, and the existing old roles have to go through a cleanup to reflect the new set of standard authorizations.
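A minimal sketch of that CMOD check is given below; the DSO active table /BIC/AZAUTH_DSO00, its fields and the variable name ZV_PRCTR are all made-up examples, and the real check would follow the naming of your own Authorization DSO.

* ZXRSRU01 - sketch: fill an authorization-relevant variable from a
* central Authorization DSO when the exit is called by the auth check.
DATA: l_s_range TYPE rsr_s_rangesid,
      lt_auth   TYPE STANDARD TABLE OF /bic/azauth_dso00,  " hypothetical active table
      ls_auth   TYPE /bic/azauth_dso00.

IF i_step = 0 AND i_vnam = 'ZV_PRCTR'.     " example authorization variable
  SELECT * FROM /bic/azauth_dso00
    INTO TABLE lt_auth
    WHERE /bic/zuser = sy-uname.           " hypothetical user field

  LOOP AT lt_auth INTO ls_auth.
    l_s_range-sign = 'I'.
    l_s_range-opt  = 'EQ'.
    l_s_range-low  = ls_auth-/bic/zprctr.  " hypothetical profit center field
    APPEND l_s_range TO e_t_range.
  ENDLOOP.
ENDIF.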

Integration Party

A regional BI business release requires alignment with multiple parties, such as:
1) ERP (dependency on datasources)
2) APO (dependency on datasources)
3) Portal (dependency of report published in portal)
4) Global BI (it depends on the readiness of the regional cutover, as the data flows bottom to top or vice-versa)
5) Any other feed system that connects to BI

Three Top Questions To Ask a BI Vendor

I stumbled upon a very good article by Boris Evelson on the Forrester blog that reveals an unpopular truth about the essence of BI in a big organization. The first point really hits the bull's-eye, as ultimately what controls change in BI matters the most in terms of minimizing risk and cutting cost.

Q1: What are the capabilities of your services organization to help clients not just with implementing your BI tool, but with their overall BI strategy?

Most BI vendors these days have modern, scalable, function rich, robust BI tools. So a real challenge today is not with the tools, but with governance, integration, support, organizational structures, processes etc – something that only experienced consultants can help with.

Q2: Do you provide all components necessary for an end to end BI environment (data integration, data cleansing, data warehousing, performance management, portals, etc in addition to reports, queries, OLAP and dashboards)?

If a vendor does not, you'll have to integrate these components from multiple vendors.

Q3. Within the top layer of BI, do you provide all components necessary for reporting, querying and analysis, such as a report writer, query builder, OLAP engine, dashboard/data visualization tool, real-time reporting/analysis, text analytics, BI workspace/sandbox, advanced analytics, and the ability to analyze data without a data model (usually associated with in-memory engines)?

If a vendor does not, believe me the users will ask for them sooner or later, so you'll have to integrate these components from multiple vendors.

I also strongly recommend that the editor discount questions that vendors and other analysts may provide, like:
  • do you have modern architecture
  • do you use SOA
  • can you enable end user self service
  • is your BI app user friendly
because these are all mostly a commodity these days.

CO-PA vs Billing

There are two types of datasources in SAP R/3 that can be extracted to BI for the sales volume figures:
1) CO-PA
2) Billing (2LIS_XXX - PO and SD)

The difference between these two extractors is the point in time at which the data is updated to either the logistics or the accounting tables.

In an order-to-cash business scenario, if the sales volume needs to be measured at the initial stage, where the sales order is keyed into the system, then CO-PA can't be used, as no accounting document has been created yet at that stage. The financial impact only occurs at the goods issue stage, where tables BKPF and BSEG are updated. The final stage of this process in logistics is invoice creation (updating tables VBRK and VBRP). In accounting, the final stage is when the receivable is cleared and payment is received: another accounting document is created in BKPF and BSEG, the open receivable is cleared, and BSAD is updated.

In a procure-to-pay scenario, the early stages (PR to PO) all the way to goods receipt do not involve any financial process until the invoice is received from the vendor and the financial tables BKPF and BSEG are updated. BSAK is updated when the invoice is paid via the payment run and the payable is cleared.
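As an illustration of this linkage, the accounting document belonging to a billing document can be looked up through the reference fields AWTYP/AWKEY in BKPF; the snippet below is only a sketch with an example document number.

* Sketch: check whether a billing document already has an accounting
* document, i.e. whether its values can have reached FI / CO-PA.
DATA: lv_vbeln TYPE vbrk-vbeln VALUE '0090000001',   " example billing document
      lv_belnr TYPE bkpf-belnr,
      lv_gjahr TYPE bkpf-gjahr.

SELECT SINGLE belnr gjahr
  FROM bkpf
  INTO (lv_belnr, lv_gjahr)
  WHERE awtyp = 'VBRK'                     " reference object: billing document
    AND awkey = lv_vbeln.                  " reference key = billing doc number

IF sy-subrc = 0.
* accounting document found - the figures are visible on the accounting side
ELSE.
* not posted to accounting yet - only the logistics (2LIS) extractors see it
ENDIF.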

Saturday, April 3, 2010

CO-PA Profitability Analysis

CO-PA Profitability Analysis is a drilldown report which allows slice-and-dice, multi-dimensional sales profitability analysis by a variety of analytical segments (called characteristics) such as market/region, product group, customer group, customer hierarchy, product hierarchy, profit center etc. Most of these characteristics can be filled in by standard SAP functionality, i.e. from the customer master or material master.
By making Distribution Channel a characteristic, CO-PA enables more flexible analysis. The distribution channel is filled in when the sales order is created. Users can differentiate distribution channels if they need separate analyses (such as wholesale/retail, or goods sale/commission sale/service sale) without maintaining extra master data. (The customer and material master maintenance hassle caused by multiple distribution channels can be solved by VOR1 (IMG Sales and Distribution > Master Data > Define Common Distribution Channels).)
CO-PA also allows you to take in non-SAP data by using External Data Transfer, but the major benefit of CO-PA is its close relationship with the SAP SD module. SD profitability data is automatically sent forward and stored within the same system. In fact, the SAP SD module without CO-PA Profitability Analysis is like air without oxygen; it is pointless using SAP without it.
Overhead cost allocation based on sales is possible with CO-PA, which is not possible in Cost Center Accounting.
A Gross Profit report is possible on a sales order basis (as a forecast, in addition to billing-based actuals) by using cost-based CO-PA. CO-PA is especially important since it is the only module that shows financial figures that are appropriate from a cost-revenue perspective. For this reason, BW (BI) takes data from CO-PA in many projects. BW (BI) consultants sometimes set up CO-PA without the FI/CO consultants' knowledge. CO-PA captures cost of goods sold (COGS) and revenue at billing (FI release) at the same time (cost-based CO-PA). This becomes important when there is a timing difference between shipment and customer acceptance: COGS should not be recognized yet, but the FI module automatically creates the COGS entry at shipment, while the revenue entry will not be created until billing (or FI release). Such cases happen, for example, when the customer will not accept the goods until they finish quality inspection, or when delivery takes months because the goods are shipped by ocean. It is especially delicate to customize the CO-PA InfoSource to map to your specific operating concern; this seems to be easier since ECC 6.0 Enhancement Package 4 (auto-generated).
On this point, CO-PA (cost-based) does not reconcile with FI. But this is the whole point of CO-PA, and this makes CO-PA essential. Account-based CO-PA is closer to the FI module in this respect. Account-based CO-PA was added later on, and it may simply be there for comparison purposes with the cost-based form. Cost-based CO-PA is used more often.
When CO-PA is used in conjunction with CO-PC Product Costing, it is even more outstanding. If the fixed and variable costs in the CO-PC cost component split are appropriately assigned to CO-PA value fields, break-even-point analysis is possible, not to mention contribution margin or Gross Profit analyses. Consider that BEP or GP analysis is then possible at a detailed level such as market/region, product group, customer group etc.
This is no wonder. CO-PC is for COGM and inventory; CO-PA is for COGS. What do you do without CO-PA when you use CO-PC? They work as a set. This doesn't necessarily mean CO-PA is only for manufacturing businesses, though.
Conservative finance/sales managers are sometimes reluctant to implement SAP R/3 because they are frightened to expose the financial figures of harsh reality. Needless to say, it helps boost agile corporate decision-making, and this is where a top-down decision to implement SAP R/3 is necessary. Those managers will never encourage R/3. SAP R/3 realizes this BEP analysis even for a manufacturing company. No other ERP software has realized such functionality yet. Even today, R/3 is this revolutionary if CO-PA is properly used.
The role of capturing COGS and sales figures is even more prominent in the cases of sales order costing or Resource-Related Billing (RRB), and of variant-configurable materials with the PS module or the PM/CS module. (The equipment master of PM/CS is also a must to learn.) After determining WIP through Results Analysis, CO-PA is the only module that displays financial figures that are correct from a cost-revenue perspective. PS is necessary for heavy industry or large organizations; variant-configurable materials are also handy for a large manufacturer or sales company. RRB is usable in non-manufacturing industries and is indispensable for IT or consulting companies. The importance of CO-PA is proven when it is used with these.
Production cost variance analysis is possible by assigning variance categories to different CO-PA value fields in customizing. There are projects that had to develop production variance reports because they kicked CO-PA out of scope without ever considering the SAP standard functionality. Why cripple standard SAP functionality simply because you are ignorant of anything beyond CCA Cost Center Accounting or PCA Profit Center Accounting? Naturally, it takes time to grasp the overall SAP functionality. This is where experience makes the difference, which is no wonder.
Settling production variances to CO-PA raises one issue: variances originating from WIP or finished goods at month end all go to CO-PA, i.e. COGS. Actual Costing using the ML Material Ledger solves this issue for the most part. However, the reallocation of variances whose origin is unknown is only made to COGS and FG, not to WIP. This is something SAP should have rectified a long time ago. They made the excuse that they did not have enough resources to do it, while developing BW, SEM-BCS and New G/L on the other hand. Real-time consolidation became impossible in SEM-BCS, and New G/L does not add much new functionality other than parallel fiscal year variants, in a practical sense. What SAP did was spend all its resources and effort on revising the same functionality with new technology, but not much new was made possible from an accounting point of view.
ML can also be used to reflect transfer pricing or group valuations with specific buckets into CO-PA value fields. An actual cost component split is also possible with ML, but you have to plan well in advance lest you use up the value fields. CO-PA can also handle planning and actual/plan reporting: it has a built-in forecasting engine (the planning framework) and can handle top-down or bottom-up planning (using different versions, as well as plan allocations and distributions from cost centers). An FI/MM interface allows you to post to specific CO-PA value fields for FI- or MM-based transactions (overheads or non-trade-specific charges that impact product profitability, such as trade show expenses).
Configuration of CO-PA can sometimes be a bit of a hassle, but it is far easier, cheaper and quicker than building an InfoCube in BW (BI) from scratch. Even in projects that know what they are doing with BW (BI), the data is often taken from the CO-PA tables. Then why bother creating the additional work of building it in BW (BI)?
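For instance, the cost-based actual line items sit in the generated table CE1xxxx of the operating concern; the sketch below uses the IDES operating concern IDEA and an example value field, which will differ in your own concern.

* Sketch: read cost-based CO-PA actual line items from the operating
* concern's line item table (CE1IDEA is the IDES demo concern).
TYPES: BEGIN OF ty_copa,
         perio TYPE ce1idea-perio,          " period in the form YYYYPPP
         kndnr TYPE ce1idea-kndnr,          " customer
         artnr TYPE ce1idea-artnr,          " product
         vv010 TYPE ce1idea-vv010,          " example value field (e.g. revenue)
       END OF ty_copa.

DATA: lt_copa TYPE STANDARD TABLE OF ty_copa.

SELECT perio kndnr artnr vv010
  FROM ce1idea
  INTO TABLE lt_copa
  WHERE paledger = '01'                     " legal valuation
    AND vrgar    = 'F'                      " record type F = billing documents
    AND perio    = '2010004'.               " example: period 004.2010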
The training course for CO-PA is just five days; a competent consultant should not spare such a small investment. You will see that there is more to learn about it beyond that.
SAP is whimsical and sometimes excludes CO-PA and CO-PC from the academy curriculum. They do not seem eager to give trainees the right understanding of how to use their product. This is why many FI/CO consultants are ignorant of CO-PA and CO-PC, and only an experienced consultant knows its necessity.
CO-PA is a must for an experienced FI/CO consultant.
One point which has to be added: keep the segment-level characteristics as few as possible.
I sometimes hear of users who linked too many characteristics and completely ruined the CO-PA database. If you look in their configuration, they link 5 customer hierarchies, 6 user-defined product hierarchies on top of the standard product hierarchy, the material code, the sales rep at sales order line level, and 4 other user-defined derivation segments as segment-level characteristics. Now, after 3 years of usage, their CO-PA table does not respond with anything other than a short dump.
SAP clearly explains this and dissuades you from simply adding the sales order as a segment level, yet it is always a struggle. Whatever were they thinking in adding 45 segment levels?
These were once matters of controversy: separating data from program logic, data normalization, and the elimination of duplicate entries. That gave birth to the SQL databases SAP runs on. Now SAP users do not know this history and repeat the same failure.
When there is a corporate reshuffle and the product lineup needs revising, maintain the product hierarchy. Why add segment levels every time there is a reshuffle?
No matter how new the tool, there is no end to this. It is not the tool itself; it is the people who twist the case.
A successful usage would be product hierarchy 1, and maybe 2. If a characteristic is configured, segment data is stored at that level. You can download the segment data; this may be a remedy. Data feeding and presentation in BW may be another way.

*Article plucked from it.toolbox.com

Flows to CO-PA

From FI
Value fields are the building blocks of CO-PA
The updates that flow to FI and CO from different processes
- during Supply Chain process
Supply chain processing is linked to the SD (Sales and Distribution) and MM (Materials Management) modules
- during Material Ledger process
At month end, after completing the Material Ledger close, the actual periodic price is generated and the cost of sales is updated at its actual cost for FI and CO-PA
- during Project System and Cost Center Accounting
CO-PA can be reconciled to PS after Settlement
CO-PA can be reconciled to CCA after Assessment Cycle

From SD
http://learnmysap.com/sales-distribution/264-ke4s-post-billing-documents-to-co-pa-manually.html

Friday, April 2, 2010

LIS vs LO

LIS & LO extractors - LIS is the old technique through which we can load the data. Here we use typical delta and full upload techniques to get data from a logistics application. It uses the V1 and V2 update, meaning a synchronous or asynchronous update from the application documents to the LIS structures at the time the application document is posted.

LO Cockpit is the new technique, which uses the V3 update. This is an update that you can schedule, so it does not update at the time you process the application documents but is posted at a later stage. Another big difference between LO and LIS is that LO provides separate datasources at header level, item level and schedule-line level; you can choose at which level you want to extract your data and switch off the others, which results in reduced data volumes. Also, when you configure an LIS structure in ECC, the system environment needs to be opened for changes, whereas this is not required for LO.

Friday, March 26, 2010

Lesson Learnt from Business Release

Part of the data warehouse governance in a corporate company is to have a quarterly business release (BR) in which enhancements and new projects are transported into the Production system. Thus there is a need to set up a regression environment to ensure thorough testing is done before the cutover takes place. The regression environment has to be an exact copy of the Production system in terms of data and configuration.

During this cutover of BR1 to the Regression and Production environments, I encountered numerous issues which I can relate to the nature of the change requests, shared objects, transports, data loading and design.

Change Request
There were two particular change requests that caused an impact on multiple objects in the system (including queries, web templates, InfoObjects, function modules, classes and IP aggregation levels). The first was the reversed sign for data which is supposed to be reported both in P&L and in Overhead. The data is entered in the Overhead manual entry layout with a positive sign, but it needs to be saved with a reversed sign, because the P&L report is supposed to show these particular accounts as negative. We can't change the frontend P&L report, as a sign reversal in the frontend would reverse all the non-Overhead accounts as well. So the only way is to change the layout for the Overhead report to flip the sign and ensure the backend data is saved as a negative value. This change involves all the Overhead and P&L manual entry and actual reports, both queries and web templates.
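For illustration only, the same sign convention could be enforced in an end routine of a transformation; the account pattern and field names below are hypothetical, and in our case the flip was actually implemented in the planning layout and backend save logic.

* End routine sketch (inside the generated end routine frame, which already
* declares RESULT_PACKAGE and <result_fields>): store overhead accounts
* with a reversed sign so the backend data matches the P&L convention.
LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>
     WHERE account CP 'ZOH*'.               " hypothetical overhead account range
  <result_fields>-amount = <result_fields>-amount * -1.
ENDLOOP.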

The second one refers to the inclusion of an additional level in the cost center hierarchy for manual entry input and reporting. If the relationship of the data is defined through navigation/display attributes, then it is important to ensure the master data is correctly updated. If the new level needs to be open for manual entries, a new InfoObject for that level has to be created and included in all the aggregation levels and cubes. Thus there is a need to create a new aggregation level and perform cube remodeling. A new aggregation level also means a new manual entry layout. So we can see that the changes run all the way from InfoObject attributes and aggregation levels to queries (both manual entry and output) and web templates.

Shared Objects
Removal of an obsolete attribute from an InfoObject can cause some of the transformations to fail as those InfoObjects are still in use. Thus the governance of InfoObjects is very important to ensure that changes to any InfoObject (especially those shared between APO and Finance, like material and material group) are carefully assessed for the impact of the change.

Transport
Transport order and its prerequisites are important so that no old change overwrites a newer one. It is good practice to keep one change in one transport request, or to collect only the necessary objects. There are bound to be incidents where objects which are not related to the change or fix are collected into the transport request and cause objects to become inactive when moved to the production environment.

Data loading
Whenever there is a logic change at the transformation level that requires the historical data to be transformed again, the data has to be reloaded to the reporting-level cube. There are different ways to approach this scenario, but previous steps done on the reporting-level data, such as selective deletion, have to be considered. Below are some steps that can be taken:
1) Selective deletion (in this case we can see the importance of including the source module characteristic at that level as well, although the objective at reporting level is to minimize data granularity for performance reasons)

2) Deletion by request (in this case we can see the importance of loading data by request to the reporting layer). The only drawback is that the load can take a long time, as the previous requests were loaded in on a daily basis.

3) Offset the data through a self-loop. This step is quite safe, as the offsetting data is added (nothing is deleted).

Design
Usually there is a need to report from the reconciliation level (data that has gone through all the transformations) for data checking purposes. This means the data at the transformation level and the reporting level has to be the same. In order to ensure the data is the same in the reconciliation reports and the actual reports, the best practice is not to allow any transformation to happen between the transformation level and the reporting level.

Monday, March 15, 2010

SAP Authorization Tables


Authorization Objects Tables

Table Name Description
TOBJ Authorization Objects
TACT Activities which can be Protected (Standard activities authorization fields in the system)
TACTZ Valid activities for each authorization object
TDDAT Maintenance Areas for Tables
TSTC SAP Transaction Codes
TPGP ABAP/4 Authorization Groups
USOBT Relation transaction > authorization object
USOBX Check table for table USOBT
USOBT_C Relation Transaction > Auth. Object (Customer)
USOBX_C Check Table for Table USOBT_C

User Tables
Table Description
USR01 User master record (runtime data)
USR02 Logon data
USR03 User address data
USR04 User master authorizations
USR05 User Master Parameter ID
USR06 Additional Data per User
USR07 Object/values of last authorization check that failed
USR08 Table for user menu entries
USR09 Entries for user menus (work areas)
USR10 User master authorization profiles
USR11 User Master Texts for Profiles (USR10)
USR12 User master authorization values
USR13 Short Texts for Authorizations
USR14 Surchargeable Language Versions per User
USR30 Additional Information for User Menu
USH02 Change history for logon data
USH04 Change history for authorizations
USH10 Change history for authorization profiles
USH12 Change history for authorization values
UST04 User masters
UST10C User master: Composite profiles
UST10S User master: Single profiles
UST12 User master: Authorizations
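
As a quick illustration of how a couple of these tables fit together, the sketch below reads a user's lock status and assigned profiles; the user name is just an example.

* Sketch: read a user's lock status (USR02) and assigned profiles (UST04).
DATA: lv_uflag TYPE usr02-uflag,            " user lock status (0 = not locked)
      lt_prof  TYPE STANDARD TABLE OF ust04.

SELECT SINGLE uflag
  FROM usr02
  INTO lv_uflag
  WHERE bname = 'JDOE'.                     " example user

SELECT *
  FROM ust04
  INTO TABLE lt_prof
  WHERE bname = 'JDOE'.                     " profiles assigned to that user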