Monday, April 4, 2016

Reality of Business Critical Reports

Recently I ran a few assessment sessions across the Support team to understand their concerns and the criticality of the support procedures. I threw in questions like: if the BI system were to be decommissioned now, what do they foresee would trigger a huge impact? To sum up the conversations, many reports are supported, but the most critical ones are the financial management reports that use the BI platform for business view mapping and data transformation. These reports have become an integral part of the support mechanism: every month-end financial report submission already follows a process, involving both IT Support and business users, that covers data validation and tracing, mapping uploads and changes, all the way to rectifying posting issues when discrepancies are reported between the BI report total and the operational reports. Hence the criticality and success of BI is largely carried by this ongoing process, involving both IT Support and the business, that makes BI an integral part of BAU.

Wednesday, March 23, 2016

Delta Load Implementation

As ETL tools such as Data Services and SSIS evolve, we see requests to extract SAP ECC tables directly from the source (without going through SAP BW). There are many considerations when we land on that approach, mainly around the corporate data warehouse roadmap and data policy, the standard EAI layer, and the delta load mechanism.

When this approach is adopted, an organization should evaluate its existing data warehouse, such as SAP BW, and the implication for the single source of truth when data is extracted out into another system. Is the staging system supposed to become the global data warehouse in the long run? Is it planned as the regional data warehouse layer, or just to cater for a silo project as a 'dump' or staging area that transforms some data to meet a single business unit's reporting needs?

On the technical side, I always put emphasis on the delta mechanism, as there is quite an in-depth configuration at the ECC end. The most common delta method is the timestamp: certain tables store timestamp records when transactional records are added. There are also log tables to detect changes to, or deletion of, records. During the data load there are additive and overwrite modes for applying records. Another interesting area to take into account is the initial load. Some huge extractions from multiple tables with complex joins require setup tables to populate the large historical dataset first, before the delta takes place. This is a pointer noted for performance consideration.
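To make the timestamp delta and the additive/overwrite load modes concrete, here is a minimal Python sketch. It is an illustration only, not ECC configuration: the column names (`changed_at`, `doc_id`, `amount`) are hypothetical stand-ins for whatever timestamp and key fields the real source table carries. The extractor keeps a watermark of the last change timestamp it has seen; the loader either overwrites the full record or adds the key figure to the existing one.

```python
from datetime import datetime

def extract_delta(source_rows, watermark):
    """Return rows changed after the watermark, plus the new watermark.

    source_rows: list of dicts with a 'changed_at' timestamp
    (a stand-in for a hypothetical ECC timestamp column).
    """
    delta = [r for r in source_rows if r["changed_at"] > watermark]
    new_watermark = max((r["changed_at"] for r in delta), default=watermark)
    return delta, new_watermark

def apply_delta(target, delta, mode="overwrite"):
    """Apply delta records to the target, keyed by 'doc_id'.

    'overwrite' upserts the full record (after-image style);
    'additive' sums the 'amount' key figure into the existing record.
    """
    for r in delta:
        key = r["doc_id"]
        if mode == "overwrite" or key not in target:
            target[key] = dict(r)
        else:  # additive
            target[key]["amount"] += r["amount"]
    return target
```

The watermark must be persisted between runs, and the choice of mode depends on whether the key figure is summable or the record carries the full current state.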

I came across a design using Data Services to extract data directly from huge ECC tables where, prior to loading, an ABAP program is read that identifies the delta mechanism or logic, such as reversal of postings. Whatever approach the organization takes for data extraction, it is good to cover all the areas during the IT strategy or solution stage, from the BI roadmap through to the technical and security assessment of the solution.
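As an illustration of the kind of delta logic such a program encapsulates, here is a small Python sketch of reversal handling. This is not the ABAP logic from the design above; the `reversed` flag and field names are hypothetical. The idea is that a reversal posting is emitted as a negative-amount delta record so that an additive load nets it out against the original posting.

```python
def normalize_reversals(records):
    """Turn reversal postings (hypothetical 'reversed' flag) into
    negative-amount delta records, so an additive load cancels them
    against the original posting instead of double-counting."""
    out = []
    for r in records:
        amount = -r["amount"] if r.get("reversed") else r["amount"]
        out.append({"doc_id": r["doc_id"], "amount": amount})
    return out
```

After normalization, an original posting of 200 followed by its reversal sums to zero under an additive load.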

P/S: this is a good read on delta methods.

Monday, February 15, 2016

The Foundation Must Be Right

We hear a lot about the power of business intelligence through predictive analysis and data mining, the need for dashboards for overview and decision making, and the empowerment of operational users through self-service reporting. These are outcomes of a successful BI practice supported by mandatory pillars: BI Framework, Data Foundation, and Business-Driven Values.

Thursday, January 14, 2016

BI Improvement

I was assigned to lead the BI improvement task in my organization, and I started off with a very detailed study of the existing BW environment from the start of the system to date. A technical story is always told with facts and statistics. To link statistics to improvement is a challenge. There are thousands of BW objects, and knowing where to start is a challenge in itself.

So I started out by assessing the current state from three perspectives: Common Shared Layer, Data Integrity and Usage. I tracked down the objects chronologically, by year and by project. To determine the common shared objects and layers, I went into the main MultiProvider and zoomed down to the DataSources. I listed all the DataSources and the direct propagation layers. Next I listed all the main InfoObjects used in the main queries and mapped them to their business context. And I checked the last access date of each query.

With these analysis results, I am able to derive statistics on the number of projects that do not share the common layers and objects (to prove that projects run in silos), on how many master data objects with the same business context were created (to prove that there is no single source of truth for reporting within the same ECC module), and on the last access of the queries (to prove actual BI usage). This analysis acts as the current state against which we will measure improvement, by comparing the same statistics after implementing the improvement plan.
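The statistics above can be sketched in a few lines of Python. This is a toy model of the analysis, not a BW metadata extract: the project names, DataSource names and query inventory below are made up for illustration. One function splits DataSources into shared (used by more than one project) versus silo; the other flags queries not accessed since a cutoff date.

```python
from collections import defaultdict
from datetime import date

def shared_vs_silo(project_datasources):
    """Split DataSources into shared (used by >1 project) and silo.

    project_datasources: list of (project, datasource) pairs
    taken from the object inventory.
    """
    projects_by_ds = defaultdict(set)
    for project, ds in project_datasources:
        projects_by_ds[ds].add(project)
    shared = {ds for ds, projects in projects_by_ds.items() if len(projects) > 1}
    silo = set(projects_by_ds) - shared
    return shared, silo

def stale_queries(last_access, cutoff):
    """Return queries whose last access date falls before the cutoff.

    last_access: dict mapping query name -> last access date.
    """
    return sorted(q for q, d in last_access.items() if d < cutoff)
```

Running the same two functions over the post-improvement inventory gives the before/after comparison described above.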

And how does this relate to cost saving over time? That is my next challenge: to come up with facts and findings that can tell this 'story' from a financial perspective.


Monday, January 11, 2016

Data Lineage in SAP Environment

It is great to hear that SAP metadata management covers the end-to-end SAP environment by leveraging Information Steward features such as Metadata Management and Metapedia. But the tool cannot stand alone when it comes to the purpose of data lineage initiatives, which is to ensure data accuracy and data standardization. We still need to enforce a process to govern the custom master data metadata in the BW environment, which must be rooted to the source of master data in ECC.

This is a good read on Metadata Management in SAP Landscape.