Thursday, January 25, 2018

BW Relationship with EPM 10.1


For those BW consultants who have experience executing the UJ* transactions from the backend and writing script logic, the new EPM 10 interface is a breeze, as they can easily relate the interface to the backend programs.


The new interface is designed to enhance user intuitiveness.

Web-based EPM 10.1

Data Manager

Tcode:

UJFS - Gives you information about the files on the BPC server, like AL11 for BPC

UJKT - A test tool which allows you to test script logic in the backend, such as performing a carry forward via the backend, which correlates to the Carry Forward activity under EPM 'My Activities'
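As a rough illustration, below is the kind of script logic one might paste into UJKT to test a carry forward. It calls the standard COPYOPENING program; the parameters shown are a common pattern, but the dimension names depend on your model, so treat this as a sketch rather than a ready-to-run script.

*RUN_PROGRAM COPYOPENING
CATEGORY = %CATEGORY_SET%
CURRENCY = %RPTCURRENCY_SET%
TID_RA = %TIME_SET%
*ENDRUN_PROGRAM

In UJKT you supply the environment, model and a data region, paste the logic, then validate or execute it - which mirrors what the Carry Forward Data Manager package does from the frontend.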


At the EPM 10.1 frontend

At the Excel-based Data Manager package

UJBR - Backup and restore tool for BPC objects



Table:

UJP_PROC_STEP or UJP_PROC_STEP_A - check which User ID is in which environment, and the activities performed in BPC. Usually this is used to mark 'X' to 'kill' a user's job when it faces an error, such as in journal posting.



Script Logic:

At UJFS:
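For illustration, here is a script logic file (.LGF) as it might appear in the file service that UJFS browses. This is a minimal sketch; the ACCOUNT members REVENUE and PLAN_REVENUE are made-up names for the example.

*XDIM_MEMBERSET TIME = %TIME_SET%
*WHEN ACCOUNT
*IS "REVENUE"
*REC(EXPRESSION = %VALUE% * 1.1, ACCOUNT = "PLAN_REVENUE")
*ENDWHEN
*COMMIT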

At the EPM Administration:

Useful reference:


https://corporateperformanceintelligence.wordpress.com/2013/08/21/some-important-sap-bpc-transaction-codes/

Tuesday, September 26, 2017

SAP HANA Studio for BW Modeling

For BW 7.5 on HANA, we can use HANA Studio, which is based on Eclipse technology, to model the new BW objects such as Advanced DSOs, CompositeProviders, ABAP CDS views, ODP SAP extractors (leveraging the HANA runtime), etc.

As this version is a bridge to BW/4HANA, some modeling will still be done in the SAP GUI and some in HANA Studio.

To get this started, first we need to download SAP_HANA_STUDIO (the archive will be named something like IMC_STUDIO2_97_3-80000323.SAR) and SAPCAR to extract the .SAR content. To extract SAR files, refer to this link.
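For reference, the usual SAPCAR call to unpack such an archive looks like the line below, run from the directory containing the download (the file name will match your own version):

SAPCAR -xvf IMC_STUDIO2_97_3-80000323.SAR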

After installation, go to Help > Install New Software


You will be prompted for an entry; type in the Eclipse Neon repository - http://download.eclipse.org/releases/neon (the release sequence for HANA Studio is luna -> mars -> neon, as in L, M, N)


Choose ABAP Development Tools and BW Modeling Tools


After installation, restart HANA Studio and change the perspective


Choose BW Modeling


Next, create a new BW Modeling project


Select the SAP connection to the BW system whose objects you need to work with or enhance


Once connected, you are able to view the SAP BW objects in the right panel


Note: If the components cannot be installed on your system from the Neon repository, try installing from the Luna version first and then upgrade.

In HANA Studio, the DSO versions can be differentiated as "Classic" and "Advanced".

Monday, January 30, 2017

All about Data

We have heard a lot about data technologies and tools. There is data modelling, data quality, metadata management, data governance, data stewardship, data lineage, data profiling and so on. Let's figure out just HOW all these concepts fit together. Basically it means this: we need to understand the data we have, design and arrange it in a logical manner, centralize the pool of data, and then map that data to its business definitions by enforcing a standardization and governance process. When we have those in place, we can further understand the quality of the data by setting up business rules to check data correctness, and subsequently trace it back to the original source. This process is a continuous journey; the further we go, the better our data quality and governance become, and yes, corporate jargon calls this the Data Strategy.

So when we have all these concepts and understanding in place, aka the Data Strategy, we can choose a solution from one or a combination of tools from vendors like SAP PowerDesigner, Information Steward, Informatica, Erwin, ER/Studio and Collibra that meets our budget and IT environment, and implement it.





Monday, April 4, 2016

Reality of Business Critical Reports

Recently I had a few sessions conducting assessments across the Support team to understand their concerns and the criticality of the support procedures. I threw in questions like: if the BI system were to be decommissioned now, what do they foresee would trigger a huge impact? To sum up all the conversations that took place: there are many reports supported, but the most critical ones are the financial management reports that use the BI platform for business view mapping and data transformation. The usage of these reports has become an integral part of the support mechanism. Every month end, for financial report submission, there is already a process in place that involves both IT Support and business users, covering data validation and tracing, mapping uploads and changes, all the way to rectifying posting issues when discrepancies are reported between the BI report's total sum value and the operational reports. Hence the criticality and success of BI is actually largely supported by this ongoing process involving both IT Support and the business, which makes its usage an integral part of BAU.

Wednesday, March 23, 2016

Delta Load Implementation

As ETL tools such as Data Services and SSIS evolve, we see requests to extract SAP ECC tables directly from the source (without going through SAP BW). There are a lot of considerations when we land on that approach, mainly around the corporate data warehouse roadmap and data policy, the standard EAI layer and the delta load mechanism.

When this approach is adopted, an organization should evaluate its existing data warehouse, such as SAP BW, and the implications for the single source of truth when data is extracted into another system. Is the staging system supposed to be the global data warehouse in the long run, is it planned to be a regional data warehouse layer, or is it just catering for a silo project as a 'dump' or staging area to transform some data to meet a silo business unit's reporting needs?

On the technical side, I always put emphasis on the delta mechanism, as there is quite an in-depth configuration at the ECC end. The most common delta method is timestamp-based: certain tables store timestamp records when transactional records are added. There are also log tables to detect changes or deletions in records, and during the data load there are additive or overwrite functions for records. Another interesting area to take into account is the initial load. Some huge extractions from multiple tables with complex joins require setup tables to populate the large historical dataset first, before the delta takes place. This is a pointer worth noting for performance considerations.
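To make the timestamp idea concrete, here is a minimal ABAP sketch of the pattern such an extraction could follow. All object names here (ZDELTA_CONTROL, ZTRANS_DOC and their fields) are hypothetical placeholders, not actual ECC tables; a real design would use the source system's own timestamp or change-log tables.

REPORT zdelta_sketch.

DATA: lv_last_ts TYPE timestamp,
      lt_delta   TYPE STANDARD TABLE OF ztrans_doc.

* Read the high-water mark saved by the previous extraction run.
SELECT SINGLE last_ts FROM zdelta_control INTO lv_last_ts
  WHERE extract_id = 'TRANS_DOC'.

* Pick up only records created or changed after the last run.
SELECT * FROM ztrans_doc INTO TABLE lt_delta
  WHERE change_ts > lv_last_ts.

* Hand lt_delta over to the ETL tool, then advance the mark.
GET TIME STAMP FIELD lv_last_ts.
UPDATE zdelta_control SET last_ts = lv_last_ts
  WHERE extract_id = 'TRANS_DOC'.

The additive-versus-overwrite decision then happens at load time in the target, depending on whether the source delivers after-images or additive deltas.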

I came across a design using Data Services to extract data directly from huge ECC tables which, prior to loading, reads from an ABAP program that identifies the delta mechanism or logic, such as reversals of postings. Whatever approach the organization takes for data extraction, it is good to cover all these areas during the IT strategy or solution stage, from the BI roadmap to the technical and security assessment of the solution.

P/s: this is a good read on delta methods.

Monday, February 15, 2016

The Foundation Must Be Right

We hear a lot about the power of business intelligence through predictive analysis and data mining, the need for dashboards for overview and decision making, and the empowerment of operational users through self-service reporting. These are the outcomes of successful BI practices, supported by mandatory pillars - BI Framework, Data Foundation and Business-Driven Values.

Thursday, January 14, 2016

BI Improvement

I was assigned to lead the BI improvement task in my organization, and I started off with a very detailed study of the existing BW environment, from the start of the system to date. Telling a technical story is always based on facts and statistics; linking statistics to improvement is the challenge. There are thousands of BW objects, and knowing where to start is a challenge in itself.

So I started out by assessing the current state from three perspectives -> Common Shared Layer, Data Integrity and Usage. I tracked down the objects in a chronological manner, by years and by projects. To determine common shared objects and layers, I went into the main MultiProviders and drilled down to the DataSources. I listed down all the DataSources and the direct propagation layers. Next, I listed down all the main InfoObjects used in the main queries and mapped them to their business context. And I checked the last access date of each query.

With these analysis results, I am able to derive statistics on the number of projects that do not share the common layers and objects (to prove that the projects run in silos), on how many master data objects with the same business context were created (to prove that there is no single source of truth for reporting within the same ECC module), and on the last access of the queries (to prove the BI usage). This analysis will act as the current state against which to measure how much we are able to improve, by comparing the same statistics after implementing the improvement plan.

And how does this relate to cost savings over time? That is my next challenge: to come up with facts and findings that can tell this 'story' from a financial perspective.


Monday, January 11, 2016

Data Lineage in SAP Environment

It is great to hear that SAP metadata management covers the end-to-end SAP environment, leveraging Information Steward features such as Metadata Management and Metapedia. But the tool by itself cannot be a standalone answer when it comes to the purpose of data lineage initiatives, which is to ensure data accuracy and data standardization. We still need to enforce a process to govern the custom master data metadata in the BW environment, which must be rooted to the source of the master data in ECC.

This is a good read on Metadata Management in SAP Landscape.

Tuesday, December 29, 2015

A Look at SAP BI Servers Landscape

In many organizations that are going into the maturity stage of BI, additional service offerings will be brought into the business as the technology evolves. Many have already embarked on BusinessObjects 4.1 as their BI frontend reporting tool and SAP BW as their data warehouse. When the need for Data Services and Information Steward arises for the purpose of data profiling and cleansing, the existing landscape has to be scalable, so that the first emphasis is always on a single source of truth for reporting; this refers very much to data structures and master data. Below is an example of the architecture landscape proposed along those lines.



Please also note that BO 4.1, DS and IS share the same server, but each with its own repository and services. Architecturally, IS and DS are inseparable, in that IS relies on DS. They also have a lot in common: both leverage Information Platform Services (IPS), and both rely on CMS services for centralized user and group management, security, administrative housekeeping, RFC Server hosting, and services for integrating with other SAP BusinessObjects software (i.e. BI Launch Pad).

For more information:
http://scn.sap.com/community/information-steward/blog?start=0


Friday, March 27, 2015

Relating Visual Analytics with daily life

Recently we implemented the Goods and Services Tax of 6%, and this is how we can view it in a heatmap, a way to illustrate visual analytics in a daily-life event. The area indicates the proportion of GST to monthly salary, and the intensity of the color depicts the size of the figures: the higher or bigger the value, the darker the color.


Wednesday, August 20, 2014

Cloud Computing... I am coming

Lately I have been bombarded with the three famous pieces of technical jargon - IaaS, PaaS and SaaS. It depends on the vendor; SAP puts it as IaaS, DBaaS and PaaS.

As we mature into cloud technology, it is really time to think about a plan to adopt this robust framework and, more importantly, to plan out a strategic transition phase to move from an on-premise platform to the cloud. There will also be an intermediate phase in which we need to factor in extending existing on-premise solutions to the cloud solutions.

It is exciting to imagine that in the future everything is accessible from the cloud, with nothing we need to install or store locally on our own devices. We can tap on any device and start to work! We can use any application on a subscription basis, in a way allowing deployment to be more agile and flexible.