Wednesday, August 20, 2014

Cloud Computing... I am coming

Lately I have been bombarded with three famous pieces of technical jargon - IaaS, PaaS and SaaS. Depending on the vendor, SAP frames them as IaaS, DBaaS and PaaS.

As we mature into cloud technology, it is really time to think about a plan to adopt this robust framework and, more importantly, to plan out a strategic transition phase to move from an on-premise platform to the cloud. There will also be an intermediate phase in which we need to factor in extending existing on-premise solutions to cloud solutions.

It is exciting to imagine that in the future everything is accessible from the cloud and there is nothing we need to install or store locally on our own devices. We can tap on any device and start to work! We can use any application on a subscription basis, in a way allowing deployment to be more agile and flexible.

Wednesday, May 21, 2014

Governance and Enforcement

Governance, as we always hear it, is nothing new at all in the IT world. We create a set of standards and conventions to ensure things can be done more efficiently in the long run with minimal risk of downtime to BAU. New developments or changes have to go through an approval board, which some call a CAB. However, most approval processes are merely decisions based on the written regulations in the big governance book, and are not made to help the business achieve the goal of having governance in the first place - that is, to enable things to run more efficiently in the long run while ensuring minimal risk of BAU downtime.

This in turn results in the inefficiency of the governance framework in an organization, as it is treated as enforcement rather than governance. Many governance frameworks are not revised over time, and this contributes to their inefficiency in an environment that is fluid and changes constantly. In BW projects, many project consultants find it hard to perform post-go-live support that requires immediate fixes, as they have to go through rounds of approval for transports and are not allowed to directly create objects in the Production environment that support emergency data loads, such as InfoPackages.

In the end, if the rules in the governance book cannot be bent to suit the business needs and their urgency, this is merely blind enforcement and not a practical way of governing a system. Many may argue that if we allow it to happen once, it will happen again. So in short, it is up to the governance board to exercise their power in tackling difficult situations, and that is where in-depth knowledge, experience and exposure are important criteria for an effective governance authority.



Tuesday, May 6, 2014

R - A Cool Guy with Serious Thoughts on Enterprise Technology

I got to know this cool guy 'R' who is an expert in telling you how disruptive technologies and new business models can impact the enterprise. I guess this is the kind of guy a lot of big organizations need to engage before embarking on a series of new technology adoptions, rather than directly engaging product vendors, who are more biased toward their own offerings.

Hi R,

I stumbled on your very good blog at http://blog.softwareinsider.org and here’s my POV.

Firstly, many organizations are still not mature in terms of BI and how it can really bring in the ROI. They started with BW as the backend and BEx Query, Analyzer and WAD as the frontend. Initially, dashboards deployed from WAD were not that impressive, so BO came into the picture with Xcelsius. Although Xcelsius can produce some impressive features, it is not really integrated with BW objects compared to WAD, and its maintenance cost is very high. Most time was spent developing the dashboards rather than looking into the data and business values. Over the next couple of months, we started to hear about Project Zen, which eventually became Design Studio - something that looks like WAD but with more features such as HTML5 and mobile integration (MOBI). We also have Advanced Analysis and BO Explorer, which can directly consume data from a HANA model.

As the technology advances, we start to hear about Lumira as a Visual Storytelling tool. It is bundled into the SAP HANA licence, so having HANA is mandatory. Lumira can produce very fast and interactive self-service dashboards when it consumes data directly from a HANA model. But wait - remember that in the Gartner Magic Quadrant we have Tableau, which ranks highest in the data visualization and BI space. And SAP even has plug-ins ready for HANA to be consumed directly in Tableau. So there goes another debate for some companies: Lumira or Tableau.

Apart from that, there is also the EPM tool for some power users who access the financial consolidation reports. Imagine what a user gets when he opens Excel early in the morning: BEx Analyzer for reports that are not migrated yet, Advanced Analysis for new reports, and EPM for Consolidation and Planning reports. They also have web-based reports in the BO portal (Webi and Xcelsius), self-service reports in BO Explorer, and some old WAD and BEx reports in the Enterprise Portal. In a real-life scenario, introducing new technologies does not happen in a clean-cut manner but through gradual adoption and migration. All this incurs cost and time. In the end, management will ask the question: is there any alternative to bringing in HANA, BO, Lumira, etc.? As for end users, they will be confused by the many tools they need to adopt.

On the application development side, HANA becomes a game changer through SAP River and the Eclipse platform. So imagine data consumed directly from HANA straight into a web application layer on Eclipse using SAP River - what is the 'new' direction for application development now? Do we shift from the SAP platform to the Eclipse platform? From WebDynpro, workflow and ABAP to Java & SQL?

When the HANA model becomes the major part of data acquisition for all the tools above, no doubt HANA Studio is going to be the new 'RSA1'. And guess what - many BW developers out there are struggling to get a project that gives them the opportunity to pick up skills in HANA Studio.

In terms of BI, it is not only about technology advancement but also about the governance of data quality in the organization. So while the technology advances, most companies still struggle with data quality or with converging SAP ECC into a global instance. To have successful BI, these two main factors have to complement each other and be executed on the right track by the right party within a holistic BI roadmap.

Ok, back to the real question - HANA. What does all this have to do with HANA and the direction? Simple. All of the new technologies and tools are geared towards the HANA platform. It changes not only the company's BI strategies but also the people, processes, innovation trends and jobs. So you can see now how big the impact of this new direction is on the world out there.

There you have it, my POV. I am more of an enthusiast of the whole SAP BI innovation story and how it evolves over time - keen to gain, share and learn from feedback and POVs from experts like you!


Saturday, May 3, 2014

The trends in Analytics - Tableau vs Lumira

We used to hear BI report requests coming from everywhere; then we started to notice everyone suddenly talking about Dashboards, and now we call it 'Visual Storytelling'. I have been a BI developer all my career. On the presentation layer I have worked on Crystal Reports all the way to BEx reports, WAD, Webi, Xcelsius, Explorer, etc. I also had the opportunity to work with non-SAP tools like Actuate and, to some extent, macros in Excel.

In most cases, operational reports can be developed as ABAP reports residing in ECC, except those with complex mappings. For management reports, we need highly aggregated data, and that is where dashboards come into place. It seems that over the years, dashboards without a flow of guided data interpretation, self-service capability and mobility have not been very effective at providing decision-making data on the go. Hence you see Visual Analytics tools such as Tableau, Lumira and QlikView fast becoming an interest in big organizations. The function of a BI developer is more challenging and interesting, as we need to combine our technical expertise with functional knowledge, interaction with the business and some creativity in the visual representation of data to derive a business visual story from the dashboard.

Among the tools for Visual Analytics, I am working on Tableau and Lumira. Tableau seems more mature in the Visual Analytics space, while Lumira is actually SAP BO Visualizer wrapped under the HANA licence, integrating directly with the HANA model. Tableau by far sits at the highest position in the Gartner Magic Quadrant. But I always believe the position in the Gartner Magic Quadrant does not mean anything concrete as the reason a product is selected as an organization's group-wide BI tool. We must choose the right technology based on the integration of business processes and technology platforms, because in the end accuracy and a single source of truth speak the loudest in the Analytics space - not the impressive animation on a dashboard. So it is time for us to really test out the pros and cons of each technology. Having said that, we need to bear in mind that the ultimate winning point should be the ability to tell a good business story accurately, not the name of the product or where it sits in the Magic Quadrant.


Thursday, May 1, 2014

Start looking at HANA as a Platform - not just any platform, but an Innovation Platform!

In any debate or review by Solution Architects over which in-memory technology best fits an organization - given that Microsoft and Oracle databases have their own in-memory capabilities built in - do not just look at HANA as a 'faster horse'. Here are a couple of very valid and professional points from Wally Concil, SAP North America. He iterates that we need to bring HANA in as a platform to enable transformational innovation, and what he means by that is putting the innovation into business processes. Technology is a disconnected business process for most big organizations, and the HANA platform can be expanded to produce an integrated end-to-end process for what used to be a siloed approach sitting in different departments. And that can rapidly deliver on the breadth of the solution.



Thursday, April 24, 2014

Does SAP HANA Replace BW? (Hint: Still, No.)

This time, I participated in the blog discussion - indeed a topic that can only grow over time.

http://www.saphana.com/community/blogs/blog/2014/04/17/does-sap-hana-replace-bw-hint-still-no#comment-5510

I am particularly interested in the statements below:
  • If you just have ERP as an operational standalone system then put ERP on HANA and use HANA Live
  • If you just need a data-mart alongside ERP then use HANA Enterprise and SLT for real-time replication
  • If you don't have BW and need an EDW then use BW on HANA, preferably with ERP on HANA and HANA Live
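Just as a thought experiment, the three rules above can be written down as a tiny decision helper. This is purely my own illustration (the function name, parameters and return strings are invented), paraphrasing the bullets from the linked discussion:

```python
def hana_deployment_advice(has_erp: bool, needs_datamart: bool, needs_edw: bool) -> str:
    """Paraphrase of the three deployment rules as a simple decision function."""
    if needs_edw:
        # No BW yet but an EDW is needed -> BW on HANA, ideally with ERP on HANA + HANA Live
        return "BW on HANA (preferably with ERP on HANA and HANA Live)"
    if needs_datamart:
        # Data-mart alongside ERP -> HANA Enterprise with SLT real-time replication
        return "HANA Enterprise + SLT for real-time replication"
    if has_erp:
        # ERP as a standalone operational system -> ERP on HANA with HANA Live
        return "ERP on HANA + HANA Live"
    return "Re-assess requirements"

print(hana_deployment_advice(has_erp=True, needs_datamart=False, needs_edw=False))
```

Of course, real landscape decisions involve far more factors (sizing, licensing, existing BW investment), but it captures the gist of the advice.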


Monday, April 14, 2014

A pragmatic BI Vision



BI won’t work without a single source of truth, BI governance, evolving innovation, streamlining with the backend technology and a self-servicing frontend. All these components are inter-related and play a vital role in the BI roadmap.

Monday, April 7, 2014

BW on HANA - how does it speed things up?

The BW OLAP engine has now become the OLAP Compiler for HANA, also known as the Analytic Manager; pushing further OLAP operations down to HANA offers unrivalled query performance. Additional business insights can be achieved by overcoming existing ABAP-based limits. The OLAP features pushed down into the HANA database are summarized below:

OLAP Features pushed down to HANA in BW 7.3x
  • Restricted key figures
  • Exception Aggregation CNT for quantity key figures without unit conversion
  • Exception Aggregation of currency key figures with optional currency conversion

OLAP Features pushed down to HANA in BW 7.4 SP5
  • Avoid intermediate result set materialization (e.g. Exception Aggregation)

OLAP Features pushed down to HANA in BW 7.4 SP6 and beyond
  • Stock coverage Key Figure
  • Hierarchy Handling
  • Formula exception aggregation
  • Processing of further query scenarios in HANA (Joins, Union, etc.)
  • Handling of inventory Key Figures
* Plucked from http://www.agilityworks.co.uk/our-blog/sap-bw-7-4-sp5-on-hana-better-smarter-faster/
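To make the list above a little more concrete, here is a toy sketch of what an exception aggregation such as CNT computes - counting distinct values per group instead of summing - next to a standard SUM. The data and field names are invented; the point of the pushdown is that HANA executes this over the columnar store instead of the ABAP OLAP layer:

```python
from collections import defaultdict

# Invented sample fact rows: (material, plant, quantity)
rows = [
    ("M1", "P1", 10),
    ("M1", "P2", 5),
    ("M2", "P1", 7),
    ("M2", "P1", 3),
]

total = defaultdict(int)     # standard aggregation: SUM of quantity per material
plants = defaultdict(set)    # exception aggregation CNT: distinct plants per material

for material, plant, qty in rows:
    total[material] += qty
    plants[material].add(plant)

cnt = {m: len(p) for m, p in plants.items()}
print(dict(total))  # {'M1': 15, 'M2': 10}
print(cnt)          # {'M1': 2, 'M2': 1}
```

The CNT result cannot be derived from the SUM alone, which is why it is a separate aggregation step worth pushing down.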



Saturday, April 5, 2014

Everything suddenly seems NEW NEW NEW & BIG BIG BIG

I have been doing some research over the weekend - well, what a way to enjoy life - and it is interesting yet troubling for me to grasp so many new things happening in SAP technology. And by this I am not just referring to the new kid in town, HANA Studio.

SAP HANA Cloud Portal

http://www.sweetlets.com/w/2013/06/hana-cloud-portal-from-the-perspective-of-a-test-partner/



Unified development environment for BW via Eclipse

http://www.agilityworks.co.uk/our-blog/sap-bw-7-4-sp5-on-hana-better-smarter-faster/


SAP BW 7.4 Generates HANA Views accessible via SAP Lumira

http://www.youtube.com/watch?feature=player_embedded&v=33PR1w2_pzs


SAP RIVER

I registered for a trial version but the river is always flooded!! In short, this is the new language for developing web applications based on SAP HANA.




It is mind-boggling technology for a lot of BW/BO consultants out there. I bet most of them are hoping to play on the ground of BW 7.4 on HANA now!


Tuesday, April 1, 2014

LSA and LSA++

LSA seemed to be a hot piece of BW jargon three years ago, when most companies strove to standardize their data modelling design. Of course, I personally believe not all dataflows require LSA - for example, straightforward data from an Excel file. Nevertheless, the beauty of LSA is that we minimize the risk of losing historical data (the PSA back then was supposed to be housekept, not serve as historical data storage as it does in LSA++), and changes in the transformation layer can be reloaded safely from the propagation layer without the need to re-initialize. Plus, if we manipulate data in a start routine from the propagation DSO to the transformation DSO, the key figures are not cumulated when looping over the source package, unlike a start routine between a DSO and an InfoCube. Finally, the aggregated data flows up to the InfoCube for reporting. There is also a Corporate Memory Layer, available as write-optimized DSOs, to retain historical data.

Ok, with in-memory technology, famously known as HANA, the snowflake schema gets flattened out. No more SIDs means reporting no longer needs InfoCubes; in-memory-optimized DataStore objects can be used for reporting instead. So we now have LSA++: from four layers down to three. We can also see they brought back the InfoSource into the picture - I have always thought InfoSources are useful when it comes to two-step transformations. The data is acquired in the open operational data store layer, where the PSA serves as the historical data foundation; no transformations or aggregations are defined in this layer. The data is then harmonized and transformed into the core EDW layer; the DataSource and the DSOs in the core EDW layer are connected by an InfoSource. A virtual data mart layer is used for reporting - InfoProviders in this layer do not contain any data; instead, they describe which data is accessed and how it is displayed semantically to the end user. MultiProviders usually access the data from DataStore objects.
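A minimal sketch of the three LSA++ layers described above, just to fix the idea. The layer names follow the text; the records and the harmonization rule are invented for illustration:

```python
# Open operational data store layer: raw acquired records (PSA-style,
# no transformation or aggregation defined here)
open_ods = [
    {"doc": "1", "amount": "100", "curr": "usd"},
    {"doc": "2", "amount": "250", "curr": "eur"},
]

# Core EDW layer: harmonized and transformed data (the kind of step an
# InfoSource-connected transformation would perform)
def harmonize(rec):
    return {"doc": rec["doc"], "amount": int(rec["amount"]), "curr": rec["curr"].upper()}

core_edw = [harmonize(r) for r in open_ods]

# Virtual data mart layer: stores no data of its own, only describes
# which data is accessed and how it is presented
def virtual_view(predicate):
    return [r for r in core_edw if predicate(r)]

usd_docs = virtual_view(lambda r: r["curr"] == "USD")
print(usd_docs)
```

The key property mirrored here is that the top layer holds no data: dropping or redefining `virtual_view` never touches `core_edw`, just as a MultiProvider can be rebuilt without reloading the DSOs beneath it.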


Monday, March 31, 2014

HANA Adoption Strategy

For most organizations, it will eventually come to a point where there is a hot debate on whether to bring HANA in or not. HANA is sometimes mistakenly taken in as just a new tool to improve reporting and planning performance, with the investment revolving around funding the hardware, upgrades and licences. Well, it is not an uncommon approach, as scoping those areas alone already takes up millions of dollars. And taking off with HANA as a new technology together with a strategic adoption program will cost triple the initial setup cost.

HANA is actually a roadmap and a BI adoption program, rather than just another 'Accelerator'. The implementation roadmap needs to be planned according to:
• Assessment of current BW environment to decide the migration strategy
• Alignment with the SAP BO upgrade and new innovations such as Lumira, Mobi & Explorer
• Alignment with Master Data initiatives
• SAP BI Governance Adoption
• Integration and adoption across other integrated systems, departments, projects
• Delivering business values and innovations in line with company long term goals and BI visions

Companies out there should avoid the mistake of getting only the hardware and tools in place without streamlining them as part of a long-term BI roadmap consisting of stakeholder positioning in BI, delivering business value, adoption, governance, scalability, program sustainability in an economic downturn, and global support. Having said that, BI at the end of the day is an expensive practice/tool/process that can only deliver back the ROI if it is strategically planned and executed.

Sunday, March 23, 2014

Potential Impacts of HANA Migration to the SAP BO

This is a good article to refer to on the potential impact on BO when BW moves to HANA. Please note that effort is required to change the data connection path on the universes we currently have, and this may need to be done during the cutover window.


Friday, March 21, 2014

What is HANA?

I often hear people asking around, 'What is SAP HANA?'

My definition
SAP HANA is a database itself, and it runs only on SUSE Linux. It is RAM working with Write/Read Controller modules on a Data Volume and a Log Volume stored on disk. When you write data to SAP HANA, it is written to the logs first and then moved into RAM. The 'disk' here means a file system on a Fusion-io SLC or MLC NAND flash card, or a RAID 10 controller with an array of SSD disks (depending on the hardware vendor). This sounds like a lot of Basis stuff, and yes, if we see it solely from the angle of migrating the database from an RDBMS to in-memory, this is it - something new and exciting for the Basis guys.
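The write path just described - persist to the log first, then update memory - is essentially write-ahead logging. Here is a toy sketch of that general mechanism in Python; this is not HANA's actual implementation, just the idea of why the log makes an in-memory store durable:

```python
class ToyInMemoryStore:
    """Toy write-ahead logging: append to the log first, then apply in memory."""

    def __init__(self):
        self.log = []   # stands in for the Log Volume on flash/SSD
        self.ram = {}   # stands in for the in-memory store

    def write(self, key, value):
        self.log.append((key, value))  # durable append happens first
        self.ram[key] = value          # then the in-memory update

    def recover(self):
        """After a crash or restart, RAM is rebuilt by replaying the log."""
        self.ram = {}
        for key, value in self.log:
            self.ram[key] = value
        return self.ram

store = ToyInMemoryStore()
store.write("k", 1)
store.write("k", 2)
print(store.recover())  # {'k': 2}
```

Because every change reaches the log before RAM, nothing acknowledged is ever lost even though reads are served purely from memory.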

So what is in store for the BW guys? Ok, here is where BW consultants need to pay attention. You see, SAP sees HANA as a game changer; in a way, there is so much potential lying within the area of Business Analytics. So they introduce to you SAP HANA Studio. This is where the future RSA1 and DB02 take place. And the peak of the action happens when ECC itself sits on top of HANA and HANA Business Content is made accessible directly from HANA Studio. If you are a BO guy, you already know data provisioning is available directly from HANA real-time data straight into dashboards or reports.

How about ABAPers? HANA Studio itself is built on Eclipse. So if you are an application developer, having sound knowledge of Java is crucial, as I can only imagine that instead of developing workflows in WebDynpro, we can do it in Java. Of course, on top of all this, we can't run away from the integration with ECC, in which BAdI and BAPI skillsets are always hot. So ABAPers, you are always in demand.

To get to the ultimate HANA landscape, just like any technology lifecycle, there is transition and obsolescence.
Well, let's not dwell on the word 'obsolete' in this case (although I do believe it contributes to a lot of worry among BW consultants). Focusing on the word 'transition', that is when we start to google terms like HANA-optimized InfoCube, HANA-optimized DSO, Semantic Partitioning, the 'sidecar' approach, SLT, DXC and the favourite for BPC-ABAP developers - Code Pushdown.

Quote from Vishal Sikka:
There is another possible reality. HANA offers an opportunity to rethink business processes. The business driving elements such as decentralisation and the explosion of interest in mobile are in play. Cloud vendors are showing us important elements of the technology with which to get the job done. Despite what detractors might say, HANA really does have disruptive potential.


Here is a list of Google search results from various sources:
WIKI
SAP HANA is an in-memory, column-oriented, relational database management system developed and marketed by SAP AG.

Note:
Column-oriented organizations are more efficient when an aggregate needs to be computed over many rows.
Column-oriented organizations are more efficient when new values of a column are supplied for all rows at once.
Row-oriented organizations are more efficient when many columns of a single row are required at the same time.
Row-oriented organizations are more efficient when writing a new row if all of the row data is supplied at the same time.
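The four points above can be illustrated with a toy example of the same (invented) table held row-wise and column-wise, showing why each layout favours a different access pattern:

```python
# The same table in two physical layouts. Data is invented for illustration.
rows = [
    {"id": 1, "region": "APAC", "revenue": 100},
    {"id": 2, "region": "EMEA", "revenue": 200},
    {"id": 3, "region": "APAC", "revenue": 50},
]

# Column-oriented layout: one contiguous array per column
columns = {
    "id": [1, 2, 3],
    "region": ["APAC", "EMEA", "APAC"],
    "revenue": [100, 200, 50],
}

# Aggregating over many rows: the column store scans a single array,
# never touching the other columns
total_revenue = sum(columns["revenue"])
print(total_revenue)  # 350

# Reading many columns of one row: the row store fetches one record,
# while the column store would have to probe every column array at index 1
record = rows[1]
print(record["region"])  # EMEA
```

The same contrast applies to writes: appending a full new row is one operation on `rows`, but requires touching every array in `columns`.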


SAPHANA.COM
SAP HANA is an in-memory data platform that is deployable as an on-premise appliance, or in the cloud.  It is a revolutionary platform that’s best suited for performing real-time analytics, and developing and deploying real-time applications. At the core of this real-time data platform is the SAP HANA database which is fundamentally different than any other database engine in the market today.

Wikibon
SAP HANA Enterprise 1.0 is an in-memory computing appliance that combines SAP database software with pre-tuned server, storage, and networking hardware from one of several SAP hardware partners. It is designed to support real-time analytic and transactional processing.

Techopedia
SAP HANA is an application that uses in-memory database technology that allows the processing of massive amounts of real-time data in a short time. The in-memory computing engine allows HANA to process data stored in RAM as opposed to reading it from a disk. This allows the application to provide instantaneous results from customer transactions and data analyses. HANA stands for high-performance analytic appliance.

Searchsap
SAP HANA is a data warehouse appliance for processing high volumes of operational and transactional data in real-time. HANA uses in-memory analytics, an approach that queries data stored in random access memory (RAM) instead of on hard disk or flash storage.

Cost Allocation Project

It's been a while since I last blogged. I was moving between HANA and BPC on and off in the fluid environment of my current organization. Getting into BPC was something very different from what I used to do as an SAP BI consultant: there is actually a lot of ABAP, CO configuration and BPC frontend work, and very little BW development. Anyhow, the solution we rolled out is really neat and creative. An ABAP program was developed to simulate the Cost Allocation process from ECC, with the data updated in real time into a write-optimized DSO, which in this case acts as a table. This data is loaded into an InfoCube, and through a Data Manager package that calls a BAdI, the results are extracted into a BPC model. The model is used to enable planning in BPC for secondary cost. This was possible because an ECC BAPI was triggered from BPC via an RFC connection and data was retrieved from AL11 back into BPC.

The best things about this program: we are able to create a snapshot view of each run of a cycle (sometimes users run the same cycle more than once after changing the cycle configuration or allocation percentages); cycles are categorized per Region (making it possible to have an authorization check for a user over a particular cycle); data is updated directly in BPC; and a preliminary check is done on dependent cycles (for example, a WBS Element that acts as a Receiver has to be run before a cycle that has that same WBS Element as a Sender). The checks on dependent cycles eliminate a lot of manual work; it used to be a dreadful practice to run Cost Allocation through different focal persons in a 'take turns' sequence to cater for cost objects involved in intercompany elimination. I got really used to tcodes such as KSUB, KSU7, KSU8, KSU9, CJI4, KAH3, KJH3, KP06, etc. The ABAPer I worked with is an amazing coder who managed to have the work evolve 'beautifully' around one important function module: K_ALLOCATIONS_RUN.
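The dependent-cycle check mentioned above can be sketched as a small validation over an ordered list of cycles. This is my own simplified illustration (cycle and WBS names are invented, and treating every sender as requiring a prior receiving cycle is a simplification of the real check):

```python
def check_cycle_order(cycles):
    """cycles: ordered list of dicts with 'name', 'senders' and 'receivers'
    (sets of WBS Elements). Flags any cycle whose sender WBS Element has
    not already appeared as a receiver in an earlier cycle - mirroring the
    rule that a Receiver must be run before the same WBS acts as a Sender."""
    received = set()
    violations = []
    for cycle in cycles:
        missing = cycle["senders"] - received
        if missing:
            violations.append((cycle["name"], sorted(missing)))
        received |= cycle["receivers"]
    return violations

# Correct order: C1 receives into WBS1 before C2 sends from it
ok = check_cycle_order([
    {"name": "C1", "senders": set(), "receivers": {"WBS1"}},
    {"name": "C2", "senders": {"WBS1"}, "receivers": set()},
])
print(ok)  # []
```

Running the two cycles in the opposite order would flag C2, which is exactly the kind of sequencing mistake the preliminary check is there to catch before users waste an allocation run.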

If the whole Cost Allocation process can be simulated from BPC, the whole planning process can be shifted over from ECC to BPC. The approach goes through simulation because the BAPIs are only available in ECC content and not in NetWeaver. It could be an interesting topic for licensing: if planning can be triggered from BPC, planning focals may not need ECC access at all. Well, for existing BPC planning solutions, instead of performing extraction and retraction in and out of ECC, you can consider this approach.

* In Extraction mode, the BAdI that normally substitutes the conversion file in Data Manager packages is used, triggered from the transformation file (Script Logic) in the Data Manager package. As for retraction to ECC, a BW process chain can be used by declaring it in the transformation file and then running the Data Manager package.