Multi-dimensional clustering in SAP NetWeaver Business Intelligence

Getting the best possible query performance out of an SAP Business Intelligence system is one of the main responsibilities of a BI consultant. Optimizing data models (e.g. remodeling and creating indexes) is one of the activities you can perform to achieve this, but in some cases optimizing alone is not sufficient. Partitioning data objects is sometimes forgotten, but might do the trick.

Partitioning data objects is a piece of cake from SAP BW 7.3 onwards (see attachment). However, when running an old version of SAP BW with a DB2 database, this could only be achieved by using multi-dimensional clustering (MDC). This feature, which is only applicable for DB2 databases, allows you to physically cluster data in a table along multiple dimensions. It is especially beneficial for query performance in business intelligence applications that use star and snowflake schemas and process large amounts of data with restrictions on several dimensions.
Multi-dimensional clustering option for data objects in SAP Business Warehouse
If you have tried almost everything to optimize your query performance, then MDC could be your last option. In the long run, however, upgrading SAP BW to a higher version (which makes it possible to create semantically partitioned objects) and using an accelerator (like the BWA or HANA) will be more beneficial.
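To make the MDC feature more concrete: on DB2 it comes down to the ORGANIZE BY DIMENSIONS clause on CREATE TABLE. The sketch below builds such a statement for a hypothetical fact table (the table and column names are made up for illustration; in a real BW system SAP generates these names and the DDL for you).

```python
# Hypothetical fact table and clustering dimensions -- illustrative only.
# In a real BW system the table and column names are generated by SAP.
table = "SALES_FACT"
mdc_dims = ["REGION_ID", "MONTH_ID"]  # low-cardinality, frequently restricted columns

# DB2 creates a block index per dimension, so rows sharing the same
# (REGION_ID, MONTH_ID) combination end up in the same physical blocks.
ddl = (
    f"CREATE TABLE {table} (\n"
    "  REGION_ID INTEGER NOT NULL,\n"
    "  MONTH_ID  INTEGER NOT NULL,\n"
    "  AMOUNT    DECIMAL(15,2)\n"
    f") ORGANIZE BY DIMENSIONS ({', '.join(mdc_dims)})"
)
print(ddl)
```

Queries that restrict on these dimensions can then skip whole blocks instead of scanning individual rows, which is exactly the access pattern of star and snowflake schemas.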

SAP Business Warehouse extractor and table analyzer

Building up a new dataflow can be very time-consuming, especially when there is no data acquisition layer to extract data from. Setting up a proper data acquisition layer means considering which fields to extract towards the data warehouse system. Since not all fields of a table are used in the source system, it is advisable to first analyze which fields are useful. Extracting all fields results in empty and unused objects in the data warehouse system, which needlessly increases the size of the database.

In order to make everybody’s life easier I created the ABAP program “SAP Business Warehouse extractor and table analyzer”. This program can be used to analyze Business Content extractors (like 2LIS structures) or SAP tables.
SAP Business Warehouse extractor and table analyzer
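The core idea of such an analysis can be sketched in a few lines: given the extracted records, report which fields are initial (empty) in every row and can therefore be left out of the dataflow. The sample rows and field names below are hypothetical, not an actual SAP extract.

```python
# Hypothetical sample of extracted rows (field names chosen to look like
# material master fields, but purely illustrative).
rows = [
    {"MATNR": "100001", "WERKS": "NL01", "BISMT": "", "GROES": ""},
    {"MATNR": "100002", "WERKS": "NL01", "BISMT": "", "GROES": "10x10"},
]

def unused_fields(rows):
    """Return the fields whose value is initial (empty) in every row."""
    fields = rows[0].keys()
    return sorted(f for f in fields if all(not r[f] for r in rows))

# BISMT is never filled in this sample, so it need not be extracted.
print(unused_fields(rows))
```

Running this kind of check before building the dataflow keeps empty, dead-weight objects out of the data warehouse.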

Role generation tool (PFCG upload ready)

As of September 8th I am proud to be part of a BI project team at ASML. As a team we are responsible for a greenfield BW on HANA implementation. One of the requirements for this project was to define a new authorization concept. After thorough research a framework was proposed containing the objects which should be created in order to make data authorization possible. This meant that, for a certain area, 360 authorization objects and roles had to be created.

For the mass creation of authorization objects there is currently no tool available (at least not one that I am aware of), resulting in more than 6 hours of clicking and saving in transaction RSECADMIN (!). For the mass generation of roles, however, I knew that an upload option was available in transaction PFCG. With this knowledge in mind I downloaded an already existing role and analyzed the content of the downloaded .SAP file. It turned out that this .SAP file is just a simple text file, which can easily be generated with Excel.
Download / upload roles from PFCG in SAP
So in order to create 360 roles I started developing an Excel macro which generates a .SAP upload file based on the given role names, descriptions and authorization objects. As a result the “PFCG upload – role generation tool” was born. Refer to the attached document to view the result. Hopefully this tool can save you a lot of time and effort!
PFCG upload – role generation tool
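The generation step itself is simple once you have a template: repeat one record per role instead of clicking each role together manually. The sketch below uses a deliberately simplified placeholder format; a real PFCG upload file must follow the exact layout of a .SAP file downloaded from transaction PFCG, which this sketch does not reproduce. Role names and authorization objects are invented for illustration.

```python
# Hypothetical input: (role name, description, authorization object) triples,
# as they would come out of a spreadsheet.
roles = [
    ("Z_BW_SALES_NL", "BW reporting role Sales NL", "Z_AUTH_SALES_NL"),
    ("Z_BW_SALES_DE", "BW reporting role Sales DE", "Z_AUTH_SALES_DE"),
]

def build_upload(roles):
    """Build one record per role. NOTE: placeholder format, not the real
    .SAP layout -- use a file downloaded from PFCG as the actual template."""
    lines = []
    for name, description, auth_object in roles:
        lines.append(f"{name}\t{description}\t{auth_object}")
    return "\n".join(lines)

print(build_upload(roles))
```

With 360 roles the input list simply grows; the generation stays a one-second operation.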

RFC destination for SAP BW is not maintained in source system

Today I encountered a return code 8 (RC=8) when a DataStore object (DSO) was transported to the quality environment. Knowing that other transports had already been imported successfully, the error message below looked a bit odd to me.
RFC destination for BW is not maintained in source system
The error message indicates that there is something wrong with the RFC connection. With this message in mind I checked transaction SM59 to verify if everything was set up properly. Everything looked fine. The partner profiles, configured via transaction WE20, were also correct.

After searching (again…) on the SCN website I bumped into this thread, where table RSLOGSYSDEST is mentioned. The thread describes that the issue can be caused by the references found in this table. In my case record BWQ010 should point to destination BWQ010. As seen in the screenshot below, the values were not aligned.
RSLOGSYSDEST difference in LOGSYS and DESTINATION
In order to align these values it is necessary to restore the BI source system in BW. After this action was executed, the entries in table RSLOGSYSDEST were changed to the right values. It’s as simple as that!
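The consistency check described above boils down to comparing two columns: every logical system (LOGSYS) in RSLOGSYSDEST should point to the RFC destination (DESTINATION) with the matching name. The sketch below flags the mismatches; the entries are illustrative, not from a real system.

```python
# Illustrative snapshot of table RSLOGSYSDEST (made-up entries).
rslogsysdest = [
    {"LOGSYS": "BWQ010", "DESTINATION": "BWD010"},  # stale entry after transport
    {"LOGSYS": "ECQ010", "DESTINATION": "ECQ010"},  # consistent entry
]

# Flag every record where the destination does not match the logical system.
mismatches = [r for r in rslogsysdest if r["LOGSYS"] != r["DESTINATION"]]
for r in mismatches:
    print(f"{r['LOGSYS']} points to {r['DESTINATION']} -> restore the source system in BW")
```

Restoring the BI source system then rewrites these entries so that both columns line up again.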

Verify dimension sizes of InfoCubes

Maybe you are one of the lucky few who is currently working with BW on HANA and no longer has to model InfoCubes (from a performance perspective). But for those who still need to do this, ABAP program SAP_INFOCUBE_DESIGNS can help you verify how well your InfoCubes are modelled in SAP Business Warehouse. This program collects data for every InfoCube in the BW system and counts the records available in the related dimension tables and fact table. The program then checks the ratio of the number of records in each dimension table to the number of records in the fact table.
ABAP program SAP_INFOCUBE_DESIGNS
A ratio between 10% and 20% is considered high cardinality and is therefore indicated in red. In most cases you should avoid dimension tables larger than 10% of the fact table. Of course, it all depends on how frequently the data in the mentioned dimension table is used. If possible, use line item dimensions instead!
SAP_INFOCUBE_DESIGNS output
Note: The data used for this analysis comes from BW statistics, so if you want to analyze a specific InfoCube, make sure that statistics have been created for it.
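The ratio check itself is straightforward, as the sketch below shows. The record counts are made up for illustration; SAP_INFOCUBE_DESIGNS reads the real counts from the dimension and fact tables.

```python
# Sketch of the check performed by SAP_INFOCUBE_DESIGNS: dimension-table
# records as a percentage of fact-table records. Counts are hypothetical.
def dimension_ratio(dim_rows, fact_rows):
    """Percentage of dimension-table records relative to the fact table."""
    return 100.0 * dim_rows / fact_rows

fact_rows = 1_000_000
dimensions = {"Customer": 250_000, "Plant": 40, "Document": 990_000}

for dim, rows in dimensions.items():
    ratio = dimension_ratio(rows, fact_rows)
    # Above 10% the dimension is a candidate for a line item dimension.
    verdict = "line item candidate" if ratio > 10 else "fine"
    print(f"{dim}: {ratio:.1f}% -> {verdict}")
```

A near-100% dimension (like "Document" here) stores almost one dimension record per fact record, which is exactly the case where a line item dimension pays off.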