Wednesday, June 30, 2010

How to Change BI variable parameters

Once a variable has been created, some of its parameters (Processing By, Variable Represents, and others) cannot be modified in the BEx Query Designer.
 
Suppose we have a variable defined as a single value, but a business process now requires it to be represented by multiple single values instead.

In the Details tab, Variable Represents is set to Single Value; we need to change it to Multiple Single Values.

SAP's recommended way to solve this is to delete the variable and create a new one with the appropriate parameters.
But what should we do if the variable is already used in 100 queries?
In that case we can use an unsupported but workable approach: change the parameters directly in table RSZGLOBV.

Go to transaction SE16, table RSZGLOBV, enter the variable's technical name in the Variable Name (VNAM) field, select Version A, and click the Execute button.

The record for our variable is displayed. Select it and click the Change button (F6).

On the next screen it is possible to change any parameter of the variable. In our case we want to change the Variable Represents parameter (Select parameters in table RSZGLOBV).
Change the content of the Select parameters (VPARSEL) field from P to M and save the changes.
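As a quick sanity check, you can list both versions of the entry before regenerating anything. A minimal sketch (the variable name ZMY_VAR is a placeholder; the field names VNAM, OBJVERS and VPARSEL are the ones described above):

```abap
REPORT z_check_rszglobv.

" List the A (active) and M (modified) versions of the variable,
" showing the Variable Represents setting (VPARSEL).
DATA: lv_vnam    TYPE rszglobv-vnam,
      lv_objvers TYPE rszglobv-objvers,
      lv_vparsel TYPE rszglobv-vparsel.

SELECT vnam objvers vparsel
  FROM rszglobv
  INTO (lv_vnam, lv_objvers, lv_vparsel)
  WHERE vnam    = 'ZMY_VAR'          " placeholder: your variable
    AND objvers IN ('A', 'M').
  " After the change, both versions should show VPARSEL = 'M'
  WRITE: / lv_vnam, lv_objvers, lv_vparsel.
ENDSELECT.
```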


Do the same for the M version entry in table RSZGLOBV.
Now we need to regenerate all queries that use this variable.
Go to transaction RSRT -> Environment -> Gen. Queries Directly.

Enter the technical name of the InfoProvider whose queries should be generated and press Execute (F8).

Trigger a Process Chain using a Function Module via a Button


This is sample code to trigger a process chain as a background job, started on the fly from a button in a web application.
It is designed using the BPS exit planning function functionality; the planning function is attached to a button in the web interface.
The sample code below is split in two: a main program that triggers the process chain, and an exit function module that submits that program as a background job.

*&---------------------------------------------------------------------*
*& Report Z_BPS_SELECTIVECOPY
*&
*&---------------------------------------------------------------------*
*&this program triggers the pc Z_BPS_SELECTCOPY_PROJ
*&and collects the messages which could be monitored.
*&---------------------------------------------------------------------*

REPORT Z_BPS_SELECTIVECOPY.

data L_LID type RSPC_LOGID.
data L_STATUS type RSPC_STATE.
data L_T_LOG type table of RSPC_S_MSG.
data L_S_LOG like line of L_T_LOG.
data: G_S_LOG type BAL_S_LOG.
data: L_S_MSG type BAL_S_MSG.
data: LS_RETURN type RSPC_S_MSG.
data: LT_RETURN type standard table of RSPC_S_MSG.
data: L_LOG_HANDLE type BALLOGHNDL.
data: LT_LOG_HANDLE type BAL_T_LOGH.

* The exceptions must be handled, otherwise SY-SUBRC is not set
* and a failed start would dump instead of raising the message below.
call function 'RSPC_API_CHAIN_START'
  exporting
    I_CHAIN = 'Z_BPS_SELECTCOPY_PROJ'
  importing
    E_LOGID = L_LID
  exceptions
    FAILED  = 1
    others  = 2.
if SY-SUBRC <> 0.
  MESSAGE ID 'ZBW_NIBS2' TYPE 'E' NUMBER 008.
endif.


if SY-SUBRC = 0.

L_STATUS = 'A'.

while L_STATUS = 'A'.

call function 'RSPC_API_CHAIN_GET_STATUS'
exporting
I_CHAIN = 'Z_BPS_SELECTCOPY_PROJ'
I_LOGID = L_LID
importing
E_STATUS = L_STATUS.

if L_STATUS = 'A'.
wait up to 60 seconds.
endif.
endwhile.

call function 'RSPC_API_CHAIN_GET_LOG'
exporting
I_CHAIN = 'Z_BPS_SELECTCOPY_PROJ'
I_LOGID = L_LID
tables
E_T_LOG = lt_return.

if L_STATUS eq 'R' or L_STATUS eq 'X'.
LS_RETURN-MSGTY = 'E'.
LS_RETURN-MSGID = 'ZBW_NIBS2'.
LS_RETURN-MSGNO = '006'.

append LS_RETURN to LT_RETURN.

elseif L_STATUS eq 'G'.
LS_RETURN-MSGTY = 'I'.
LS_RETURN-MSGID = 'ZBW_NIBS2'.
LS_RETURN-MSGNO = '007'.
append LS_RETURN to LT_RETURN.

endif.

G_S_LOG-EXTNUMBER = 'PROCESS CHAIN'.
G_S_LOG-ALUSER = SY-UNAME.
G_S_LOG-ALPROG = SY-REPID.
G_S_LOG-OBJECT = 'SEM-BPS'.
G_S_LOG-SUBOBJECT = 'FUNC'.


* create a log
call function 'BAL_LOG_CREATE'
exporting
I_S_LOG = G_S_LOG
importing
E_LOG_HANDLE = L_LOG_HANDLE
exceptions
others = 1.

if SY-SUBRC <> 0.

message id SY-MSGID type SY-MSGTY number SY-MSGNO
with SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.

endif.

loop at LT_RETURN into LS_RETURN.

* define data of message for Application Log
clear: L_S_MSG.
L_S_MSG-MSGTY = LS_RETURN-MSGTY.
L_S_MSG-MSGID = LS_RETURN-MSGID.
L_S_MSG-MSGNO = LS_RETURN-MSGNO.
L_S_MSG-MSGV1 = LS_RETURN-MSGV1.
L_S_MSG-MSGV2 = LS_RETURN-MSGV2.
L_S_MSG-MSGV3 = LS_RETURN-MSGV3.
L_S_MSG-MSGV4 = LS_RETURN-MSGV4.

call function 'BAL_LOG_MSG_ADD'
exporting
I_LOG_HANDLE = L_LOG_HANDLE
I_S_MSG = L_S_MSG
exceptions
LOG_NOT_FOUND = 0
others = 1.

if SY-SUBRC <> 0.
message id SY-MSGID type SY-MSGTY number SY-MSGNO
with SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
endif.
endloop.

if not LT_RETURN is initial.
append L_LOG_HANDLE to LT_LOG_HANDLE.

call function 'BAL_DB_SAVE'
exporting
I_T_LOG_HANDLE = LT_LOG_HANDLE.

if SY-SUBRC <> 0.
message id SY-MSGID type SY-MSGTY number SY-MSGNO
with SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
endif.
clear LT_LOG_HANDLE.

endif.

call function 'BAL_DSP_LOG_DISPLAY'
exceptions
others = 1.
if SY-SUBRC <> 0.
message id SY-MSGID type 'S' number SY-MSGNO
with SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
endif.

endif.


-------------------------------------------------------------------------------

FUNCTION zbps_selectivecopy.
*"----------------------------------------------------------------------
*"*"Local Interface:
*" IMPORTING
*" REFERENCE(I_AREA) TYPE UPC_Y_AREA
*" REFERENCE(I_PLEVEL) TYPE UPC_Y_PLEVEL
*" REFERENCE(I_METHOD) TYPE UPC_Y_METHOD
*" REFERENCE(I_PARAM) TYPE UPC_Y_PARAM
*" REFERENCE(I_PACKAGE) TYPE UPC_Y_PACKAGE
*" REFERENCE(IT_EXITP) TYPE UPF_YT_EXITP
*" REFERENCE(ITO_CHASEL) TYPE UPC_YTO_CHASEL
*" REFERENCE(ITO_CHA) TYPE UPC_YTO_CHA
*" REFERENCE(ITO_KYF) TYPE UPC_YTO_KYF
*" EXPORTING
*" REFERENCE(ETO_CHAS) TYPE ANY TABLE
*" REFERENCE(ET_MESG) TYPE UPC_YT_MESG
*"----------------------------------------------------------------------
DATA: ls_exitp LIKE LINE OF it_exitp,
ls_mesg LIKE LINE OF et_mesg,
l_jobname LIKE tbtcjob-jobname,
l_jobcount LIKE tbtcjob-jobcount,
l_bundle LIKE upf_bsteps-bundle.

MOVE 'PC_Z_BPS_SELECTIVECOPY' TO l_jobname.

CALL FUNCTION 'JOB_OPEN'
EXPORTING
jobname = l_jobname
IMPORTING
jobcount = l_jobcount
EXCEPTIONS
cant_create_job = 1
OTHERS = 2.

IF sy-subrc <> 0.
CLEAR ls_mesg.
ls_mesg-msgid = sy-msgid.
ls_mesg-msgty = sy-msgty.
ls_mesg-msgno = sy-msgno.
ls_mesg-msgv1 = sy-msgv1.
ls_mesg-msgv2 = sy-msgv2.
ls_mesg-msgv3 = sy-msgv3.
ls_mesg-msgv4 = sy-msgv4.
APPEND ls_mesg TO et_mesg.
EXIT.
ENDIF.

* l_bundle = ls_exitp-chavl.


SUBMIT z_bps_selectivecopy
USER sy-uname VIA JOB l_jobname NUMBER l_jobcount
AND RETURN.

IF sy-subrc <> 0.
CLEAR ls_mesg.
ls_mesg-msgid = 'UPC'.
ls_mesg-msgty = 'E'.
ls_mesg-msgno = '306'.
ls_mesg-msgv1 = 'Z_BPS_SELECTIVECOPY'.
ls_mesg-msgv2 = l_jobname.
APPEND ls_mesg TO et_mesg.
EXIT.
ENDIF.

CALL FUNCTION 'JOB_CLOSE'
EXPORTING
jobcount = l_jobcount
jobname = l_jobname
strtimmed = 'X'
EXCEPTIONS
cant_start_immediate = 1
invalid_startdate = 2
job_close_failed = 4
job_nosteps = 5
job_notex = 6
lock_failed = 7
OTHERS = 8.

IF sy-subrc <> 0.
CLEAR ls_mesg.
ls_mesg-msgid = sy-msgid.
ls_mesg-msgty = sy-msgty.
ls_mesg-msgno = sy-msgno.
ls_mesg-msgv1 = sy-msgv1.
ls_mesg-msgv2 = sy-msgv2.
ls_mesg-msgv3 = sy-msgv3.
ls_mesg-msgv4 = sy-msgv4.
APPEND ls_mesg TO et_mesg.
EXIT.
ENDIF.
CLEAR ls_mesg.
ls_mesg-msgid = 'UPC'.
ls_mesg-msgty = 'S'.
ls_mesg-msgno = '305'.
ls_mesg-msgv1 = l_jobname.
APPEND ls_mesg TO et_mesg.

ENDFUNCTION.

Function Modules in BI

Function Module Description (Function Group RRMX)
RRMX_WORKBOOK_DELETE Delete BW Workbooks permanently from Roles & Favourites
RRMX_WORKBOOK_LIST_GET Get list of all Workbooks
RRMX_WORKBOOK_QUERIES_GET Get list of queries in a workbook
RRMX_QUERY_WHERE_USED_GET Lists where a query has been used
RRMX_JUMP_TARGET_GET Get list of all Jump Targets
RRMX_JUMP_TARGET_DELETE Delete Jump Targets

Function Module Description
MONI_TIME_CONVERT Used for Time Conversions.
CONVERT_TO_LOCAL_CURRENCY Convert Foreign Currency to Local Currency.
CONVERT_TO_FOREIGN_CURRENCY Convert Local Currency to Foreign Currency.
TERM_TRANSLATE_TO_UPPER_CASE Used to convert all texts to UPPERCASE
UNIT_CONVERSION_SIMPLE Used to convert any unit to another unit. (Ref. table : T006)
TZ_GLOBAL_TO_LOCAL Used to convert timestamp to local time
FISCPER_FROM_CALMONTH_CALC Convert 0CALMONTH or 0CALDAY to Fiscal Year or Period
RSAX_BIW_GET_DATA_SIMPLE Generic Extraction via Function Module
RSAU_READ_MASTER_DATA Used in Data Transformations to read master data InfoObjects

RSDRI_INFOPROV_READ
RSDRI_INFOPROV_READ_DEMO
RSDRI_INFOPROV_READ_RFC Used to read Infocube or ODS data through RFC

DATE_COMPUTE_DAY
DATE_TO_DAY Returns the day of the week that a date falls on.
DATE_GET_WEEK Returns the week that a date falls in.
RP_CALC_DATE_IN_INTERVAL Add/subtract years/months/days to/from a date.

RP_LAST_DAY_OF_MONTHS
SLS_MISC_GET_LAST_DAY_OF_MONTH Determine the last day of the month.
RSARCH_DATE_CONVERT Used for date conversions; can be used in InfoPackage routines.

RSPC_PROCESS_FINISH To trigger an event in process chain
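One of the modules listed above, RSDRI_INFOPROV_READ, is worth a short sketch, since it is the standard way to read InfoProvider data in ABAP. This is a hedged example following the commonly documented pattern: the InfoProvider name ZSALES_C01, the characteristic 0MATERIAL and the key figure 0QUANTITY are placeholders, so check the function module interface in SE37 on your release before relying on it:

```abap
REPORT z_rsdri_read_sketch.

TYPE-POOLS: rsdri.

" Target structure: one characteristic and one key figure (placeholders).
TYPES: BEGIN OF ty_data,
         material TYPE /bi0/oimaterial,
         quantity TYPE /bi0/oiquantity,
       END OF ty_data.

DATA: lt_data  TYPE STANDARD TABLE OF ty_data,
      ls_sfc   TYPE rsdri_s_sfc,
      lth_sfc  TYPE rsdri_th_sfc,
      ls_sfk   TYPE rsdri_s_sfk,
      lth_sfk  TYPE rsdri_th_sfk,
      lv_end   TYPE rs_bool,
      lv_first TYPE rs_bool VALUE 'X'.

" Characteristic to read
ls_sfc-chanm    = '0MATERIAL'.
ls_sfc-chaalias = 'MATERIAL'.
INSERT ls_sfc INTO TABLE lth_sfc.

" Key figure to read, summed
ls_sfk-kyfnm    = '0QUANTITY'.
ls_sfk-kyfalias = 'QUANTITY'.
ls_sfk-aggr     = 'SUM'.
INSERT ls_sfk INTO TABLE lth_sfk.

" Read the data package by package until the end-of-data flag is set.
WHILE lv_end IS INITIAL.
  CALL FUNCTION 'RSDRI_INFOPROV_READ'
    EXPORTING
      i_infoprov    = 'ZSALES_C01'   " placeholder InfoCube
      i_th_sfc      = lth_sfc
      i_th_sfk      = lth_sfk
      i_packagesize = 10000
    IMPORTING
      e_end_of_data = lv_end
    CHANGING
      c_t_data      = lt_data
      c_first_call  = lv_first
    EXCEPTIONS
      OTHERS        = 1.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  " ... process the current package in lt_data here ...
ENDWHILE.
```

RSDRI_INFOPROV_READ_RFC, from the same list, wraps this interface for remote (RFC) access.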

Monday, June 28, 2010

DTP - SDN Blog series links

  1. Blog: The "normal" Data Transfer Process... Link
  2. Blog: The "Only Get Delta Once" Data Transfer Process Feature... Link
  3. Blog: The "Get Data by Request" Data Transfer Process Feature... Link
  4. Blog: How to get DTP runtime information in the Transformation... Link
  5. Blog: Extended Capabilities to Debug a Data Transfer Process (DTP) Request... Link
  6. How to paper: How to... minimize reporting downtime during initial data upload in SAP NetWeaver BI7.0 ... Link
  7. Blog: Modeling a Write Optimized DataStore object in a EDW layered architecture in SAP NetWeaver BI7.0... In process
  8. Blog: 'Error DTP' - BI@2004s by KJ (Kamaljeet) ... Link

BI: Data Transfer Process with “Get Data by Request”

This post describes the Data Transfer Process (DTP) feature "Get Data by Request", based on a scenario and in a step-by-step format.


Tip:
It is recommended to configure the DTP with upload mode "Delta". If a "Full" DTP is used, the PSA data must be deleted before each data load, because a Full DTP extracts all requests from the PSA regardless of whether the data has already been loaded. This means a Delta upload via DTP from the DataSource (PSA) into the InfoCube is necessary even if the data is loaded via a Full upload from the source into the DataSource (PSA) using an InfoPackage.

The following notes are important:
0001005568 70SP12: 'Only Get Delta Once' deactivated
0001061316 70SP15: Data automatically retrieved by request

Configuration:

  1. “Delta InfoPackage”
    image
    Picture 1

  2. “DTP definition” - The “Get Data by Request” flag is set
    image
    Picture 2

  3. “Process Chain”
    image
    Picture 3

  4. “Dataflow”
    image
    Picture 4


Step by Step:
  1. Loading 3 Requests into the PSA via Delta upload.
    In this case I started the same InfoPackage 3 times.
    Each upload Request consists of one record. The contents of the PSA after these 3 loads looks like:
    image
    Picture 5

  2. Execute the Process Chain to load the data from the Flat File into the InfoCube. The InfoPackage will extract the data from the Flat File into the PSA and the DTP will extract the PSA data (see Step 1) into the InfoCube.

    Status PSA after the execution of the process chain (data load):
    image
    Picture 6



Result:
Currently the DTP loads one (the oldest) source request into the InfoCube; the requests not yet loaded are called the "backlog". Please check Note 1061316 for further details. The current workaround is to load the backlog data via a normal DTP.

Contents of InfoCube after the data load:
image
Picture 7

Collective SAP BI Notes related to Query performance

Optimizations for queries (general)
1055044 Performance in SAPLRRK0, form s_data_fuellen
1063768 Performance of plan queries with high no. of key figures
1067433 Performance optimization when you use derivations
1084602 Performance of combination check in the input-ready query
1094422: Input-ready query: Corrections for various errors
1101187 The OLAP tunnel
Note: Bear in mind that Note 1101187 contains very complex correction instructions. Read this note carefully and do not forget the required manual steps. This note is also relevant for planning functions, in particular when processing mass data. This note also helps to reduce the memory consumption in complex planning scenarios.
1104114 Follow-on error: Optimizing the performance of derivations
1114620 No new lines after Note 1063768 & Note 1084602 implemented
1117348 Subsequent correction to Note 1101187

Optimizations for queries with two structures

1019326 Integrated planning: Ready for input status of cells
1020323: Input-ready query: Characteristic relationships, memory consumption
1039781: Input-ready query: Performance
1055965 Performance: Queries with two structures, collect_chafix
1056050 BI planning: Internal errors
1063768 Performance of plan queries with high no. of key figures
1084602 Performance of combination check in the input-ready query
1114620 No new lines after Note 1063768 & Note 1084602 implemented
1132992 Performance problem when initializing variables
Note: Notes 1063768 and 1084602 also help with queries with just one structure that process a large amount of data. In this situation, you should also use Note 1067433.

Optimizations if you use a large number of data slices with variables
1059304 CL_RSPLFU_IOBJ_HELPER->VAR_EXISTS performance improvement
1060908 Performance when you instantiate data slices
1068150 Performance: Buffering of data slices and char. relationships

Optimizations when using planning functions
1096045 Planning functions: Parsing before the execution
1101313 Reading the planning buffer with the correct date
1115910 Planning functions: Performance improvements
1121202 Planning functions: Distribution with keys
1149337 Main memory consumption and planning functions

Optimizations if you have very large MultiProviders
1040293 IP: Optimizations for writable InfoProviders
1075101 Subsequent correction to Note 1040293
Note: Implement this note only after consultation with SAP development. This warning concerns all notes that have Note 1040293 as a prerequisite. This note is very large and has many complex dependencies; therefore, we recommend that you import Support Package 14 for MultiProvider optimizations.
1072982 Metadata buffer for MultiProvider is not deleted
1090490 Buffering the MultiProvider runtime object
1110997 Subsequent correction to Note 1090490
1128031 Correction instructions in Note 1090490
Note: Bear in mind that Note 1090490 contains very complex correction instructions. Read this note carefully and do not forget the required manual steps.

Optimizations for the runtime of characteristic relationships and data slices
1016632: Unnecessary instantiation of characteristic relationships
1020059 Inconsistent filter in input-ready query
1044708 IP: Performance of generic time derivation
1067433 Performance optimization when you use derivations
1104114 Follow-on error: Optimizing the performance of derivations

Optimizations if you have large hierarchies
1058679 Several key selections in the select statement
1065041 Performance Problems in Queries with huge key selections
1068357 Planning functions: Selections on hierarchy nodes
See also the section ‘Optimizations of the BEx Web Analyzer’

Optimizations of the memory requirements
1020323: Input-ready query: Characteristic relationships, memory consumption
1042896 Release of memory after you close query
1090119 Table SELDR requires a large amount of memory space
1098057 Query: Dump NO_ROLL_MEMORY or other memory overflow
1107072 Improving the input help for 0INFOPROV
1112519 Inaccuracies in OLAP cache
1118671 Unnecessary memory consumption
1144702 Memory release, additional corrections to Note 1101187
1146957 Releasing memory OLAP_CACHE
1149337 Main memory consumption and planning functions

Optimizations of the BEx Analyzer
1017965 Variable screen takes a long time to appear and to close
1044219 Analyzer variable screen: Performance and documentation
1054168 Performance problems during conversion to formulas
Tip: If you use BEx Analyzer queries with two structures, the time until the variable screen appears may be significant if you use the reference view option in the data provider settings.
1094799 Corrections in the BEx Analyzer server runtime
Caution: This note requires manual activities before implementation.
1118671 Unnecessary memory consumption
1150242 Improving performance/memory in the BEx Analyzer

Optimizations of the BEx Web Analyzer
1055003 Performance probs for hierarchies in Java Runtime (1062537)
1053054 Poor performance in hierarchy nodes in the filter
1062608 Long runtimes if you use an info field item
1085446 Poor performance due to result suppression
1086332 Long runtimes: Queries with many characteristics/attributes
1092068 ‘Dropdown Box’ Web item: Performance problems
1111470 Poor performance during filtering with many single values
1113195 Improving performance when there are several data providers
1128508 Performance improvements for web template with multiple tabs
1162580 Analysis item: Slow line selection performance
Various optimizations
1024554 Improving performance in queries in SAPLRSEC_CHECKS
1055044 Performance in SAPLRRK0, form s_data_fuellen
1059381 InfoProvider restriction in input help using MultiProviders
1060170 Performance improvement during analysis authorizations
1069675 Further performance improvement
1121993 Analysis auth’s: Performance optimization for special situ’s
1132992 Performance problem when initializing variables

Other collective notes for BI topics
1025307 Composite note for NW2004s performance: Reporting
1055581 Recommendations for Support Package Stacks for BI 7.0
1077830 BI-IP Support Package Stack recommendations
1101143 Collective note: BEx Analyzer performance

Friday, June 25, 2010

Business content Delivered Variables in SAP BI



Use
SAP delivers a whole series of variables for characteristic values and texts. There are variables for time-dependent InfoObjects that are processed via automatic replacement using a predefined replacement path (processing type SAP Exit).

If you want to define a query that only ever displays the data from the current month, you drag the delivered variable for the characteristic value "current month" (tech. name 0CMONTH) into the query filter.
Before you can use delivered variables you must first activate them. You can get further information on activating the query objects under Installing Business Content.
Integration
You can find information on how to use the delivered variables for characteristic values and texts in the query definition in the section Creating Queries with Variables.
List of Delivered Characteristic Value Variables
Technical name / Description / Replaced with
0DAT Current calendar day Information from the system date
0CWEEK Current calendar week Information from the system date
0CMONTH Current calendar month Determines the current calendar month from the system date and replaces the variable with this value.
0CQUART Current quarter Information from the system date
0CYEAR Current calendar year Information from the system date
0FPER Current fiscal month Information from the fiscal year variant (function module: date_to_period_convert)
0FYEAR Current fiscal year Information from the fiscal year variant (function module: date_to_period_convert)
0CWD Current week day Information from the factory calendar (function module: date_convert_to_factory_date)
0FYTCD Interval from the first day of the current fiscal year (depending on the fiscal year variant) to the current day  
0FYTCFP Current year to current fiscal month  
0FYTLFP Current year to previous fiscal month  
0CYTCD Current year to current day  
0CYTCM Current year to current month  
0CYTLM Current year to previous month  
There are text variables (with replacement paths). See also: Variables for Texts
Technical name Replacement
0DATT Same as characteristic value variable belonging to it
0CWEEKT Same as characteristic value variable belonging to it
0CMONTHT Same as characteristic value variable belonging to it
0CQUARTT Same as characteristic value variable belonging to it
0CYEART Same as characteristic value variable belonging to it
0FPERT Same as characteristic value variable belonging to it
0FYEART Same as characteristic value variable belonging to it

Authorization in SAP NW BI

Good link for Authorization in SAP NW BI

How to find the code for the SAP EXIT Variables?

1) Go to SE37 (Function Builder: initial screen).
2) Function Module: enter the FM name, for example RSVAREXIT_0P_KEYD2.

The FM name for an SAP EXIT variable is RSVAREXIT_<variable name>.

Click the Display button and check the code on the next screen.

Note:
This FM exists only for variables with processing type "SAP Exit", not for all SAP-delivered variables.

Brief Information about the SAP EXITS

You can use transaction SE16 to display all delivered SAP Exit variables in table RSZGLOBV with the settings OBJVERS = 'D', IOBJNM = <characteristic> and VPROCTP = '4'. SAP Exit variables are predominantly used for the variable type characteristic value variable (VARTYP = 1). The ABAP coding belonging to an SAP Exit variable can be found in the function module RSVAREXIT_<variable name> (SE37). Nevertheless, the older SAP Exit variables for time characteristics are filled by the BW function module RREX_VARIABLE_EXIT. For each new SAP Exit variable, a function module named RSVAREXIT_<variable name> must be created; you can view or copy the interface from the existing module RSVAREXIT_0P_FVAEX. The module should be created in its own function group for the application (such as BWCO for SAP Exits in the Controlling area), so that any errors do not influence other programs.

Regarding the interface: I_VNAM contains the variable name (redundant, as it is already part of the module name); I_VARTYP, I_IOBJNM and I_S_COB_PRO give information about the variable and the corresponding InfoObject; I_S_RKB1D and I_S_RKB1F contain information about the query (such as the fiscal year variant in I_S_RKB1F-PERIV, if it is not itself a variable); and I_THX_VAR contains the already filled values of other variables, where you can, for example, find the value of a variable for 0FISCVARNT if I_S_RKB1F-PERIV is empty. In table E_T_RANGE, only the fields SIGN, OPT, LOW and HIGH are allowed to be filled. SIGN and OPT are also to be filled for parameter or interval variables (with I and EQ, or I and BT).
The variable processing type "Customer Exit" can be used in a similar way to the SAP Exit variables delivered with SAP Business Content.
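To illustrate the E_T_RANGE convention described above, here is a minimal customer-exit variable sketch following the common pattern (include ZXRSRU01, reached via EXIT_SAPLRRS0_001). The variable name ZVAR_CMONTH is hypothetical, and the range line type RSR_S_RANGESID is an assumption; verify both on your system:

```abap
" Include ZXRSRU01: fill E_T_RANGE using only SIGN, OPT, LOW and HIGH,
" as described in the text above.
DATA: ls_range TYPE rsr_s_rangesid.   " assumed range line type

CASE i_vnam.
  WHEN 'ZVAR_CMONTH'.                 " hypothetical variable name
    IF i_step = 1.                    " step 1: before the variable pop-up
      CLEAR ls_range.
      ls_range-sign = 'I'.            " parameter variable: I + EQ
      ls_range-opt  = 'EQ'.
      ls_range-low  = sy-datum(6).    " current calendar month YYYYMM
      APPEND ls_range TO e_t_range.
    ENDIF.
ENDCASE.
```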

Thursday, June 17, 2010

How to avoid Attribute Change Run Collisions in SAP BI

Attribute and hierarchy change run: If you change master data (navigation attributes) or hierarchies of a characteristic that is contained in aggregates, you must adjust these aggregates. This ensures that queries accessing the InfoCube or its assigned aggregates remain consistent. Unlike in the aggregates, no data referring to navigation attributes or hierarchies is stored in the InfoCube; the master data or hierarchy tables are joined with the tables of the cube when the query is executed. Regardless of whether aggregates exist, the system does not automatically transfer master data record changes; you must activate this master data explicitly. If aggregates are involved, you must adjust them using the change run before you can 'release' the changed records (the corresponding InfoObjects or hierarchies are registered for the next change run: transaction RSA1 -> Tools -> Apply Hierarchy/Attribute Change -> InfoObject List or Hierarchy List).

Problem: In any SAP system, only one attribute change run (ACR) can be active at a time. If one ACR is running (from any process chain or project) and a second one starts at the same time, the second fails due to a lock conflict, and the entire data load fails.

Solutions:
1. Move change runs from InfoPackage level to global and/or process-chain level, in order of priority. Instead of running an ACR after each InfoPackage update, accumulate them into one ACR at the end of the process chain, or in a separate chain, provided this data is not needed by subsequent loads (i.e. no lookups).

2. Increase CR_MAXWAIT. CR_MAXWAIT delays a second attribute change run while the first one is running.
At any single point in time, only one CR can be in the start phase. Every other CR fails immediately when it unsuccessfully tries to acquire the start lock; no wait is done here.
This option helps only when the running CR is already in the work phase: only then will a second CR enter the start phase and wait as long as specified in CR_MAXWAIT. While that second CR is in the start phase, all other CRs trying to start will again fail immediately.
For more information see SAP Note 825927 (The BW Changerun: CR_MAXWAIT). So this is not a 100% solution.

3. Create an ABAP program and include it before each attribute change run in your process chains. The program checks whether any ACR is running in the system; if so, it waits 10 seconds and checks again, repeating until no ACR is running, and only then finishes, allowing the next process (the attribute change run) to start. After implementing this program we had no more ACR failures in our system. The code is in the appendix at the end of this post.

How to implement:

Step 1: Create an ABAP program (e.g. ZRSDDS_CHANGERUN_MONITOR); the code is available in the appendix. Also create a variant for the maximum number of wait cycles.

Step 2: Include this ABAP program in process chains between the InfoPackage and the attribute change run, as shown in the screenshot.

Step 3: Go to the process chain (transaction RSPC), choose the particular chain, go to the processes and choose "ABAP program". Provide a technical name and description, enter the created ABAP program and variant, then save and activate the process chain.


Appendix:
How to Integrate an ABAP Program in a Process Chain
Note 903886 - Hierarchy and attribute change run
Note 825927 - The BW Changerun: CR_MAXWAIT
Code :
*REPORT ZRSDDS_CHANGERUN_MONITOR.
TYPE-POOLS: rsdds, rrhi, rsd.

SELECTION-SCREEN BEGIN OF BLOCK no_times WITH FRAME.
* Create a variant and provide a value (e.g. 100: the program then
* waits at most 100 * 10 seconds).
PARAMETERS: l_times TYPE i.
SELECTION-SCREEN END OF BLOCK no_times.

DATA: l_cr_state    TYPE rsdds_cr_state,
      l_t_chanm     TYPE rsd_t_iobjnm,
      l_s_chanm     TYPE rsd_s_iobjnm,
      l_t_hieid     TYPE rshi_t_hieid,
      l_hieid       TYPE rshi_hieid,
      l_t_aggrstate TYPE rsdds_t_aggrstate,
      l_s_aggrstate TYPE rsdds_s_aggrstate,
      l_t_msg       TYPE rs_t_msg,
      l_s_msg       TYPE rs_s_msg.

WRITE: /1 'Date' COLOR 7, 12 'Time' COLOR 7,
       22 'Attribute change run status' COLOR 7.
WRITE AT /1(52) sy-uline.

DO l_times TIMES.
  CALL FUNCTION 'RSDDS_CHANGERUN_MONITOR'
    IMPORTING
      e_cr_state    = l_cr_state
      e_t_chanm     = l_t_chanm
      e_t_hieid     = l_t_hieid
      e_t_aggrstate = l_t_aggrstate
      e_t_msg       = l_t_msg.

  CASE l_cr_state.
    WHEN rsdds_c_cr_state-finished.
*     The spool output (transaction SP01) shows how often this program
*     prevented a data load failure due to an ACR collision.
      WRITE: /1 sy-datum, 12 sy-timlo,
             22 'No attribute change run running'
             COLOR col_positive.
      EXIT.
    WHEN rsdds_c_cr_state-start.
      WRITE: /1 sy-datum, 12 sy-timlo,
             22 'Attribute change run started' COLOR 3.
      WAIT UP TO 10 SECONDS.
    WHEN rsdds_c_cr_state-running.
      WRITE: /1 sy-datum, 12 sy-timlo,
             22 'Attribute change run running' COLOR col_negative.
      WAIT UP TO 10 SECONDS.
    WHEN rsdds_c_cr_state-canceld.
      WRITE: /1 sy-datum, 12 sy-timlo,
             22 'Attribute change run cancelled' COLOR 4.
      EXIT.
  ENDCASE.
ENDDO.

Wednesday, June 9, 2010

Difference between Costing based and Account based CO-PA



Account-based CO-PA is tied to G/L account postings, while costing-based CO-PA is derived from value fields. Account-based CO-PA ties out exactly to the G/L; costing-based CO-PA is not easy to balance to the G/L, is more analytical, and differences are to be expected, but it offers some added revaluation costing features.
Implementing costing-based CO-PA is much more work, but it also gives far more reporting possibilities, especially for margin analysis. Even without paying special attention to it while implementing costing-based CO-PA, you get account-based CO-PA along with it, with the advantage of reconciled data.
Account-based CO-PA gives a view at an abstract level, whereas costing-based CO-PA gives the detailed level; roughly 90% of implementations use costing-based only.
Account-based CO-PA is based on account numbers, whereas cost accounting is based on cost centers.
CO-PA tables: the account-based CO-PA tables are COEJ, COEP, COSS and COSP.

FAQs related to Reporting

1. What is the BEx Download Scheduler?
The BEx Download Scheduler is an assistant that takes you through an automatic, step-by-step process for downloading pre-calculated Web templates as HTML pages from the BW server onto your PC.
2. What is the difference between a calculated key figure and a formula?
Functionally, formulas and calculated key figures are the same.
A calculated key figure (CKF) is global, whereas a formula is local (valid for that query only).
A CKF has a technical name and a description, whereas a formula has only a description.
A CKF is available across all queries on the same InfoProvider, whereas a formula is available only in its query.
When creating a CKF, certain functions are not available in the formula builder, whereas when creating a formula, all functions are available.
3. What is the difference between a filter and a restricted key figure?
A filter restricts the whole query result, whereas an RKF restricts only the selected key figure.

For example, assume 'company code' is restricted to '0040' in the filter: the entire query output is limited to '0040'.

If instead you restrict a key figure with '0040' in an RKF, only that key figure's data is restricted to '0040'.
Restricted key figures are (basic) key figures of the InfoProvider that are restricted (filtered) by one or more characteristic selections. Unlike a filter, whose restrictions are valid for the entire query, a restricted key figure restricts only the key figure in question to its allocated characteristic value or characteristic value interval. Scenarios such as comparing a particular key figure across various time segments, or a plan/actual comparison for a key figure where the plan data is stored using a particular characteristic, can be realized using restricted key figures.
4. What is the use of a structure in BEx queries?
A structure is a grouping of characteristics and key figures. Structures can be created for an InfoCube and reused in any other query on that cube.
Structures find their biggest use in financial reports. Take a financial report with about 20 basic key figures, 10 calculated key figures and another 10 restricted key figures. Now assume someone asks for a new report with all of these plus 5 more key figures. Normally you would have to create a new query and manually re-create all the complex key figures. If you had saved them as a structure, however, you could just drag and drop the structure into the new query. And if the calculation of one key figure changes, you change it once in the structure rather than in all 10 reports that show it.

We get a default structure for key figures; most people use structures for key figures, and SAP has designed it that way.
Within a query definition you can use no structures or a maximum of two, of which only one can be a key figure structure.
5. What is the difference between a filter and a condition in a report?
Filters act on characteristics; conditions act on key figures. Key figures cannot be used in the filter area; only characteristic values can be restricted there, whereas conditions are created on key figures.
6. Reporting Agent 
Definition: The Reporting Agent is a tool used to schedule reporting functions in the background.
The following functions are available:
  • Evaluating exceptions
  • Printing queries
  • Pre-calculating Web templates
  • Pre-calculating characteristic variables of type pre-calculated value sets.
  • Pre-calculating queries for Crystal Reports
  • Managing bookmarks
Use
You make settings for the specified reporting functions.
You assign the individual settings to scheduling packages for background processing.
You schedule scheduling packages as a job or within a process chain.

7. RRI: Report-Report-Interfacing is the terminology used to describe linking reports together. Report-Report-Interfacing uses Jump Targets that are created using the transaction code RSBBS (see Question #4). A Query with RRI functionality can be identified by clicking on the Goto icon in the BEx Analyzer toolbar.
8. What are the restrictions on ODS reporting? As an example of splitting reporting across ODS objects: active, retired and terminated employees can be separated into different ODS objects for detail reports.
9. Difference between ODS & Cube Reporting
An ODS holds data in a two-dimensional (flat) format and is not well suited to analyzing data in a multidimensional way. If you need flat, list-style reporting, go for ODS reporting.
A cube holds data in a multidimensional format and lets you analyze data across different dimensions, so if your requirement is a multidimensional report, go for a cube.

Example: a list of purchase orders for a vendor is a two-dimensional report, whereas sales for the last quarter by sales organization, sales area and customer, compared with earlier quarters, is a multidimensional report.

Two-dimensional reports are similar to reporting on a table. The ODS active table is a flat table like an R/3 table; reporting is done on the active table of the ODS, while the other tables are there to handle the deltas.

A cube has a star-schema structure; hence reports on cubes are multidimensional reports.
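The flat-versus-multidimensional distinction can be made concrete with a toy sketch (plain Python, invented data): an ODS-style report is essentially the rows themselves, while a cube-style report re-aggregates the same facts along whatever combination of dimensions the user slices by.

```python
from collections import defaultdict

# Toy fact rows as they might sit in a cube's fact table
# (dimension values plus one key figure). Purely illustrative.
facts = [
    {"sales_org": "1000", "customer": "C1", "quarter": "Q1", "sales": 50},
    {"sales_org": "1000", "customer": "C2", "quarter": "Q1", "sales": 70},
    {"sales_org": "2000", "customer": "C1", "quarter": "Q2", "sales": 40},
]

def aggregate(facts, dims):
    """Aggregate the sales key figure along any combination of dimensions."""
    totals = defaultdict(int)
    for f in facts:
        totals[tuple(f[d] for d in dims)] += f["sales"]
    return dict(totals)

# A flat "ODS-style" list is just the fact rows themselves.
# Multidimensional reporting means re-aggregating the same facts
# along different axes on demand:
by_org = aggregate(facts, ["sales_org"])
by_org_quarter = aggregate(facts, ["sales_org", "quarter"])
```

In a real star schema the dimension values live in dimension tables keyed from the fact table, but the reporting idea is the same: one set of facts, many aggregation paths.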

10. Why we need to use 0Recordmode in ODS?
0RECORDMODE is an InfoObject used when loading data into an ODS. Its value indicates how the data should be updated and with which image type.

The field 0RECORDMODE is needed for delta loads and is added by the system if a DataSource is delta-capable. In the ODS object, the field is generated during the creation process.
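A simplified sketch of how 0RECORDMODE steers delta records into the active table (plain Python, invented keys and values; the real InfoObject knows more modes than the three modelled here, e.g. before images 'X' and new images 'N'):

```python
# Simplified illustration of ODS delta activation driven by a
# record-mode flag. Only three modes are modelled:
#   ''  -> after image: overwrite the record for this key
#   'A' -> additive image: add the delta to the existing value
#   'D' -> deletion: remove the record for this key
active = {}  # active table: order number -> key figure value

def activate(delta_records):
    for rec in delta_records:
        key, mode, value = rec["order"], rec["recordmode"], rec["value"]
        if mode == "D":
            active.pop(key, None)
        elif mode == "A":
            active[key] = active.get(key, 0) + value
        else:  # '' after image
            active[key] = value

activate([
    {"order": "4711", "recordmode": "", "value": 100},
    {"order": "4711", "recordmode": "", "value": 120},  # overwritten
    {"order": "4712", "recordmode": "A", "value": 30},
    {"order": "4712", "recordmode": "A", "value": 20},  # added: 50
    {"order": "4713", "recordmode": "", "value": 10},
    {"order": "4713", "recordmode": "D", "value": 0},   # deleted
])
# active == {"4711": 120, "4712": 50}
```

The point is that without the record mode the system could not distinguish an overwrite from an additive delta or a deletion, which is why delta-capable DataSources carry this field.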

11. What query tuning do you do for reporting?


a) Install BW Statistics and use aggregates for reporting

b) Avoid using too many characteristics in rows and columns; instead, place them in free characteristics and navigate/drill down later.

c) OLAP Cache (change cache settings via TCode RSCUSTV14): a technique that improves query performance by storing query results centrally, making them accessible to various application servers. When a query is run for the first time, the results are saved to the cache, so that the next time a similar query is run it reads from the cache rather than from the data target.

d) Pre-calculated web templates

e) Start with a small amount of data and drill down from there.

f) Instead of running the same query each time, save the query results in a workbook so the same results are available to different users; the same data should not be fetched from the data targets on every run.

g) Complex and large reports should not run online; rather, they should be scheduled to run during off-peak hours to avoid excessive contention for limited system resources. Use the Reporting Agent (RA) to run them in batch mode during off-peak hours.

h) Queries against RemoteCubes should be avoided, as the data comes from different systems.

i) If you have a choice between hierarchies and characteristics or navigational attributes, choose characteristics or navigational attributes.

j) Create additional indexes

k) Use compression on cubes, since the E tables are optimized for queries.

l) Turn off warning messages on queries

Tuesday, June 8, 2010

Reasons for adopting Costing-Based COPA

SAP CO-PA was intended for use with a costing-based approach that stores different currencies, quantities and values from SD, FI, MM and PP as PA value fields, which can then be manipulated for a variety of reports. This is the recommended path, as it allows more variability in collecting data for PA reports (for example, details of cost components for variances).
Among high-tech companies using SAP CO-PA, the majority utilise costing-based CO-PA to enable a more detailed level of cost-of-sales analysis.
 
Costing-based CO-PA sometimes does not match the legal book values. Such discrepancies can be explained mainly by three factors:

Timing differences: when the Delivery step is performed in SAP SD but Billing is not, nothing is posted to CO-PA, although COGS is already booked in the FI legal book. During the SD prototype, since the Billing Due List (a batch program) will be executed each day, performing the billing step for sales orders that are delivered but not yet billed, COGS and revenue will be in sync in both FI and CO-PA for AAA.
 
Accruals: it is possible that accrued values are posted in CO-PA (for example, triggered by a program via sales order conditions) without any posting in the FI legal book.
 
Rounding differences from Foreign Currency Translations
Note:
Management using or viewing these CO-PA reports needs to acknowledge that, due to the intended design of costing-based CO-PA, values do not necessarily always tie to the FI legal book. Discrepancies with FI may occur, but they are explainable.