Sunday, August 28, 2011

HANA



HANA: SAP HANA is a combination of hardware and software built specifically to process massive volumes of real-time data using in-memory computing, with appliances carrying as much as 2 terabytes of RAM.
It wasn't discussed at Sapphire, but the other potential beneficiaries of HANA are third-party BI vendors and their customers. HANA supports both SQL and MDX queries, simplifying access by third-party tools such as IBM Cognos, Oracle BIEE, and MicroStrategy. These vendors are currently forced to rely on BW's BAPI interface, which can be slow.

HARDWARE DESIGN:
A huge amount of data is divided into multiple sets, which are then crunched separately by the blades, as shown below.
[Diagram: data divided across 4 blades, with 2 standby blades]
Each blade contains multiple CPUs, and each CPU has multiple cores. Say, for example, 8 cores per CPU and 4 such CPUs per blade: a mere 4 blades will have 128 cores crunching data in parallel.
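The back-of-the-envelope math above can be checked with a throwaway sketch (the numbers are the ones from the example, not a real sizing):

```python
# Hypothetical sizing from the example above: 8 cores/CPU, 4 CPUs/blade, 4 blades.
cores_per_cpu = 8
cpus_per_blade = 4
blades = 4

total_cores = cores_per_cpu * cpus_per_blade * blades
print(total_cores)  # 128 cores crunching data in parallel
```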
SOFTWARE DESIGN:
HANA stores data column-wise for fast computing. The diagram below compares how data is stored row-wise and column-wise.
[Diagram: row-wise vs. column-wise storage]
For example, suppose the system wants the aggregate of the second column, i.e. 10+35+2+40+12.
Row-wise: the system has to jump between memory addresses to collect the subsequent values for aggregation. Data records are stored as complete tuples and read in one piece, which makes accessing only a few attributes an expensive operation.
Column-wise: a single sequential scan fetches the result much faster.
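The difference can be sketched in a few lines of Python; the values are just the ones from the example above, and the two storage layouts are simplified to plain lists:

```python
# Row-wise layout: each record is a complete tuple; to aggregate the second
# attribute, every tuple must be visited and the value picked out of it.
rows = [(1, 10), (2, 35), (3, 2), (4, 40), (5, 12)]
row_wise_sum = sum(record[1] for record in rows)

# Column-wise layout: the second attribute is already a contiguous array,
# so a single sequential scan produces the same aggregate.
columns = {"col1": [1, 2, 3, 4, 5], "col2": [10, 35, 2, 40, 12]}
col_wise_sum = sum(columns["col2"])

print(row_wise_sum, col_wise_sum)  # 99 99
```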
HOW IT WORKS:
[Diagram: source database replicating into HANA in near real time, with BI tools on top]
SAP HANA can be used without disturbing the current IT landscape. As the diagram above shows, data in the source database can be replicated into HANA in near real time and used for reporting, with a number of BI tools sitting directly on top of HANA.

Saturday, January 1, 2011

One stage stop to know all about BW Extractors-Part2 and Part3

One stage stop to know all about BW Extractors-Part2

This blog focuses on the behavior of customer-generated extractors (e.g. CO-PA and FI-SL).
Along with that, it also explains data recovery methods in case of a lost delta.
Please refer to "One stage stop to know all about BW Extractors-Part1" to get an idea of BW content extractors.
Note: For generic extraction, only helpful links are provided.
Application specific-customer generated extractors:
Controlling:
Controlling is broken down into the following sub-modules:
  • Cost Element Accounting
  • Cost Center Accounting
  • Internal Orders
  • Activity-Based Costing (ABC)
  • Product Cost Controlling
  • Profitability Analysis
  • Profit Center Accounting
Note: Only CO-PA is discussed briefly, because of the complexity of the area.
CO-PA:
Profitability Analysis allows management to review information on the company's profit or contribution margin by business segment.
It can be obtained by the following methods:
  • Account-Based Analysis
  • Cost-Based Analysis
Note: The details will be discussed after going through the CO-PA data flow.

How the data Extraction happens?
When data is requested from SAP BW, the extractor determines which data source the data is to be read from. This depends on:
  • the update mode (full, initialization of the delta method, or delta update)
  • the definition of the DataSource (line-item characteristics, apart from field REC_WAERS, or calculated key figures)
  • the available summarization levels.
The extractor always tries to select the most appropriate data source, that is, the one with the smallest data volume.
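The selection rule can be sketched as follows; the function name, the decision order and the volume figures are illustrative assumptions, not the actual CO-PA extractor logic:

```python
def pick_source(update_mode, line_item_chars_selected, summarization_levels):
    """Return the data source the extractor would prefer.

    summarization_levels: list of (name, data_volume) tuples for ACTIVE levels.
    """
    # Delta updates are always read from line items (see the delta sections below).
    if update_mode == "delta":
        return "line_items"
    # If line-item characteristics (beyond REC_WAERS) are requested,
    # summarization levels cannot serve the request.
    if line_item_chars_selected:
        return "line_items"
    # Otherwise choose the candidate with the smallest data volume.
    if summarization_levels:
        return min(summarization_levels, key=lambda lv: lv[1])[0]
    return "line_items"

print(pick_source("full", False, [("LEVEL01", 5_000_000), ("LEVEL02", 200_000)]))
# LEVEL02
```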

Once an InfoPackage is executed, the SAP BW Staging Engine calls the CO-PA transaction data interface. The CO-PA extraction program for SAP BW uses the same replication method as the update program that maintains the CO-PA summarization levels. On the BW side, only data that is at least 30 minutes old is received. This secures data integrity, because the timestamps from different application servers can differ slightly.
This 30-minute retention period is often described as a "security delta" or "safety delta": the system only extracts data that is at least 30 minutes old.

Account-Based Analysis
For account-based CO-PA extraction, only full update from summarization levels is supported for releases up to and including Release PI2001.1.
In this case we can carry out a delta using the pseudo-delta technique: do a selective full load based on some selection condition (e.g. fiscal period), then selectively drop the requests for the last period and reload the data that has changed.
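The pseudo-delta idea can be illustrated with a toy model (the record layout, the function names and the "request per period" bookkeeping are invented for the sketch, not SAP APIs):

```python
# Pseudo-delta sketch: requests are keyed by fiscal period. Each run drops the
# request(s) for the open period and reloads that period in full.
target = {}  # fiscal_period -> list of records (one "request" per period)

def source_records(source, period):
    return [r for r in source if r["period"] == period]

def pseudo_delta_load(source, period):
    target.pop(period, None)                          # selectively drop the period's request
    target[period] = source_records(source, period)   # selective full reload

source = [
    {"period": "001.2011", "amount": 100},
    {"period": "002.2011", "amount": 50},
]
pseudo_delta_load(source, "002.2011")
source.append({"period": "002.2011", "amount": 75})   # late posting in the open period
pseudo_delta_load(source, "002.2011")                 # drop and reload picks it up
print(sum(r["amount"] for r in target["002.2011"]))   # 125
```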
From Release PI2001.2, the delta method can also be used.
Initialization: The initialization must be performed from a summarization level.
Delta update: Delta will be read from line items.
During delta loads, the controlling area and fiscal period fields should be mandatory selections.
Note: If the data needs to be read from a summarization level, the level must also contain all the characteristics that are to be extracted using the DataSource (entry * in maintenance transaction KEDV). Furthermore, the summarization level must have the status ACTIVE.
Account-based CO-PA is part of the CO module. This means the data posted in account-based CO-PA is always synchronized with the CO module (CCA, OPA, PA, PS, etc.).
The CO tables are COEP and COBK (for line items) and COSS and COSP (for the totals).

Cost-Based Analysis:
In the case of costing-based CO-PA, data can only be read from a summarization level if no characteristics of the line item are selected apart from the Record Currency (REC_WAERS) field, which is always selected.
An extraction from the segment level, that is, from the combination of the tables CE3XXXX / CE4XXXX (where XXXX stands for the operating concern), is only performed for Full Updates if no line item characteristics are selected (as with summarization levels).
Initialization: There are two possible sources for the initialization of the delta method. One is from Summarization levels (if no characteristics of the line item are selected) and the other one is from line item level.
In case of Summarization level, it will also record the time when the data was last updated / built.
If it is not possible to read data from a summarization level, data is read from line items instead.
Delta update: Data is always read from line items.
Costing-based CO-PA data is statistical data. This means that what is updated in CO-PA is not always equal to what is stored in the CO modules or in Finance. The cost element is also not always updated, and additional key figures are used to store information about the type of costs or revenues.
To understand the various tables (CE1/CE2/CE3/CE4) involved in CO-PA extraction, please read "BW data extraction".
CO-PA Delta Mode:
Extraction is based on Timestamp.
When data is extracted from CO-PA, a "safety delta" of half an hour is used for both the initialization and the delta upload. This ensures that only records that are already at least half an hour old at the start of the upload are loaded into SAP BW. Half an hour was chosen as the safety delta to overcome any time differences between the clocks on the different application servers.
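The safety-delta filter amounts to a simple timestamp cutoff; a minimal sketch (record layout and timestamps are invented):

```python
from datetime import datetime, timedelta

SAFETY_DELTA = timedelta(minutes=30)  # the CO-PA "safety delta" described above

def records_to_extract(records, upload_start):
    """Select only records that are at least 30 minutes old at the start of
    the upload, tolerating clock skew between application servers."""
    cutoff = upload_start - SAFETY_DELTA
    return [r for r in records if r["timestamp"] <= cutoff]

now = datetime(2011, 8, 28, 12, 0, 0)
records = [
    {"id": 1, "timestamp": now - timedelta(hours=2)},     # extracted
    {"id": 2, "timestamp": now - timedelta(minutes=31)},  # extracted
    {"id": 3, "timestamp": now - timedelta(minutes=10)},  # left for the next delta
]
print([r["id"] for r in records_to_extract(records, now)])  # [1, 2]
```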
Please check the below links for more information:
Profitability analysis
FI-SL:
There are two types of ledgers in the FI-SL System:
Standard ledgers: delivered by SAP, e.g. General Ledger Accounting (FI-GL).
Special purpose ledgers: designed as per business needs (user-defined, e.g. FI-SL).
The FI-SL DataSource can supply data both at totals-record level and at line-item level.

How the data extraction happens?
Prerequisite:
Since FI-SL is a generating application, the DataSource, the transfer structure and the assignment of the DataSource to the InfoSource must be created manually.
FI-SL line items:
The line-item DataSource provides actual data at line-item level.
Full and delta mode: FI-SL line items can be extracted in both full and delta upload mode. The timestamp (TIMSTAMP field in the extract structure) is used to identify the delta, and is supplied from the CPUDT and CPUTM fields of the line-item table. The safety-delta concept is used here too, set to one hour: posted line items can be loaded into BW only after an hour.
Constraint:
The extract structure does not contain the BALANCE field. Refer note 577644 to find out alternative ways to populate this field.
FI-SL Totals Records:
This DataSource can provide both actual and plan data at totals-record level.
Full update: The full-update DataSource can be used to determine the balance carried forward, since the line-item DataSource does not supply this. Plan data is usually transferred using the totals DataSource in full update mode.
Delta update: The delta method can only be used for actual data (selection 0VTYPE = 010). It is based on delta queue technology: after initialization, the relevant data is posted to the delta queue during updating.
Before running the delta, please check the restrictions in the below link
Delta-Special Ledger

Part 3: Cross-application - Generic extractors
When none of the SAP-predefined extractors meets the business demand, the choice is to go for generic extraction.
We go for generic extraction when:
  1. Business Content does not include a DataSource for your application.
  2. Business Content requires additional enhancements that need data not supplied by SAP BW.
  3. The application does not feature its own generic data extraction method.
  4. The requirement demands using your own programs to fill your own tables in the SAP system.
Check the below link for more information:

Data recovery:

Scenario 1: The last delta run failed (not applicable to ALE-based DataSources).
Solution:
Set the QM status to red and delete the request from all targets.
Re-schedule the load; this time it will prompt a window as shown below.

Click on the request again; it will recover the failed request.
Scenario 2: The delta was running fine every day, but you suddenly find the delta is missing for a certain period (the reason may be anything).
Solution:
1. Reload the data from the PSA.
2. Reload the data from an ODS object or an InfoCube (in a layered architecture, EDW approach).
Applicable to Logistics:
Please refer to "One stage stop to know all about BW Extractors-Part1" to get an idea of Logistics extraction.
If options 1 and 2 are not applicable, the only choice is to extract the data from the source system.
Check this OSS note - 691721: Restoring lost data from a delta request.
Here again we have one more constraint: as explained in the above OSS note, with huge data volumes we can't bear the downtime of a re-initialization. We have a workaround:
1. In BW, transfer the existing target contents to an external store using Open Hub services.
2. Then selectively fill the setup tables with the missing data for the respective duration.
3. Run the initialization and schedule V3 jobs to enable delta postings.
(There is a drawback here: since we delete the setup tables and refill them using selections, the setup tables won't contain the entire history.)
Check this interesting document:
How to minimize downtime for delta initialization
Further reading - 602260: Procedure for reconstructing data for BW
In the case of an ODS, you can go for a repair full load (739863: Repairing data in BW).
Scenario 3:
Good data was accidentally deleted, and consequently all the data loaded later was deleted too. (Assuming no further data marts and no aggregates, to avoid complexity; if these are considered, the solution is more dynamic and situational.)
Check these links for more intricate details to handle the above situation.

One stage stop to know all about BW Extractors-Part1

Types of Extractors: 

Application specific BW content extractors:
Lo Extraction:
Logistics refers to the process of getting a product or service to its desired location upon request, which involves transportation, purchasing, warehousing, etc.
Main areas in logistics are:
Sales and Distribution (SD) : application 11, 13, 08 (in LBWE T-code)
Materials Management (MM) : application 03, 02
Logistics Execution (LE) : application 12
Quality Management : application 05
Plant Maintenance (PM) : application 04, 17
Customer Service (CS) : application 18
Project System (PS) : application 20
SAP Retail : application 40,43,44,45
How the data extraction happens?
Extraction can be done using either Full update/delta update.
Full load: In the case of a logistics application, a full/initialization load extracts data from the setup tables (which contain only historical data).
So if you have decided to go for a full load, wait a minute: there is a roadblock.
For a full update the data is taken from the setup tables, so in order to capture changes you would need to refill the setup tables every time, which is a laborious task.
It is therefore always advisable to go for delta loads, which make loading life easier.
Read the note below to get details on delta loads.
Initialization: Data is fetched from the application tables into the setup tables (in LO extraction, the extractor does not allow direct communication with the application tables), and from there the data finally reaches the target (InfoCube/ODS). Remember, this process is done only once.
Pre-requisites: Prior to initialization make sure the following steps are completed:
  1. Maintain Extract Structure
  2. Maintain data sources
  3. Activate Extract Structure
  4. Delete Setup tables
  5. Fill setup tables
Delta load: After a successful initialization, we can use delta updates to capture changed/new records.
Once a new transaction happens or an existing record is modified, it goes to the respective application table upon saving. From there, depending on the update mode (direct/queued/unserialized V3 update), the data is populated into the delta queue (RSA7) and finally reaches BW.

Pre-requisites: Prior to delta loads, make sure the following steps are completed:
1. Define periodic V3 update jobs. 2. Set up the update mode (direct/queued/unserialized V3 update).
LO- Delta Mode:
The InfoObject 0RECORDMODE helps in identifying the delta.
Check the field "Delta" in the ROOSOURCE/RODELTAM tables.
In the case of LO extraction it is "ABR".
ABR: An after image shows the status after the change; a before image shows the status before the change, with a negative sign; and a reverse image also carries a negative sign while marking the record for deletion. This method serializes the delta packets and supports updates into an ODS object as well as into an InfoCube.
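A tiny sketch of the ABR images for one changed record (the 0RECORDMODE values follow common SAP documentation: blank = after image, 'X' = before image, 'R' = reverse image; the documents and amounts are invented):

```python
# One record changes from amount 100 to 120: ABR sends a before/after image pair.
before_image = {"doc": "4711", "recordmode": "X", "amount": -100}
after_image  = {"doc": "4711", "recordmode": "",  "amount": 120}

# In an additive target (InfoCube), the before and after images net out
# to exactly the change:
net_change = before_image["amount"] + after_image["amount"]
print(net_change)  # 20

# A reverse image carries the negative amount and flags the record for deletion:
reverse_image = {"doc": "4712", "recordmode": "R", "amount": -50}
```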

FI Extraction:
FI Module deals with accounting and financial needs of an organization.
Financial Accounting is broken down into the following sub-modules:
  • Accounts Receivables
  • Accounts Payable
  • Asset Accounting
  • Bank Accounting
  • Consolidation
  • Funds Management
  • General Ledger
  • Special Purpose Ledger
  • Travel Management
Note: Only the key areas (AP/AR/GL/SL) are discussed briefly, because of the complexity of the area.
We can extract the financial data at totals level / line item level.
In general, we will use R/3 line item tables as the data source for extracting the data to allow drill down capability from summarized data to line-item details.
Financial Accounting data can be extracted directly from the tables.
Depending on the business requirement we can use either FI-SL or standard BW content extractors (FI-AR, FI-AP, and FI-GL) to fetch FI data.
Note: FI-SL will be discussed under "one stage stop to know all about BW Extractors -Part2 "which explains about Application specific customer generated extractors
FI-AR, FI-AP, and FI-GL:
General Ledger: All accounting postings will be recorded in General Ledger. These postings are real time to provide up-to-date visibility of the financial accounts.
Account Receivable: Accounts Receivables record all account postings generated as a result of Customer sales activity. These postings are automatically updated in the General Ledger
Accounts Payable: Accounts Payables record all account postings generated as a result of Vendor purchasing activity. Automatic postings are generated in the General Ledger as well.
Standard FI data sources:
0FI_GL_4 (G/L Accounts- line items)
Takes the data from the FI document tables (BKPF/BSEG) that are relevant to general ledger accounting (compare table BSIS).
0FI_AP_4 (AP line items) and 0FI_AR_4 (AR line items)
Selections are made from tables BSID/BSAD (Accounts Receivable) and BSIK/BSAK (Accounts Payable)
With the old G/L in R/3, the tables GLT0 (totals), BSEG and BKPF (line items) are filled; on the SAP BI side you have to use the following data model:

0FI_GL_1 --> 0FI_GL_1 --> 0FIGL_C01 (for Totals)
0FI_GL_4 --> 0FI_GL_4 --> 0FIGL_O02 (for Line item)

With the new G/L in R/3, the tables FAGLFLEXT (totals), FAGLFLEXA, BSEG and BKPF (line items) are filled; on the BI side you have to use the following data model:

0FI_GL_10 --> 0FI_GL_10 --> 0FIGL_C10 (for Totals)
0FI_GL_14 --> 0FIGL_O14 (for Line item)

Functionally, this affects other financial modules such as AP, AR and PCA (Profit Center Accounting).

For example, implementing the new G/L on the BI side fulfills most Profit Center Accounting requirements, so you do not have to implement the PCA module separately.
When I was on an FI implementation, there was no 0FI_GL_14 DataSource...

We had existing 0FI_GL_1 & 0FI_GL_4 flows and we implemented new GL totals flow i.e. 0FI_GL_10...

FAGLFLEXT --> 0FI_GL_10 --> 0FIGL_O10 --> 0FIGL_C10.... new GL Totals implementation was quite smooth, since this flow is completely different from old GL totals (GLT0 --> 0FI_GL_1 --> 0FIGL_C01).

We recreated existing (on 0FIGL_C01) queries on 0FIGL_C10 (&V10) and used jump targets (RRI) to old line item (0FIGL_O02) wherever required...

You can go ahead with new GL lineitems (FAGLFLEXA & BSEG & BKPF) --> 0FI_GL_14 --> 0FIGL_O14 in parallel with existing old one (BSEG & BKPF) --> 0FI_GL_4 --> 0FIGL_O02.
How the data extraction happens?
In FI extraction, 0FI_AR_4 and 0FI_AP_4 are linked with 0FI_GL_4 in order to maintain consistent data transfer from the OLTP system (this is called coupled data extraction; see OSS note 428571).
Note: "Uncoupled" extraction is possible with Plug-In PI 2002.2; see OSS note 551044.
0FI_GL_4 writes entries into the timestamp table BWOM2_TIMEST in the SAP R/3 system with a new upper limit for the timestamp selection.
0FI_AP_4 and 0FI_AR_4 then copy this new upper limit for their timestamp selection during the next data extraction in the SAP R/3 system. This ensures the proper synchronization of accounts payable and accounts receivable accounting with respect to G/L accounting.
Full load: Not a valid choice because of large volumes of detailed R/3 transaction data.
Delta load:
Note: Here the delta identification process works differently for new financial records and for changed financial records.
New Financial Accounting line items posted in the SAP R/3 system are identified by the extractor using the timestamp in the document header (table BKPF, field CPUDT).
By scheduling an initialization InfoPackage, all the historical data can be loaded into BW from the application tables. This also sets the "X" indicator in field LAST_TS (flag: 'X' = last timestamp interval of the delta extraction), meaning that the initialization was done after the last delta.

After this, daily delta loads can be carried out based on the timestamp by scheduling delta InfoPackages.
During a delta load, the SAP R/3 system logs two timestamps that delimit a selection interval for a DataSource in table BWOM2_TIMEST (fields TS_LOW and TS_HIGH).
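The TS_LOW/TS_HIGH bookkeeping can be modeled with a simple list; the field names follow the text above, but the logic (and the integer timestamps) are an illustrative assumption, not the real table maintenance:

```python
# BWOM2_TIMEST modeled as a list; each entry is one logged selection interval.
bwom2_timest = []  # entries: {"ts_low": ..., "ts_high": ..., "last_ts": ...}

def run_delta(now, safety_seconds=0):
    """Log a new selection interval starting where the last one ended."""
    ts_low = bwom2_timest[-1]["ts_high"] + 1 if bwom2_timest else 0
    ts_high = now - safety_seconds
    for entry in bwom2_timest:
        entry["last_ts"] = ""          # clear the previous "last interval" flag
    bwom2_timest.append({"ts_low": ts_low, "ts_high": ts_high, "last_ts": "X"})
    return ts_low, ts_high

run_delta(1_000)
run_delta(2_000)
print(bwom2_timest[-1])  # the newest interval carries the 'X' flag
```

Each run picks up exactly where the previous interval stopped, so no document timestamp is read twice or skipped.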

In the case of changed FI documents, selections are based on the tables BWFI_AEDAT and BWOM2_TIMEST (the timestamp table); see OSS note 401646 for more details.
Delta extraction using the delta queue method is also possible, in case we want:
  • serialization of the records
  • to distribute delta records to multiple BW systems.
FI -Delta Mode:
A timestamp on the line items serves to identify the status of the delta. Timestamp intervals that have already been read are stored in a timestamp table (BWOM2_TIMEST).
(The InfoObject 0RECORDMODE plays a vital role in deciding deltas. Check the field "Delta" in the ROOSOURCE/RODELTAM table to identify the image.)
The Financial Accounting line items are extracted from the SAP R/3 system in their most recent status (after-image delta method).
AIE: This delta method is not suitable for filling InfoCubes directly in the BW system. The line items must therefore first be loaded into an ODS object in the BW system, which identifies the changes made to individual characteristics and key figures within a delta data record. Other data targets (InfoCubes) can then be supplied with data from this ODS object.
It uses delta type E (pull), meaning the delta data records are determined during the delta update by the DataSource extractor, updated to the delta queue and passed on to BI directly from there.
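Why after images need an ODS in between can be shown with a toy model: the ODS keeps the active version per key and derives the additive change before anything reaches an InfoCube. The document number and amounts are invented for the sketch:

```python
# ODS holds the active version per key; the cube only ever receives derived deltas.
ods_active = {}   # document -> amount (active version)
cube_total = 0    # additive key figure in the InfoCube

def load_after_image(doc, amount):
    """Overwrite in the ODS, derive the delta, post only the delta to the cube."""
    global cube_total
    previous = ods_active.get(doc, 0)
    ods_active[doc] = amount           # ODS overwrites with the most recent status
    cube_total += amount - previous    # only the derived change reaches the cube

load_after_image("100001", 500)   # new line item
load_after_image("100001", 450)   # corrected to 450: the cube must not double-count
print(cube_total)  # 450
```

Loading the after images straight into an additive cube would have produced 950 instead; that is exactly the problem the ODS layer solves.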

Phases of a SAP Project (ASAP)



Phase I: Project Preparation 
  • Decision makers define clearly Project objectives and an effective decision making process.
  • Define Project organization and Roles.
  • Implementation scope is finalized
  • System Landscape and technical requirements are finalized
  • Infrastructure (Hardware/Interfaces)
  • High level Strategies for client
  • Archiving strategy
  • Issues of Data bases
  • Other issues like
  • Unanticipated tasks
  • Normal tasks that cannot be completed
  • External factors that need to be dealt with
Phase II: Blueprint Phase
  • Scope of the R/3 implementation is defined.
  • Business blueprint is created. The tool used for this is the ASAP Implementation Assistant:
  • Question and Answer Database (Q&A dB)
  • Business Process Master List (BPML)
  • R/3 Structure Modeler, Business Navigator and external modeling tools
  • Project Management (activities like…)
  • Conducting status meetings for the project team
  • Conducting steering committee meetings
  • Addressing issues like organizational change management
  • Other activities like
  • Project team training
  • Developing the system environment
  • Defining the org structure
  • Defining business processes
Phase III: Realization Phase
  • Configuring the R/3 System
  • Defining the Authorizations in R/3
  • Defining the work flow
  • Creating Use Documentation
  • System manager procedures
  • Developing the System Test plans
  • Define the Service level Commitment
  • Establishing the System Administration function
  • Setting up the Quality Assurance environment
Phase IV: Testing and Final Preparation
  • Testing, user training, system management and cutover activities
  • The test plan has the activities of:
  • Testing the conversion procedures and programs
  • Testing all interface programs
  • Volume and stress testing
  • Final user acceptance testing
  • Developing a final Go-Live strategy
  • Redirection to Go-Live
  • Preparation of end-user documentation
  • Training for the end users
  • Technical environment installation on the production system and testing
  • Transfer of legacy data and preparation plan for Go-Live
Phase V: GO-Live Phase
  • Production support facilities are carried out
  • Validations of Business process and their configuration
  • Follow up training for End Users
  • Sign–Off.

Process Chain Creation using SAP BW 3.5


Process chains are used extensively in SAP Business Warehousing for executing loads, running reports, etc., sequentially or in parallel.
There can be any number of steps within a process chain. You can also decide what has to be done in case a particular step fails; for example, you can send a notification email to the person concerned, who can correct the issue.

In this section we will see how to create a process chain using SAP BW 3.5. Using this process chain we will load data into an InfoCube.
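Conceptually, the chain behavior can be sketched in a few lines of Python; the step names and the notification hook are invented for illustration (real chains are built in transaction RSPC, as the steps below show):

```python
def run_chain(steps, notify):
    """Run steps in sequence; on the first failure, notify and stop the chain."""
    for name, step in steps:
        try:
            step()
        except Exception as exc:
            notify(f"Step '{name}' failed: {exc}")  # e.g. an email to the person concerned
            return False
    return True

def failing_load():
    raise RuntimeError("source system down")  # simulate a failed InfoPackage load

messages = []
steps = [
    ("start", lambda: None),
    ("delete_index", lambda: None),
    ("load_infopackage", failing_load),
    ("create_index", lambda: None),   # never reached: the chain stops on failure
]
ok = run_chain(steps, messages.append)
print(ok, messages)
```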

1) Execute transaction RSPC

2) Press 'Create', enter 'Name' and 'Description' of the process chain, press 'Continue'

3) First step in the process chain will be a 'Start' step,
press 'Create' to create a process variants

4) Enter 'Name' and 'Description' of the process variant

5) 'Save' process variant and go back, 'Change selections' is used to schedule the process chain, we will revisit this again later

6) Press 'Continue'
7) Select the InfoPackage which is used to load data in
Infocube, drag InfoPackage to right panel

8) Select the process variant as InfoPackage using function key F4

9) Select the InfoPackage ZAAA and press 'Continue'

10) Following steps will be added in the process chain
11) Join 'Start' step with step 'Indexes' by selecting 'Start' step and
dragging the line to 'Indexes'

12) Go back to the 'Start' step, right-click and choose 'Maintain Variant'

13) Select 'Immediate' to schedule the process chain immediately

14) 'Check' the process chain and 'Activate' the process chain

15) Execute the process chain, enter the server name in the following screen

16) Process chain will start executing, to check the status, Press

17) Select 'Yesterday and Today' to check the status of the process chain

18) Following screen shows that load is still not completed


19) Once completed, the status will appear as follows;
in case there is any failure, the status will appear as cancelled

SAP R/3 or ECC DataSources for SAP BI 7.0


Financial Accounting: General Ledger:FI-GL (BKPF, BSEG TABLES)
0FI_GL_1:General ledger: Transaction figures
0FI_GL_2:General ledger: Transaction figures - Cost of sales ledger
0FI_GL_6:General Ledger Sales Figures via Delta Extraction
0FI_GL_7:General Ledger Cost of Sales Ledger via Delta Extraction
0FI_GL_8:General Ledger: Statistical Key Figures
0FI_GL_10:General Ledger: Leading Ledger Balances
0ACAC_CALC:Manual Accruals: Calculated Accrual Values
0FI_TX_4:Taxes: Line Items by Means of Delta Extraction
0ACE_CALC_RESULT:Accrual Engine: Calculated Accrual Values

Financial Accounting: Customers:FI-AR (BSID, BSAD TABLES)
0FI_AR_1:Customers: Transaction figures
0FI_AR_2:Customers: Item
0FI_AR_3:Customers: Line Items
0FI_AR_4:Customers: Line Items with Delta Extraction
0FI_AR_5:Customers: Payment History
0FI_AR_6:Customer Sales Figures via Delta Extraction
0FI_AR_7:Customer SGL Sales Figures via Delta Extraction
0FI_AR_8:Customer Credit Managemt. Central Data via Delta Extraction
0FI_AR_9:Customer Credit Managmt. Cntrl Area Data via Delta Extractn
0FI_AR_10:Customer Payment Behavior via Delta Extraction

Financial Accounting: Vendors:FI-AP (BSIK, BSAK TABLES)
0FI_AP_1:Vendors: Transaction figures
0FI_AP_2:Vendors: Item
0FI_AP_3:Vendors: Line Items
0FI_AP_4:Vendors: Line Items with Delta Extraction
0FI_AP_6:Vendor Sales Figures via Delta Extraction
0FI_AP_7:Vendor SGL Sales Figures using Delta Extraction

Financial Accounting: Asset Accounting:FI-AA
0FI_AA_001:ANNUAL VALUES
0FI_AA_002:POSTED DEPRECIATION
0FI_AA_003:REVALUATION/YEAR
0FI_AA_004:INVESTMENT SUPPORT
0FI_AA_005:FI-AA: ACQUISITIONS
0FI_AA_006:TRANS. W/ VAL.ADJ.
0FI_AA_11:FI-AA: Transactions
0FI_AA_12:FI-AA: Posted Depreciations

Master Data Financial Accounting in General:FI-IO
0GL_ACCOUNT_ATTR:Account Number
0GL_ACCOUNT_TEXT:Account Number
0ACCOUNT_0109_HIER:Account Number
0ACCT_TYPE_TEXT:Account Type
0BUS_AREA_TEXT:Business Area
0BUS_AREA_ATTR:Business Area
0CHRT_ACCTS_TEXT:Chart of Accounts
0CHRT_ACCTS_ATTR:Chart of Accounts
0COMPANY_TEXT:Company Code
0COMP_CODE_TEXT:Company Code
0COMP_CODE_ATTR:Company Code
0AC_DOC_TYP_TEXT:Document Type
etc..

Master Data Financial Accounting: Asset Accounting:FI-AA-IO
0ASSET_CLAS_TEXT:Asset Class
0ASSET_ATTR:Asset Subnumber
0ASSET_TEXT:Asset Subnumber
0ASSET_ATTR_TEXT:Asset Subnumber with Description
0DEPRAREA_TEXT:Depreciation area real or derived
0ASSET_AFAB_ATTR:Depreciation Area Real or Derived
etc...

Materials Management:MM-IM
2LIS_03_BX:Stock Initialization for Inventory Management
2LIS_03_BF:Goods Movements From Inventory Management
2LIS_03_UM:Revaluations
etc..

Materials Management:MM-PUR
2LIS_02_ACC:Purchasing Data (Account Level)
2LIS_02_HDR:Purchasing Data (Header Level)
2LIS_02_ITM:Purchasing Data (Item Level)
2LIS_02_SCL:Purchasing Data (Schedule Line Level)
2LIS_02_S011:Purchasing groups
etc...

Materials Management Master Data:MM-IO
0DISMM_ATTR:MRP Type
0DISMM_TEXT:MRP Type
0MATL_GROUP_TEXT:Material Group
0MAT_VEND_ATTR:Material Number Compounded to vendor
0MATL_CAT_TEXT:Material Type
0WM_MVT_TYP_TEXT:Movement type for Warehouse Management
0PUR_GROUP_TEXT:Purchasing Group
0PURCH_ORG_TEXT:Purchasing Organization
0PUR_REASON_TEXT:Reason for Ordering
0VENDOR_ATTR:Vendor Number
0VENDOR_TEXT:Vendor Number
etc.....

Types of DSOs in SAP BI and a Few Terminologies in new release


SAP BI Terminology has been changed: With SAP NetWeaver 7.0, the following terminology changes have been made in the area of Warehouse Management:
The Administrator Workbench is now called Data Warehousing Workbench.
The ODS object is now called DataStore object.
The transactional ODS object is now called DataStore object for direct update.
The transactional InfoCube is now called real-time InfoCube.
The RemoteCube, SAP RemoteCube and virtual InfoCube with services are now referred to as VirtualProviders.
The monitor is now called the extraction monitor, to distinguish it from the other monitors.
OLAP statistics are now called BI Runtime Statistics.
The reporting authorizations are now called analysis authorizations. We use the term standard authorizations to distinguish authorizations from the standard authorization concept for SAP NetWeaver from the analysis authorizations in BI.
Note: You may still come across instances of the old terminology in the documentation (SAP Help).


Types of DataStore Objects (DSOs) : -





1.Standard DSO : -











2. Write Optimized DSO:-












3. DSO for Direct Update: -




Info @ a Glance: The DataStore object for direct update differs from the standard DataStore object in terms of how the data is processed. In a standard DataStore object, data is stored in different versions (active, delta, modified), whereas a DataStore object for direct update contains data in a single version. Therefore, data is stored in precisely the same form in which it was written to the DataStore object for direct update by the application. In the BI system, you can use a DataStore object for direct update as a data target for an analysis process. More information: Analysis Process Designer(APD).
The DataStore object for direct update is also required by diverse applications, such as SAP Strategic Enterprise Management (SEM), as well as by other external applications.
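The version difference described above can be sketched with plain dictionaries (an illustrative model only; real DSOs are database tables, and the key and amounts here are invented):

```python
# A standard DSO keeps several versions of the data, as described above:
standard_dso = {
    "modified": {"C100": 450},                     # new data, not yet activated
    "active":   {"C100": 500},                     # active data, used for reporting
    "delta":    [("C100", -500), ("C100", 450)],   # change log for downstream deltas
}

# A DSO for direct update holds exactly one version: the data precisely as
# the application wrote it.
direct_update_dso = {"C100": 450}

print(len(standard_dso), len(direct_update_dso))  # 3 1
```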

The procedure to create a direct-update DSO is the same as for a standard DSO, but under Settings on the DSO creation screen, choose Type of DataStore Object --> Direct Update.
If you want to enter data manually, go to transaction RSINPUT. It looks like this:
Note: Not suitable for productive use; only for creating test data and rapid prototyping.






Just click on Change/Create; it allows you to enter records (entries) and displays them at the top.
The DataStore object for direct update is available as an InfoProvider in the BEx Query Designer and can be used for analysis purposes.