Notes
Why we need SAP BW
SAP Business Information Warehouse (SAP BW) is a key business component of the SAP framework. It allows business reporting and decision support that is both user-friendly and sophisticated. SAP BW is SAP's end-to-end data warehousing solution: it is used to build reports and to extract data from sources such as Oracle, Informatica, Excel or XML files, or SAP tables. SAP BI is the data warehouse solution provided by SAP AG; SAP itself is an ERP solution. SAP stands for Systems, Applications and Products, and BI stands for Business Intelligence.
SAP BIW/BW/BI major releases:
Version 2.0 - released 1999
Version 3.0 - released 2001
Version 3.1 - released 2002
Version 3.5 - released 2004
Version 7.x - released 2005 onwards
SAP is mainly used to accelerate processes, to reduce cost and to automate processes. SAP BI follows a multi-dimensional data modeling concept; BI is designed around the star schema and the extended star schema.
Data extraction means fetching data from source systems. There are two types of source systems: (1) SAP source systems and (2) non-SAP source systems. Data is fetched from SAP to SAP using the ALE technique, and from non-SAP systems into BW using BAPI techniques.
BEx Designer/Analyzer is the reporting tool used to generate reports in BI; BEx Browser is used to view the reports.
SAP BI: the main aim and goal of SAP BI is to offer a complete end-to-end data warehouse solution.
Definition: a data warehouse is a subject-oriented, integrated, non-volatile and time-variant collection of data in support of management decisions.
Subject oriented: the data deals with customers, sales, deliveries, bills, etc., and is organized around a major business purpose. A classical example is subject-specific analysis of data, such as sales customers of the SD application.
Integrated: data from different subject areas can be rationalized or grouped with each other.
Non volatile: data in the data warehouse is not updatable; once a record is properly placed in BW it is not subject to change.
Time variant: data in BW is specific to a moment in time, i.e. each record contains a time stamp (date and time).
Functions of a data warehouse: there are 3 important functions in BW/BI: 1. Data extraction, 2. Data modeling, and 3. Reporting.
What is the Star Schema for Data Warehouse Design?
A Star Schema: Most business intelligence warehouses use what is called a dimensional model, in which a basic fact table of data (e.g. sales or support calls) is surrounded by, and linked with, other tables holding the dimensions of the fact table.
Another example/definition of star schema: a fact table surrounded by dimension tables resembles a star, so the schema is called a star schema. The fact table contains all operational attributes, is large in size, and contains duplicate records, i.e. transaction data. The dimension tables contain descriptive attributes, are small in size compared to the fact table, and have no redundancy, i.e. master data.
This particular fact table has four main dimensions: Customer, Time, Product and Staff. These dimensions are linked to the fact table through indexes, which enable tables to be joined so that fast queries, reports and data consolidations can be carried out. For example: how many transactions for product X have we had this quarter? This data model (schema) is simple, allows fast retrieval, and can readily be extended without changing the existing standard reports and queries. The disadvantage is that there is some data redundancy, which could cause inconsistency if not all of the redundant data is kept up to date.
Extended Star Schema: Many of the problems associated with the basic star schema are resolved by the BW extended star schema. With the extended star schema, attributes are removed from the dimensions and placed outside the InfoCube in master data tables.
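The quarterly-count question above can be sketched as a tiny in-memory star schema. The table and column names here are invented for illustration and are not BW's actual table layout:

```python
import sqlite3

# Minimal star schema sketch (hypothetical names): one fact table
# joined to dimension tables via surrogate keys.
conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
c.execute("CREATE TABLE dim_time (time_id INTEGER PRIMARY KEY, quarter TEXT)")
c.execute("CREATE TABLE fact_sales (product_id INTEGER, time_id INTEGER, amount REAL)")
c.executemany("INSERT INTO dim_product VALUES (?, ?)",
              [(1, "Product X"), (2, "Product Y")])
c.executemany("INSERT INTO dim_time VALUES (?, ?)",
              [(1, "2023-Q1"), (2, "2023-Q2")])
c.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
              [(1, 2, 100.0), (1, 2, 250.0), (2, 2, 75.0), (1, 1, 40.0)])

# "How many transactions for Product X have we had this quarter?"
c.execute("""SELECT COUNT(*) FROM fact_sales f
             JOIN dim_product p ON f.product_id = p.product_id
             JOIN dim_time t ON f.time_id = t.time_id
             WHERE p.name = 'Product X' AND t.quarter = '2023-Q2'""")
count = c.fetchone()[0]
print(count)  # 2
```

Because the dimension values are stored once and referenced by key, adding a new dimension attribute does not require touching the fact table, which is why the schema extends so easily.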
The BW extended star schema differs from the basic star schema: it is divided into a solution-dependent part (the InfoCube) and a solution-independent part (attribute, text and hierarchy tables) which is shared among other InfoCubes. In BW, attributes located in the dimensions are called characteristics; attributes located in a master data table of a characteristic are called attributes of the characteristic. When designing a solution, it is a great challenge to decide whether an attribute should reside in a dimension table (and thus in the InfoCube), in a master data table, or even in both. Data is loaded separately into the master data (attribute), text and hierarchy tables. The SID table provides the link between the master data and the dimension tables. The fact table and the relevant dimension tables of an InfoCube are connected with one another relationally using dimension keys; the dimension key is provided by the system per characteristic combination in a dimension table. When a query is executed, the OLAP processor checks the dimension tables of the InfoCube being evaluated for the characteristic combinations required in the selection; the dimension keys determined in this way point the way to the information in the fact table. A dimension table can hold a maximum of 248 characteristics. The Time dimension holds the time characteristics needed for analysis. The Unit dimension contains the unit of measure and currency characteristics needed to describe the key figures properly. The Data Packet dimension is used to identify discrete packets of information loaded into the InfoCube; in this way, packets can be deleted, reloaded or maintained individually.
Extended star schema means a fact table surrounded by 16 dimension tables; with the extended star schema we can analyze data from 16 angles. It supports multiple languages, external hierarchies and faster access via numeric keys, and master data is reusable.
Multidimensional Data Model
The multidimensional data model views data as a cube. The cube on the left contains detailed sales data by product, market and time. The cube on the right associates sales numbers (units sold) with the dimensions product type, market and time, with the unit variables organized as cells in an array. This cube can be expanded to include another array, price, which can be associated with all or only some dimensions. As the number of dimensions increases, the number of cube cells increases exponentially.
Dimensions are hierarchical in nature: a time dimension may contain hierarchies for year, quarter, month, week and day; a geography dimension may contain country, state, city, etc.
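A toy version of such a cube with a time hierarchy might look like the following sketch; all product names, markets and figures are invented for illustration:

```python
# Conceptual sketch of a small sales cube: cells keyed by
# (product, market, month); a time hierarchy rolls months up to quarters.
sales = {
    ("TV", "US", "2023-01"): 10, ("TV", "US", "2023-02"): 7,
    ("TV", "US", "2023-04"): 5,  ("PC", "EU", "2023-01"): 3,
}

def quarter(month):
    # Time hierarchy: month -> quarter (e.g. "2023-02" -> "2023-Q1").
    m = int(month[5:7])
    return month[:4] + "-Q" + str((m - 1) // 3 + 1)

# Roll units sold up one hierarchy level (month -> quarter).
by_quarter = {}
for (product, market, month), units in sales.items():
    key = (product, market, quarter(month))
    by_quarter[key] = by_quarter.get(key, 0) + units

print(by_quarter[("TV", "US", "2023-Q1")])  # 17
```

Drilling down is simply the reverse navigation: from the quarter total back to the individual month cells that contribute to it.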
Dimension Tables A star schema stores all of the information about a dimension in a single table. Each level of a hierarchy is represented by a column or column set in the dimension table. A dimension object can be used to define the hierarchical relationship between two columns (or column sets) that represent two levels of a hierarchy; without a dimension object, the hierarchical relationships are defined only in metadata. Attributes are stored in columns of the dimension tables. Fact Tables Measures are stored in fact tables. Fact tables contain a composite primary key, which is composed of several foreign keys (one for each dimension table) and a column for each measure that uses these dimensions. The following table summarizes the major differences between OLTP and OLAP system design.
OLTP System (Online Transaction Processing - operational system) vs. OLAP System (Online Analytical Processing - data warehouse)

Source of data
  OLTP: Operational data; OLTPs are the original source of the data.
  OLAP: Consolidation data; OLAP data comes from the various OLTP databases.
Purpose of data
  OLTP: To control and run fundamental business tasks.
  OLAP: To help with planning, problem solving, and decision support.
What the data reveals
  OLTP: A snapshot of ongoing business processes.
  OLAP: Multi-dimensional views of various kinds of business activities.
Inserts and updates
  OLTP: Short and fast inserts and updates initiated by end users.
  OLAP: Periodic long-running batch jobs refresh the data.
Queries
  OLTP: Relatively standardized and simple queries returning relatively few records.
  OLAP: Often complex queries involving aggregations.
Processing speed
  OLTP: Typically very fast.
  OLAP: Depends on the amount of data involved; batch data refreshes and complex queries may take many hours; query speed can be improved by creating indexes.
Space requirements
  OLTP: Can be relatively small if historical data is archived.
  OLAP: Larger due to the existence of aggregation structures and history data; requires more indexes than OLTP.
Database design
  OLTP: Highly normalized with many tables.
  OLAP: Typically de-normalized with fewer tables; use of star and/or snowflake schemas.
Backup and recovery
  OLTP: Backup religiously; operational data is critical to run the business, and data loss is likely to entail significant monetary loss and legal liability.
  OLAP: Instead of regular backups, some environments may consider simply reloading the OLTP data as a recovery method.
Another set of differences between OLTP and OLAP:
OLTP: current data; short database transactions; online update/insert/delete; normalization is promoted; high-volume transactions; transaction recovery is necessary.
OLAP: current and historical data; long database transactions; batch update/insert/delete; de-normalization is promoted; low-volume transactions; transaction recovery is not necessary.
SAP BW-ARCHITECTURE The SAP Business Information Warehouse allows you to analyze data from operative SAP applications as well as all other business applications and external data sources such as databases, online services and the Internet. The Administrator Workbench functions are designed for controlling, monitoring and maintaining all data retrieval processes.
The SAP Business Information Warehouse enables Online Analytical Processing (OLAP), which processes information from large amounts of operative and historical data. OLAP technology enables multi-dimensional analyses from various business perspectives. The Business Information Warehouse Server for core areas and processes, pre-configured with Business Content, ensures you can look at information within the entire enterprise. In selected roles in a company, Business Content offers the information that employees need to carry out their tasks. As well as roles, Business Content contains other pre-configured objects such as Info Cubes, queries, key figures, and characteristics, which make BW implementation easier. With the Business Explorer, the SAP Business Information Warehouse provides flexible reporting and analysis tools for analyses and decision-making support in your enterprise. These tools include query, reporting and OLAP functions. As an employee with access authorization, you can evaluate past or current data on various levels of detail and from different perspectives not only on the Web but also in MS Excel.
Types of attributes
Display attribute, navigational attribute, time-dependent attribute, time-independent attribute, compounding attribute, exclusive attribute, transitive attribute.
Basically there are 2 types of attributes:
1. Display attributes: these can be used to display additional information in reports and queries; they are used for display purposes only. They can be time-dependent or time-independent.
2. Navigational attributes: these can be used for display as well as for navigation and filtering in BEx queries. They can also be time-dependent or time-independent.
Time characteristics
We cannot create a time characteristic in SAP; it is a Business Content object.
Slowly changing dimensions
A dimension that changes over time, e.g. a sales person changing job title over the years.
Data types
1. CHAR A-Z
2. NUMC 0-9
3. DATS YYYYMMDD
4. TIMS HHMMSS
Namespace for BW
SAP-delivered tables begin with /BI0; customer tables begin with /BIC. All SAP objects start with 0; customer objects start with A-Z. All generated objects (like export DataSources) start with 1-8.
Modeling
Modeling is defining the InfoObjects, which are the core building blocks of SAP BW; they define the business process and are used for reporting and analysis.
Modeling features
Metadata modeling provides a Metadata Repository where all the metadata is stored, and a Metadata Manager that handles all requests for retrieving, adding, changing or deleting metadata.
Reporting and scheduling mechanism: reporting and scheduling are processes required for the smooth functioning of SAP BW. The various batch processes in SAP BW need to be planned to provide timely results, to avoid resource conflicts caused by running too many jobs at a time, and to take care of logical dependencies between different jobs. These processes are controlled in the scheduler component of the Administrator Workbench (AWB), either by scheduling single processes independently or by defining process chains for complex networks of jobs required to update the information available in the SAP BW system. The Reporting Agent controls execution of queries in batch mode to print reports, identify exception conditions, notify users, and pre-compute results for web templates.
Administering the ETL service layer at multi-tier level: SAP's ETL service layer provides services for data extraction, data transformation and loading of data. It also serves as the staging area for intermediate data storage for quality assurance purposes. The extraction technology of SAP BW is supported by the database management systems of mySAP technology and does not allow extraction from other database systems such as IBM IMS and Sybase; nor does it support dBase, MS Access or MS Excel file formats directly. However, it provides all the functionality required for loading data from non-SAP systems, as the ETL services layer provides open interfaces for loading non-SAP data.
Attributes
An attribute is also an InfoObject, one that gives more meaning to another InfoObject. E.g.:
If customer number is an InfoObject, the attributes that add more meaning to it are address, name, region, etc.
Characteristics
Any descriptive element is called a characteristic, e.g. customer number, customer name, customer address.
Key figures
Any quantitative measure is called a key figure.
Tables for master data
The tables created for master data are:
P table: time-independent display attributes
Q table: time-dependent display attributes
M view: combines P and Q
X table: time-independent navigational attributes
Y table: time-dependent navigational attributes
Dimensions
A dimension contains characteristics. Types of dimension in an InfoCube include the line-item dimension, the slowly changing dimension, and the time dimension. A cube has 16 dimensions; of these, 3 (unit, time, and data packet/request) are SAP-defined and 13 are customer-defined.
Max no. of key figures
An InfoCube can hold up to 233 key figures.
SID
SID stands for surrogate ID; its main advantage is converting alphanumeric keys into numeric keys for faster data access. The SID table is the interface between the master data and the dimension tables.
SID advantages:
- Uses numeric values as indexes for faster access
- Master data is independent of InfoCubes
- Language support
- Slowly changing dimension support
Max no. of characteristics
A dimension can hold a maximum of 248 characteristics.
There are two types of cubes:
1. Physical cubes - contain data
2. Virtual cubes - contain no data
Physical cubes are again of two types:
1. Basic cube - for existing data
2. Transactional cube - for forecasting, e.g. SAP BPS (Business Planning and Simulation)
Virtual InfoCubes are again of three types:
1. SAP RemoteCube - for other SAP modules such as CRM
2. General (non-SAP) RemoteCube - for other databases
3. Virtual InfoCube with services - for a particular function module
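The SID mechanism can be illustrated with a minimal sketch, assuming a simplified one-column lookup table (real BW SID tables hold more fields):

```python
# Sketch of the SID idea: replace alphanumeric master-data keys with
# numeric surrogate IDs so the cube tables index on integers.
sid_table = {}      # characteristic value -> numeric SID
next_sid = [1]

def get_sid(value):
    # Assign a new SID on first sight; reuse it afterwards.
    if value not in sid_table:
        sid_table[value] = next_sid[0]
        next_sid[0] += 1
    return sid_table[value]

# Master data stays independent of the InfoCube; the cube stores only SIDs.
customers = ["CUST-A17", "CUST-B42", "CUST-A17"]
sids = [get_sid(c) for c in customers]
print(sids)  # [1, 2, 1] - the repeated value reuses its SID
```

Because the cube never stores the alphanumeric value itself, the master data (texts, attributes, hierarchies) can change or be translated without touching the fact or dimension tables.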
Who makes the technical and functional specifications?
Technical specification: here we list all the BW objects (InfoObjects, DataSources, InfoSources and InfoProviders), describe the data flow and the behavior of the data load (delta or full), and can state the duration of cube activation or creation. Purely technical BW details go in this document; it is not an end-user document.
Functional specification: here we describe the business requirements; that is, which business areas we are implementing (SD, MM, FI, etc.), and the KPIs and deliverable reports for the users. This document is shared between functional consultants and business users, and is applicable for end users as well.
An SAP BW functional consultant is responsible for the following key tasks:
- Maintain project plans
- Manage all project activities, many of which are executed by resources not directly managed by the project leader (central BW development team, source system developers, business key users)
- Work with key users to agree reporting requirements and report designs
- Translate requirements into design specifications (report specs, data mapping/translation, functional specs)
- Write and execute test plans and scripts
- Coordinate and manage business/user testing
- Deliver training to key users
- Coordinate and manage productionization and rollout activities
- Track CIP (continuous improvement) requests; work with users to prioritize, plan and manage CIP
An SAP BW technical consultant is responsible for:
- SAP BW extraction using standard data extractors and the available development tools for SAP and non-SAP data sources
- SAP ABAP programming for BW
- Data modeling: star schema, master data, ODS and cube design in BW
- Data loading processes and procedures (performance tuning)
- Query and report development using BEx Analyzer and Query Designer
- Web report development using the Web Application Designer
Administrator Workbench: the Administrator Workbench is the tool used to maintain, control and monitor the Business Information Warehouse, and is therefore used within all layers of the BW system. Its tasks include:
- Design of all components of the Business Information Warehouse, including customizing/maintenance
- Scheduling the data load from several data sources
- Executing the data load
- Monitoring the data load and the data update
SAP R/3 systems, strategic initiative products and external systems are represented as OLTP systems in the lower section. SAP delivers production data extractors that prepare SAP R/3 OLTP data per application for extraction into the Business Information Warehouse. For non-SAP OLTP systems, the BW BAPI interface allows third-party extraction tools to prepare and extract data. Metadata and application data are managed on the Business Information Warehouse server. You manage the various source systems using the Administrator Workbench of the Business Information Warehouse. In addition, you schedule and monitor the transfer of metadata and transaction data from the assigned legacy systems using the components of the Administrator Workbench: the aptly named Scheduler and Monitor. The third layer is the data presentation layer, consisting of the Business Explorer tool set, the ODBO interface for third-party reporting products, and Web browser access to BW information. The individual areas will be explained in detail in the following units.
As components of the Business Framework, OLTP applications and the Business Information Warehouse communicate using Business Application Programming Interfaces (BAPIs).
The top layer is the reporting environment; it can be the Business Explorer (BEx) or a third-party reporting tool. BEx consists of two components:
1. BEx Analyzer: Microsoft Excel with a BW add-in; it allows users to create queries without coding SQL statements.
2. BEx Browser: works much like an information center, allowing users to organize and access all kinds of information.
The middle layer, the BW server, carries out 3 tasks: administering the BW system, storing data, and retrieving data according to users' requests. Its components include:
- Administrator Workbench: maintains metadata and all BW objects. It has two components: BW Scheduler, for scheduling jobs to load data, and BW Monitor, for monitoring the status of data loads.
- Metadata Repository: contains information about the data warehouse (metadata is data about data). It holds two types of metadata: business-related (e.g. definitions and descriptions used for reporting) and technical (structures and mapping rules used for data extraction and transformation).
- Staging Engine: implements data mapping and transformation. Triggered by the BW Scheduler, it sends requests to a source system for data loading, and the source system then selects and transfers data into BW. The Staging Engine consists of several processes that gather data from data sources, clean and transform the data, and populate InfoCubes.
The staging process can be described in 5 steps:
1. SAP R/3 OLTP prepares the needed data using the extract structure.
2. The extractor in SAP R/3 OLTP moves the data into the transfer structure.
3. Using ALE technology, SAP R/3 copies the data to SAP BW in the form of the transfer structure.
4. Using the transfer rules defined in SAP BW, the SAP R/3 data is transferred into the communication structure.
5. The update rules in SAP BW update InfoCubes from the communication structure.
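Step 4 of the staging process (transfer rules mapping the transfer structure into the communication structure) can be sketched roughly as follows; the field names and rules are invented for illustration:

```python
# Records arrive in a transfer-structure layout; transfer rules map
# and cleanse them into the communication-structure layout.
transfer_records = [
    {"KUNNR": "c100 ", "NETWR": "250.00"},
    {"KUNNR": "C200",  "NETWR": "80.50"},
]

# One rule per target field of the communication structure.
transfer_rules = {
    "CUSTOMER":  lambda r: r["KUNNR"].strip().upper(),  # cleanse key
    "NET_VALUE": lambda r: float(r["NETWR"]),           # type conversion
}

communication_records = [
    {field: rule(rec) for field, rule in transfer_rules.items()}
    for rec in transfer_records
]
print(communication_records[0])  # {'CUSTOMER': 'C100', 'NET_VALUE': 250.0}
```

The point of the two layouts is decoupling: the transfer structure mirrors the source system, while the communication structure is source-independent, so a new source only needs new transfer rules.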
InfoCube: InfoCubes are made up of a number of InfoObjects. All InfoObjects (characteristics and key figures) are available independently of any particular InfoCube. Characteristics relate to master data with their attributes and text descriptions. An InfoCube consists of several InfoObjects and is structured according to the star schema: there is a (large) fact table that contains the key figures of the InfoCube, as well as several (smaller) dimension tables surrounding it. The characteristics of the InfoCube are stored in these dimensions. An InfoCube fact table only contains key figures, in contrast to an ODS object, whose data part can also contain characteristics. The dimensions and the fact table are linked to one another via abstract identification numbers (dimension IDs) in the key part of the particular database table; as a result, the key figures of the InfoCube relate to the characteristics of the dimensions. The characteristics determine the granularity (degree of fineness) at which the key figures are managed in the InfoCube. Characteristics that logically belong together (district and area, for example, belong to the regional dimension) are grouped together in a dimension. By adhering to this design criterion, dimensions are to
a large extent independent of each other, and dimension tables remain small in data volume, which is desirable for performance reasons. This InfoCube structure is optimized for reporting.
ODS: the ODS offers overwrite capability for key figures and characteristics, and stores data at document level. It is not possible to overwrite data in an InfoCube: whenever data is added to the InfoCube it is aggregated. Data can be overwritten in the ODS, and this provides a significant capability to BW. Prior to the existence of the ODS, decisions on granularity were based solely on the data in the InfoCube. Now the InfoCube can be less granular, with data held for a longer period of time, while the ODS can be very granular but hold data for a shorter period of time. ODS data can be reported on in a variety of ways. From the PSA we have the possibility to upload the data to the ODS (Operational Data Store). The ODS can have one layer, but depending on the business scenario it can be structured with multiple levels. Thus ODS objects offer data that is subject-oriented, consolidated and integrated with respect to the same process on different source systems. Data from the ODS can be updated into the appropriate InfoCubes or other ODS objects. Reporting on the ODS can be done with the OLAP processor or directly with an ODS query.
InfoObject: business evaluation objects are referred to as InfoObjects in BW. There are different types of InfoObjects: characteristics, key figures, units, time characteristics, and technical characteristics. InfoObjects are the smallest components in BW; they are used to structure the information needed to create larger BW objects, such as InfoCubes or ODS objects. InfoObjects with attributes or texts can themselves be data targets or InfoProviders.
Data Extraction:
The Extract Source Structure is used to stage the retrieved data temporarily in the source system. Data is moved from the extract structure into the transfer structure on the source system. By means of ALE or TRFC data is then transferred from the source system into the BW system. An Info Source is a set of logically associated information. Info Sources can contain transaction data (this data is stored in Info Cubes) and master data (attributes, texts, and hierarchies - this data is stored in separate tables). Info Sources describe all the information available for a business transaction or a type of business transaction (e.g. cost center accounting). Transfer structures support the transfer of data in a Data Source between a source system and the associated SAP BW System. The transfer structure transports the Data Source data from a source system to an SAP BW System and passes it on to the Info Source using transfer rules. The Communication Structure is independent of the source system and is generated from the Info Source. It is filled from the transfer structure in accordance with the transfer rules. The communication structure contains all the fields in an Info Source. Each Info Cube can have one set of update rules that determines how the data in the Info Source is stored in the Info Cube.
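The update-rule step can be sketched roughly as follows, assuming invented field names: key figures from the communication structure are aggregated into the fact table by characteristic combination rather than overwritten (the InfoCube behavior described above, in contrast to the ODS):

```python
# Communication-structure records feeding an InfoCube fact table.
comm_records = [
    {"CUSTOMER": "C100", "MONTH": "2023-01", "NET_VALUE": 250.0},
    {"CUSTOMER": "C100", "MONTH": "2023-01", "NET_VALUE": 80.5},
    {"CUSTOMER": "C200", "MONTH": "2023-01", "NET_VALUE": 10.0},
]

fact_table = {}
for rec in comm_records:
    key = (rec["CUSTOMER"], rec["MONTH"])   # stands in for the dimension keys
    # Update rule: sum the key figure per characteristic combination.
    fact_table[key] = fact_table.get(key, 0.0) + rec["NET_VALUE"]

print(fact_table[("C100", "2023-01")])  # 330.5 - summed, not overwritten
```

An ODS-style update rule would instead replace the stored value for the key, which is exactly the overwrite capability the ODS section describes.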
Business Content:
The BW DataSources help prepare the data that is to be updated in the Business Information Warehouse. The InfoSource with a DataSource allows the BW system to: import the extracted data from the extraction structure in the SAP R/3 OLTP system; transfer the data to the BW transfer structure using the OLTP system transfer structure; convert the data using the transfer rules in BW; and transfer the data to the communication structure in BW. The InfoSource prepares the data for permanent storage in the InfoCube.
FI-SL DataSources cannot be delivered by SAP with BW Business Content for non-standard ledgers, because the ledgers and their FI-SL totals tables may link to user-defined fields and structures. All other FI DataSources (FI-GL, FI-AR, FI-AP) are delivered with Business Content. To provide an FI special ledger DataSource for BW, an extract structure (= transfer structure) must be generated for the FI-SL totals table in question. You then define the DataSource for the totals table's ledger. The resulting DataSources transfer the totals records from FI-SL; no single line items are transferred. Generate a transfer structure for your totals tables and use this structure to transfer transaction data from your ledgers into the Business Information Warehouse. The transfer structure contains fields from the totals table, fields from the object table, and derived fields from Customizing. The system derives the name of the transfer or extract structure from the name of the totals table according to the following rule: if the name of the table ends in a T, the T is replaced with a B; if the name does not end in T, a B is added to the end of the name of the totals table. If the same totals table is regenerated, the existing structure is overwritten. Next, a DataSource for a ledger is defined and assigned to the ledger. When you carry out this step, all non-standard ledgers that already have extract structures for their totals tables are displayed first.
Formulated differently, a ledger may not be displayed in step 2 because the ledger is a standard ledger (its name starts with a digit, for example 00 = general ledger, 0F = sales costs), and there are Business Content DataSources for these.
Another reason: the ledger's totals table does not have an extract structure. The status column shows whether a DataSource has already been assigned to the ledger; if so, the traffic light is green. Select the ledger to which you want to assign a DataSource and enter the name of the DataSource you want to assign. Naming convention: the name must start with the number 3; the system proposes a name with the prefix 3FI_SL_ followed by the name of the ledger. You cannot change the name once you have saved the assignment. Please note that ledgers are client-specific while DataSources are cross-client objects; within a client, a DataSource can only be assigned to one ledger. Result: the DataSource delivers transaction data from an FI-SL ledger at totals table level. The DataSource must be replicated before it can be used in BW. Communication structures are created for InfoSources and thus represent part of an application in the OLTP system. Transfer rules specify which fields in the transfer structure are to be transferred to which fields in the communication structure; you can create detailed conversion rules to change or add to the data. The data is then available on the BW server for data modeling, and is available for reporting.
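The extract-structure naming rule described earlier (a totals table name ending in T has the T replaced by B; otherwise a B is appended) can be captured as a small helper; the table names used here are made up:

```python
def extract_structure_name(totals_table: str) -> str:
    """Derive the transfer/extract structure name from a totals table name:
    a trailing 'T' becomes 'B'; otherwise 'B' is appended."""
    if totals_table.endswith("T"):
        return totals_table[:-1] + "B"
    return totals_table + "B"

print(extract_structure_name("ZZGLT"))  # ZZGLB  (trailing T replaced)
print(extract_structure_name("GLT0"))   # GLT0B  (B appended)
```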
Business Explorer (BEx): BEx consists of two applications: BEx Analyzer, which you use to create queries and reports, and BEx Browser, which you use to organize reports.
BEx requires the gateway service to communicate with the SAP application server. BEx Browser is essentially a workbook organizer, while BEx Analyzer displays query results. A BW query is a selection of characteristics and key figures for the analysis of the data in an InfoCube. A query refers to only one InfoCube, and its result is presented in a BEx Excel worksheet. The maximum number of characters allowed for the technical name is 30. A BW workbook is an Excel file with a BEx query result saved in the BDS. BW assigns a 25-digit ID to each workbook; users only need to name a workbook's title. Drill-down is a user navigation step intended to obtain further detailed information.
Queries, Reporting and Analysis
The data in the Business Information Warehouse is structured into self-contained business data areas (InfoProviders). By selecting and combining InfoObjects (characteristics and key figures) or reusable structures in a query, you determine the way in which you navigate through and evaluate the data in the selected InfoProvider. Analyzing data on the basis of multi-dimensional data sources (OLAP reporting) makes it possible to analyze several dimensions at the same time (for example time, location, product), and you have the option of carrying out any number of variance analyses (for example plan-actual comparison, fiscal year comparison). The data, displayed in the form of a pivot table, serves as the starting point for a detailed analysis answering a variety of questions. A large number of interaction options, such as sorting, filtering, swapping characteristics or recalculating values, allow flexible navigation through the data at runtime. You can also display data in graphics (for example bar charts or pie charts), and you can evaluate geographical data (for example characteristics such as customer, sales region, country) on a map. Furthermore, using exception reporting you can identify objects that deviate from the norm or are critical, send messages automatically about deviating values by email or SMS (through background processing in the Reporting Agent), or see the values at a glance in an alert monitor.
You analyze the dataset of the Business Information Warehouse by defining queries for InfoProviders in the BEx Query Designer. The most significant components of the query definition are the filter and the navigation. The selections in the filter have a limiting effect on the whole query: with the filter definition you select characteristic values from one or more characteristics or from a key figure, all of the InfoProvider data is aggregated using the filter selection of the query, and the filter selection cannot be changed by navigation. For the navigation you select user-defined characteristics and determine the content of the rows and columns of the query; you use this selection to determine the data areas of the InfoProvider over which you want to navigate. The arrangement of the contents of the rows and columns also determines the default view of the query and, with it, the row and column axes in the results area. After it is inserted into the Web browser, a query is displayed in its default initial view.
By navigating through the query, you can generate different views of the Info Provider data, by dragging one of the user-defined characteristics into the rows or columns of the query, for example, or by filtering a characteristic according to a single characteristic value. With the definition of a query, the Info Provider data can be evaluated specifically and quickly. The more precisely the query is defined, the quicker its execution and navigation.
BW Roles
Query User:
o Views/executes existing queries
o Can input variables as selection criteria
o Can re-order/change the sequence of characteristics/dimensions in local view mode
Query Developer:
o All capabilities of Query User
o Can create/save queries that can be used by everyone
BW Super User:
o All capabilities of Query User and Query Developer
o Info Cube Manager for selected Info Cubes
o Query management: oversight of Query Developers and developed queries
o Can create calculated and restricted key figures for selected Info Cubes
SAP BUSINESS INFORMATION WAREHOUSE
Data modeling and Reporting using Business Content and Enhancements
SAP provides Business Content, which is used in BW for reporting and analysis. Steps involved in using the Business Content and generating reports:
1. Log on to the SAP R/3 server.
2. Install the Business Content. The transaction code (T-Code) is RSA5.
3. Check whether the Business Content is ready for use. T-Code RSA6.
4. Enhance the structure. Here I am using BOM (Bill of Materials) from the table VBAP from Sales and Distribution (SD).
5. Activate the enhanced structure, and enhance the field BOM into it. It should be mapped to 0FI_GL_4 from VBAP when the condition is satisfied. A user exit is written for this purpose.
6. Include the following code in the user exit:

CASE i_datasource.
  WHEN '0FI_GL_4'.
    TABLES: AUFK. "Table inserted by Vijay
    TABLES: VBAP. "Table inserted by Vijay
    DATA: Temp_sorder LIKE DTFIGL_4.
    DATA: Temp_ZZSEQUENCENUMBER LIKE DTFIGL_4-ZZSEQUENCENUMBER.
    DATA: Temp_KDAUF LIKE DTFIGL_4-KDAUF.
    DATA: Temp_ZZORDERCAT LIKE DTFIGL_4-ZZORDERCAT.
    DATA: Temp_ZZBOM LIKE DTFIGL_4-ZZBOM.

    LOOP AT C_t_data INTO Temp_sorder.
      SELECT SINGLE SEQNR FROM AUFK INTO Temp_ZZSEQUENCENUMBER
        WHERE AUFNR = Temp_sorder-AUFNR AND BUKRS = Temp_sorder-BUKRS.
      SELECT SINGLE KDAUF FROM AUFK INTO Temp_KDAUF
        WHERE AUFNR = Temp_sorder-AUFNR AND BUKRS = Temp_sorder-BUKRS.
      SELECT SINGLE AUTYP FROM AUFK INTO Temp_ZZORDERCAT
        WHERE AUFNR = Temp_sorder-AUFNR AND BUKRS = Temp_sorder-BUKRS.
      Temp_sorder-ZZSEQUENCENUMBER = Temp_ZZSEQUENCENUMBER.
      Temp_sorder-KDAUF = Temp_KDAUF.
      Temp_sorder-ZZORDERCAT = Temp_ZZORDERCAT.
      SELECT SINGLE STLNR FROM VBAP INTO Temp_ZZBOM
        WHERE GSBER = Temp_sorder-GSBER AND AUFNR = Temp_sorder-AUFNR.
      Temp_sorder-ZZBOM = Temp_ZZBOM.
      MODIFY C_t_data FROM Temp_sorder.
    ENDLOOP.
ENDCASE.
7. Activate the user exit.
8. The Business Content has now been installed and activated. A field, Bill of Material (ZZBOM), from VBAP (SD) has been enhanced into the 0FI_GL_4 structure.
9. Log on to BW server. 10. Create an Info Object for the enhanced field ZZBOM.
12. Assign the Data Source. Map the fields manually: since ZZBOM is an enhancement, it should be mapped to the Info Object we created.
13. Create the ODS.
14. Assign the objects corresponding to the Key fields and Data fields.
15. Create the Info Source and InfoPackage. The data is extracted from SAP R/3 when we schedule the InfoPackage. The data is then loaded into the ODS, which is the data target for the InfoPackage.
17. The data is loaded into the ODS, where it can be monitored. The data load is successful.
18. Create the Info Cube.
19. Assign the corresponding Characteristics, Time Characteristics, and Key Figures.
20. Create the relevant Dimensions for the cube.
21. Activate the Info Cube and create update rules in order to load the data from the ODS.
22. Update the ODS; the Info Cube is loaded successfully. Manage the Info Cube to view the available data.
24. As per the requirement, the General Ledger Info Cube is created and ready for reporting. Create a new query using the Query Designer.
25. Query Designer for 0FI_GL_4.
26. General Ledger balance sheet for the periods 1995, 1996, and 1997.
The SAP Business Information Warehouse allows you to analyze data from operative SAP applications as well as from other business applications and external data sources such as databases, online services, and the Internet. The SAP Business Information Warehouse enables Online Analytical Processing (OLAP), which processes information from large amounts of operative and historical data. OLAP technology enables multi-dimensional analyses from various business perspectives. The Business Information Warehouse Server for core areas and processes, pre-configured with Business Content, enables you to view information from across the entire enterprise. For selected roles in a company, Business Content offers the information that employees need to carry out their tasks. As well as roles, Business Content contains other pre-configured objects such as Info Cubes, queries, key figures, and characteristics, which make BW implementation easier.
Basic SAP GUI Navigation
Data Warehousing Need, with an Example
SAP History
SAP BW vs Other Data Warehousing Tools
SAP BW vs SAP BI
SAP BW Terminology
o Master Data
o Transaction Data
o Landscape
Areas of SAP BW
o Extraction
o Modeling
o Reporting
Data Warehousing Categories
SAP BW & EDW
o 2-Dimensional Modeling
o Multi-Dimensional Modeling (Star Schema)
o SAP BW Star Schema (Extended Star Schema)
Administrator Workbench (AWB) {RSA1}
o Info Area
o Info Object Catalog
o Info Objects (Building Blocks): Characteristics, Key Figures
o Application Component Area
o Info Source, Types: Flexible, Direct
PSA (Persistent Staging Area)
Source System
Data Source
Info Package
Flat File Extraction (Master Data)
o Extract Structure
o Transfer Structure: Transfer Rules (Types)
o Communication Structure: Update Rules (Types)
o Data Targets: ODS; Info Cubes, Types; MultiProviders
Reporting (Week 3)
Business Explorer
o Bex Query Designer
o Bex Analyzer
o Restricted Key Figures
o Free Characteristics
o Filters
o Variables
o Calculated Key Figures
o Navigation Options
o Customer Exits
o SAP Exits
Extraction
o LO Cockpit (LBWE)
o Generic Extraction (Master Data Upload)
Web Application Designer
RRI (Jump Reports)
Solution Manager
Transportation
SAP BI (Business Intelligence) 7.0 Data Modeling (Week 4)
Data Warehouse Workbench
Types of Data
o Master Data
o Transactional Data
o Meta Data
Data Model
o BI Master Data Tables
o BI Transaction Data Tables
o Star Schema
o Extended Star Schema
o SID Tables
o Structure of Dimension Tables
o Dimension Modeling
BI Objects
o Info Area
o Info Object Catalog
o Characteristics / Attributes
o Attribute Types
o Time Characteristics
o Unit Characteristics
o Technical Characteristics
o Key Figures
o Info Object Definition
Info Object Catalog Definition: An Info Object catalog is a collection of Info Objects grouped according to application-specific criteria. There are two types of Info Object catalogs: Characteristic and Key figure. Use: An Info Object catalog is assigned to an Info Area. An Info Object catalog is an organizational aid and is not intended for data analysis purposes. For example, all the Info Objects that are used for data analysis in the area of Sales and Distribution can be grouped together in one Info Object catalog. This makes it much easier for you to handle what might turn out to be a very large number of Info Objects for any given context. An Info Object can be included in several Info Object catalogs. In Info Provider definition, you can select an Info Object catalog as a filter for the template. Info Object Definition: Business evaluation objects are known in BI as Info Objects. They are divided into o Characteristics (for example, customers), o Key figures (for example, revenue), o Units (for example, currency, amount unit), o Time characteristics (for example, fiscal year) and o Technical characteristics (for example, request number). Use: Info Objects are the smallest units of BI. Using Info Objects, information is mapped in a structured form. This is required for constructing Info Providers. Info Objects with attributes or texts can themselves also be Info Providers (if used in a query). Structure: Characteristics are sorting keys, such as company code, product, customer group, fiscal year, period, or region. They specify classification options for the dataset and are therefore reference objects for the key figures. In the Info Cube, for example, characteristics are stored in dimensions. These dimensions are linked by dimension IDs to the key figures in the fact table. The characteristics determine the granularity (the degree of detail) at which the key figures are kept in the Info Cube.
In general, an Info Provider contains only a subset of the characteristic values from the master data table. The master data includes the permitted values for a characteristic. These are known as the characteristic values. The key figures provide the values that are reported on in a query. Key figures can be quantity, amount, or number of items. They form the data part of an Info Provider. Units are also required so that the values for the key figures have meaning: key figures of type amount are always assigned a currency key, and key figures of type quantity also receive a unit of measurement. Time characteristics are characteristics such as date, fiscal year, and so on. Technical characteristics have only an organizational meaning within BI. An example of this is the request number in the Info Cube, which is obtained as an ID when loading requests. It helps you to find the request again. Special features of characteristics: If characteristics have attributes, texts, or hierarchies at their disposal, they are referred to as master data-bearing characteristics. Master data is data that remains unchanged over a long period of time. Master data contains information that is always needed in the same way. References to this master data can be made in all Info Providers. You also have the option of creating characteristics with references. A reference characteristic provides the attributes, master data, texts, hierarchies, data type, length, number and type of compounded characteristics, lowercase letters, and conversion routines for new characteristics.
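The point above that characteristics determine the granularity of the key figures can be sketched directly: aggregating the same records under fewer characteristics yields a coarser cube with fewer rows. The records and column layout are invented for illustration.

```python
# Sketch: characteristics chosen as the key determine the granularity at
# which a key figure is stored. Rows are (customer, month, amount);
# the data is hypothetical.
from collections import defaultdict

records = [
    ("C1", "2004-01", 10.0),
    ("C1", "2004-02", 20.0),
    ("C2", "2004-01", 5.0),
]

def aggregate(rows, keep):
    """Aggregate the key figure, keeping only the characteristic columns in `keep`."""
    cube = defaultdict(float)
    for row in rows:
        key = tuple(row[i] for i in keep)
        cube[key] += row[2]
    return dict(cube)

fine   = aggregate(records, keep=(0, 1))  # customer + month: full detail
coarse = aggregate(records, keep=(0,))    # customer only: coarser granularity
```

Dropping the month characteristic collapses C1's two monthly values into one total, which is exactly the loss of detail the text describes.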
A hierarchy is always created for a characteristic. This characteristic is the basic characteristic for the hierarchy (basic characteristics are characteristics that do not reference other characteristics). Like attributes, hierarchies provide a structure for the values of a characteristic. Company location is an example of an attribute for Customer. You use this, for example, to form customer groups for a specific region. You can also define a hierarchy to make the structure of the Customer characteristic clearer. Special features of key figures: A key figure is assigned additional properties that influence the way that data is loaded and how the query is displayed. These include the assignment of a currency or unit of measure, setting aggregation and exception aggregation, and specifying the number of decimal places in the query. Integration: Info Objects can be part of the following objects: 1. Component of an Info Source
An Info Source is a quantity of Info Objects that logically belong together and are updated in Info Providers. 2. Composition of an Info Provider:
An Info Provider consists of a number of Info Objects. In an Info Cube, the characteristics, units, and time characteristics form the basis of the key fields, and the key figures form the data part of the fact table of the Info Cube. In a Data Store object, characteristics generally form the key fields, but they can also be included in the data part, together with the key figures, units and time characteristics. 3. Attributes for Info Objects
We have different tabs in the characteristics screen. Tab Page: General. Use: On this tab page, you determine the basic properties of the characteristic. Structure Dictionary: Specify the data type and the data length. The system provides you with a list to choose from in the F4 Help. The following data types are supported for characteristics:
o CHAR: numbers and letters, character length 1 - 60
o NUMC: numbers only, character length 1 - 60
o DATS: date, character length 8
o TIMS: time, character length 6
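The data-type rules in the table above can be expressed as small validators. This is a conceptual sketch only; real BW performs these checks in the Data Dictionary, and the function names are invented.

```python
# Sketch of the characteristic data types listed above as validators:
# NUMC holds digits only, DATS is YYYYMMDD (8 chars), TIMS is HHMMSS (6 chars).
def valid_numc(value, length):
    """NUMC: exactly `length` characters, digits only (length 1-60 in BW)."""
    return len(value) == length and value.isdigit()

def valid_dats(value):
    """DATS: an 8-character internal date, YYYYMMDD."""
    return len(value) == 8 and value.isdigit()

def valid_tims(value):
    """TIMS: a 6-character internal time, HHMMSS."""
    return len(value) == 6 and value.isdigit()

print(valid_dats("19991231"))  # True
print(valid_numc("12A4", 4))   # False - NUMC allows digits only
```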
Miscellaneous: Lowercase letters allowed / not allowed. If this indicator is set, the system differentiates between lowercase and capital letters when you use a screen template to input values. If this indicator is not set, you have to enter all data in capital letters.
The different types of tabs in the characteristics screen are: General, Business Explorer, Hierarchy, Attributes.
Creating Info Objects: Key Figures
Procedure:
1. Using the context menu, select Create Info Object from your Info Object catalog for key figures.
2. Enter a name and a description.
3. Confirm your entries.
4. Save and activate the key figure you have created.
Key figures have to be activated before they can be used. We have three tabs in the key figures screen: Type/Unit, Aggregation, and Additional Properties.
If you created your key figure with a reference, you get the additional tab page Elimination. Data Type: Specify the data type. For amount, quantity, and number, you can choose between a decimal number and a floating point number. For the key figures date and time, you can choose the decimal display to apply to the fields. The following combinations of key figure and data type are possible:
o Amount (AMO): CURR - currency field, created as DEC; or FLTP - floating point number with 8-byte precision
o Quantity (QUA): QUAN - quantity field, created as DEC; or FLTP - floating point number with 8-byte precision
o Number (NUM): DEC - calculation or amount field with comma and +/- sign; or FLTP - floating point number with 8-byte precision
o Integer (INT): INT4 - 4-byte integer, whole number with +/- sign
o Date (DAT): DATS - date field (YYYYMMDD), created as char(8); or DEC - calculation or amount field with comma and +/- sign
o Time (TIM): TIMS - time field (hhmmss), created as char(6); or DEC - calculation or amount field with comma and +/- sign
Currency/Quantity Unit: You can assign a fixed currency to the key figure. If this field is filled, the key figure bears this currency throughout BW. You can also assign a variable currency to the key figure: in the field Unit/Currency, determine which InfoObject bears the key figure's unit. For quantity or amount key figures, either this field must be filled or you must enter a fixed currency or unit of measure.
Data Warehousing Workbench
Purpose: The Data Warehousing Workbench (DWB) is the central tool for performing tasks in the data warehousing process. It provides data modeling functions as well as functions for controlling, monitoring, and maintaining all the processes in SAP NetWeaver BI that are related to the procurement, retention, and processing of data.
Structure of the Data Warehousing Workbench The following figure shows the structure of the Data Warehousing Workbench:
Navigation Pane Showing Functional Areas of Data Warehousing Workbench: When you call the Data Warehousing Workbench, a navigation pane appears on the left-hand side of the screen. You open the individual functional areas of the Data Warehousing Workbench by choosing the pushbuttons in the navigation pane. The applications that are available in these areas are displayed in the navigation pane; in the modeling functional area, you see the possible views of the object trees. Object Trees or Applications in the Individual Functional Areas: If object trees or applications are assigned to a functional area in the navigation pane, you call them by clicking them once in the right-hand side of the screen. Application Toolbar: In all functional areas, the Data Warehousing Workbench toolbar contains a pushbutton for showing or hiding the navigation pane. It also contains pushbuttons that are relevant in the context of the individual functional areas and applications. Menu Bar: The functions that you can call from the menu bar of the Data Warehousing Workbench depend on the functional areas. Status Bar: The system displays information, warnings, and error messages in the status bar.
The different functional areas available in the Data Warehousing Workbench screen are:
o Modeling
o Monitoring
o Reporting Agent
o Transport Connection
o Documents
o BI Content
o Translation
o Metadata Repository
Persistent Staging Area. Purpose: The Persistent Staging Area (PSA) is the inbound storage area for data from the source systems in the BI system. The requested data is saved unchanged from the source system: request data is stored in transparent, relational database tables of the BI system, in the format of the DataSource (the transfer structure format). The data format remains unchanged, meaning that no summarization or transformations take place, as is the case with InfoCubes. When loading flat files, the data does not remain completely unchanged, since it is adjusted by conversion routines if necessary (for example, the date format 31.12.1999 is converted to 19991231 in order to ensure uniformity of data). Features: A transparent PSA table is created for every Data Source that is activated. The PSA tables each have the same structure as their respective Data Source. They are also flagged with key fields for the request ID, the data package number, and the data record number. Info Packages load the data from the source into the PSA. The data from the PSA is processed with data transfer processes. With the context menu entry Manage for a Data Source in the Data Warehousing Workbench, you can go to the PSA maintenance for data records of a request, or delete request data from the PSA table of this Data Source. You can also go to the PSA maintenance from the monitor for requests of the load process.
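The conversion-routine example above (an external date such as 31.12.1999 normalized to the internal format 19991231 during a flat-file load) is easy to sketch:

```python
# Sketch of a date conversion routine applied when loading a flat file
# into the PSA: DD.MM.YYYY is rewritten to the internal YYYYMMDD format.
def convert_date(external):
    day, month, year = external.split(".")
    return year + month + day

print(convert_date("31.12.1999"))  # 19991231
```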
Info Cubes: Info Cubes are multidimensional data storage containers for reporting and analysis of data. They consist of key figures and characteristics, the latter organized as dimensions, enabling users to analyze data from various business perspectives. Use: Info Cubes are filled with data from one or more Info Sources or other Info Providers. They are available as Info Providers for analysis and reporting purposes. Structure: The data is stored physically in an Info Cube. It consists of a number of Info Objects that are filled with data from staging. It has the structure of a star schema. Info Providers: Info Providers refer to all the data objects present in SAP BI systems; these include all the data targets, viz. Info Cubes, DSO objects, and master data tables/Info Objects, along with InfoSets, remote Info Cubes, and MultiProviders. Data Sources: Data Sources are flat data structures containing data that logically belongs together; they are responsible for extracting and staging data from various source systems. Use: Data Sources supply the metadata description of source data. The Info Package controls the transfer of data from the source to the entry layer of BI. The data transfer process controls the distribution of data within BI. The graphic illustrates a simplified example of a data update from the Data Source to an Info Provider. Data Transfer Process. Definition: An object that determines how data is transferred between two persistent objects. Use: You use the data transfer process (DTP) to transfer data within BI from one persistent object to another in accordance with certain transformations and filters. In this respect, it replaces the data mart interface and the Info Package. The Info Package only loads data to the entry layer of BI (PSA).
The data transfer process makes the transfer processes in the data warehousing layer more transparent. Optimized parallel processing improves the performance of the transfer process (the data transfer process determines the processing mode). You can use the data transfer process to separate delta processes for different targets and you can use filter options between the persistent objects on various levels. For example, you can use filters between a Data Store object and an Info Cube. Data transfer processes are used for standard data transfer, for real-time data acquisition, and for accessing data directly. Integration: The following graphic illustrates how the data transfer process is positioned in relation to the objects in the dataflow of BI and the other objects in BI process control: Features: With a data transfer process, you can transfer data either in full extraction mode or in delta mode. In full mode, the entire dataset of the source is transferred to the target; in delta mode, only the data that was posted to the source since the last data transfer is transferred. The data transfer process controls delta handling and therefore allows you to fill several targets with different deltas from one source. The data transfer process supports you in handling data records with errors. The data transfer process also supports error handling for DataStore objects. When you define the data transfer process, you can determine how the system responds to errors. At runtime, the incorrect data records are sorted and written to an error stack (request-based database table). A special error DTP further updates the data records from the error stack into the target. It is easier to restart failed load processes if the data is written to a temporary storage after each processing step. It also allows you to find records that have errors. 
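The distinction above between full and delta extraction mode can be sketched with a simple last-loaded pointer. The timestamp bookkeeping here is an invented simplification of BW's delta handling, for illustration only.

```python
# Sketch of DTP extraction modes: full mode transfers the entire source
# dataset; delta mode transfers only records posted since the last
# transfer. The "ts" field and data are hypothetical.
source = [
    {"id": 1, "ts": 100, "amount": 10.0},
    {"id": 2, "ts": 200, "amount": 20.0},
    {"id": 3, "ts": 300, "amount": 30.0},
]

def extract(rows, mode, last_ts=0):
    """Return the records to transfer for the given mode."""
    if mode == "full":
        return list(rows)           # whole dataset
    return [r for r in rows if r["ts"] > last_ts]   # delta since last load

full  = extract(source, "full")        # all 3 records
delta = extract(source, "delta", 200)  # only the record posted after ts 200
```

Because the delta depends only on the pointer, one source can feed several targets with different deltas, as the text notes.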
In the monitor for the data transfer process request, or in the temporary storage for the processing step (if filled), you can display the data records in the error stack. In data transfer process maintenance, you determine the processing steps after which you want to store data temporarily. After creation, a DTP's technical name starts with DTP_ followed by a generated ID. Other simple definitions. Info Area: An element for grouping meta-objects in the BI system. Each InfoProvider is assigned to an InfoArea. The resulting hierarchy is displayed in the Data Warehousing Workbench. In addition to their properties as InfoProviders, InfoObjects can also be assigned to different Info Areas. An Info Area is a top-level object, which contains the data models in it. In general, Info Areas are used to organize InfoCubes and InfoObjects. Each InfoCube is assigned to an Info Area; through an InfoObject catalog, each Info Object is assigned to an Info Area as well. The Info Area contains all objects used to evaluate a business logical process.
Info Object Catalogs: An Info Object catalog is a collection of InfoObjects grouped according to application-specific criteria. There are two types of InfoObject catalogs: Characteristic and Key figure. Info Objects (characteristics and key figures) are the basic data model of the SAP Business Information Warehouse (BW/BI). Info Objects are stored in folders called InfoObject catalogs, and the InfoObject catalogs are in turn stored in folders called Info Areas. An Info Object catalog is assigned to an Info Area. An Info Object can be included in several Info Object catalogs.
Info Objects: Business evaluation objects are known in BI/BW as InfoObjects. They are divided into characteristics (for example, customers), key figures (for example, revenue), units (for example, currency, amount unit), time characteristics (for example, fiscal year), and technical characteristics (for example, request number). Info Objects are the smallest information units in BI/BW. They structure the information needed to create the data targets. Info Objects with attributes or texts can be either a pure data target or an Info Provider (if it is being reported on).
Application Component: Application Components are used to organize Data Sources. They are analogous to Info Areas. Info Package: An InfoPackage specifies when and how to load data from a given source system. BW generates a 30-digit code starting with ZPAK as an InfoPackage's technical name. PSA: The Persistent Staging Area is a data staging area in BW. It allows us to check data in an intermediate location before the data is sent to its destinations in BW. The PSA stores data in its original source system format. In this way, it gives us a chance to examine and analyse the data before we process it to further destinations. It typically serves as a temporary storage area, depending on the client's data specifications and settings.
Step-by-Step Guide
1. Create an Info Area: Click on Info Objects in the Modeling tab, then right-click on the root node on the right-hand screen and select Create Info Area from the context menu. Give a technical name and description and click the Continue button. The new Info Area is created at the bottom of the root.
2. Create an Info Object catalog: Right-click on your Info Area and select Create Info Object Catalog from the context menu. Give the technical name and description, select the Char or Key Figures radio button on the left side of the screen, and click the Create icon. This navigates you to another screen where you check and activate the object you have created.
3. Create characteristic Info Objects: Right-click on the Info Object catalog and select Create Info Object from the context menu. Give the technical name and description, then click the Continue button. The system navigates you to another screen, where you enter the short and long descriptions, choose a data type such as CHAR, NUMC, DATS, or TIMS according to your requirement, and check the Lowercase checkbox if your data contains lowercase letters; otherwise leave it unchecked. We have different types of tabs in this screen: General, Business Explorer, Master Data, Attributes, Hierarchy, Compounding.
4. Similarly, create key figures: Right-click on your key figure Info Object catalog and select Create Info Object from the context menu.
Give the technical name and description, then click the Continue button. The system navigates you to another screen, where you enter the short and long descriptions and then select a data type such as AMOUNT, QUANTITY, NUMBER, INTEGER, DATE, or TIME. Don't forget to give the base currency or unit of measure, such as INR or USD. In this screen we have different types of tabs: Type/Unit, Aggregation, Additional Properties. Follow the above steps to create more characteristics and key figures.
5. Create a Data Source: Click on Data Sources in the Modeling tab, right-click on the root node of Data Sources, and select Create Application Component. Give the technical name and description of the application component; the newly created application component appears at the bottom of the Data Source screen. Now create the Data Source: right-click on your application component, select Create Data Source from the context menu, and give the technical name and description of the Data Source. The system then navigates to another screen with different tabs: General Info, Extraction, Proposal, Fields, Preview.
In the General Info tab, enter the short, medium, and long descriptions and don't change any other details. Now go to the Extraction tab and select the file name by browsing to your flat file, i.e. your CSV (comma-separated values) file. Enter a value in the Header Rows to be Ignored text box, select the data format, give "," as the data separator and ";" as the escape sign. Then click on the Fields tab, enter all your Info Objects (both characteristics and key figures), and press Enter. SAP will prompt you with a message asking whether to copy all the details of the Info Objects you entered in the Fields tab; just click the Copy button and you will get all your Info Objects. Now go to the Preview tab and click the Read Data button. Before that, please prepare your flat file: enter your values in an Excel sheet and save the file in CSV format.
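The flat-file settings above (a "," separator and a number of header rows to ignore) can be sketched as a small CSV reader. The file contents and field names below are hypothetical, not from the walkthrough.

```python
# Sketch of the flat-file load configured above: read CSV text with the
# given separator and skip the configured number of header rows, as in
# the Data Source's Extraction tab. Data is hypothetical.
import csv
import io

flat_file = "MATERIAL,QUANTITY\nM-01,10\nM-02,25\n"

def load_csv(text, header_rows_to_ignore=1, separator=","):
    """Parse the flat file into records, skipping header rows."""
    reader = csv.reader(io.StringIO(text), delimiter=separator)
    rows = list(reader)[header_rows_to_ignore:]
    return [{"material": r[0], "quantity": int(r[1])} for r in rows]

print(load_csv(flat_file))  # two records; the header row is skipped
```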
6. Create an InfoPackage: Right-click on your Data Source and select Create InfoPackage from the context menu, and enter the details on the next screen. SAP then navigates you to a screen with several tabs: Data Selection, External Data, Processing, Update, Schedule.
In the External Data tab you will see the details of the flat file that you entered in the Data Source's Extraction tab. In the Processing tab, only the PSA option is selectable. In the Update tab, only the Full Update option is available and is selected by default. In the Schedule tab we have two options, Start Data Load Immediately and Start Later in Background; select Start Data Load Immediately and click the Start button. The data is now requested from the source. To check whether the data was requested, click the Monitor icon: the request status is green if successful, and red if something went wrong. Now click the PSA Maintenance icon, select the number of records you want to display (you can restrict the number by entering values), and click Continue. SAP displays the data you have loaded; remember, so far we have loaded data only as far as the PSA.
7. Create Info Providers: Click on Info Providers in the Modeling tab, right-click on the Info Area you created in step 1 (it appears at the bottom of the right-hand screen), and select Create Info Cube from the context menu.
On the next screen, click the Info Object catalog icon and select your Info Object catalog from the list of catalogs. All your characteristic Info Objects appear in a list; select them and drop them into the Dimension 1 folder. If you want, you can also create new dimensions; it depends on your client's requirements. Once this step is completed, expand your key figure catalog, select your key figures, and drag and drop them into the Key Figures folder. Now check and activate your Info Cube. In the next step, create transformations: right-click on your Info Cube and select Create Transformation from the context menu. SAP navigates you to the next screen, where you check your mappings; then check and activate your transformation. Next, create a DTP (Data Transfer Process). Again you have tabs here: make the necessary selections, make sure you have entered all details, click the Save button, and finally click the Execute button. This loads data from the PSA to the Info Cube. To check whether the data was loaded into the Info Cube, right-click on the Info Cube and select Manage. There you again have different tabs:
i. Contents
ii. Performance
iii. Requests
iv. Roll up
v. Collapse
vi. Reconstruction
Click on the Requests tab and make sure your requests have green status and that the Request for Reporting Available icon is shown. If everything is correct, click on the Contents tab and select Info Cube Contents. SAP takes you to another screen, where you click the Fld Selection for Output button and check the objects required in the output. After the selection, click Execute, then click Execute again; now you can view the data you loaded into the Info Cube. So far, you have loaded data into the PSA with the help of the InfoPackage, and from the PSA into the Info Cube with the help of the Data Transfer Process (DTP). To view the flow, right-click on the Info Cube and select Display Data Flow; choose the Downwards or Upwards option, whichever you want to view, and click Continue. SAP then displays the data flow.
Transformations Use: The transformation process allows you to consolidate, cleanse, and integrate data. You can semantically synchronize data from heterogeneous sources. When you load data from one BI object into a further BI object, the data is passed through a transformation. A transformation converts the fields of the source into the format of the target. Features: You create a transformation between a source and a target. The BI objects DataSource, InfoSource, DataStore object, InfoCube and InfoSet serve as source objects. The BI objects InfoSource, InfoObject, DataStore object and InfoCube serve as target objects. The following graphic illustrates how the transformation is integrated in the dataflow:
A transformation consists of at least one transformation rule. Various rule types, transformation types, and routine types are available, allowing you to create anything from very simple to highly complex transformations:
Transformation rules: Transformation rules map any number of source fields to at least one field in the target. You can use different rule types for this.
Rule type: A rule type is a specific operation that is applied to the relevant fields using a transformation rule.
Transformation type: The transformation type determines how data is written into the fields of the target.
Transformation group: A transformation group is a group of transformation rules, containing one transformation rule for each key field of the target. A transformation can contain multiple transformation groups, which allow you to combine various rules: you can create different rules for different key figures of a characteristic. For example, the source contains three date characteristics (order date, delivery date, invoice date), while the target contains only one general date characteristic, which is filled from the different date characteristics in the source depending on the key figure. You create three transformation groups which, depending on the key figure, update the order date, delivery date, or invoice date to the target.
Routine: You use routines to implement complex transformation rules yourself.
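The transformation-group example above can be sketched in plain Python (not ABAP; field and key-figure names here are hypothetical): one group per key figure, each filling the target's single date characteristic from a different source date field via direct assignment.

```python
# Illustrative sketch of transformation groups, assuming a source with three
# date characteristics and a target with one generic date characteristic.

SOURCE_ROW = {
    "ORDER_DATE": "2024-01-05",
    "DELIVERY_DATE": "2024-01-12",
    "INVOICE_DATE": "2024-01-20",
    "ORDER_QTY": 10,
    "DELIVERED_QTY": 8,
    "INVOICED_AMT": 950.0,
}

# One "transformation group" per key figure: each group carries its own rule
# for the target key field 0DATE (rule type: direct assignment from a
# different source field).
TRANSFORMATION_GROUPS = {
    "ORDER_QTY": "ORDER_DATE",
    "DELIVERED_QTY": "DELIVERY_DATE",
    "INVOICED_AMT": "INVOICE_DATE",
}

def transform(row):
    """Apply each transformation group, producing one target record per key figure."""
    target_rows = []
    for key_figure, date_field in TRANSFORMATION_GROUPS.items():
        target_rows.append({
            "0DATE": row[date_field],   # direct assignment from the group's date field
            "KEY_FIGURE": key_figure,
            "VALUE": row[key_figure],
        })
    return target_rows

for rec in transform(SOURCE_ROW):
    print(rec)
```

Each source row yields three target records, one per key figure, each stamped with the date appropriate to that key figure.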
MultiProviders
Definition: A MultiProvider is a type of InfoProvider that combines data from a number of InfoProviders and makes it available for reporting purposes. The MultiProvider does not itself contain any data; its data comes entirely from the InfoProviders on which it is based. These InfoProviders are connected to one another by a union operation.
Use: A MultiProvider allows you to run reports using several InfoProviders.
InfoCube and InfoCube: You have an InfoProvider with actual data for a logically closed business area and an equivalent InfoProvider with planned data. You can combine the two InfoProviders into one MultiProvider so that you can compare the actual data with the planned data in a query. In BW release 2.0B/2.1C, this combination of two InfoCubes was called a MultiCube.
InfoCube and InfoObject: You have an InfoCube with your products and sales. You combine this InfoCube with the 0MATERIAL InfoObject. This allows you to display any "slow-moving items", since products that do not result in sales are also displayed. For a detailed description of the procedure, see the Slow-Moving Items scenario.
Structure: A MultiProvider can consist of different combinations of the following InfoProviders: InfoCube, DataStore object, InfoObject, InfoSet, VirtualProvider, and aggregation level.
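The actual-versus-plan use case can be sketched in plain Python (names and numbers are hypothetical): the MultiProvider stores nothing itself, it just forms the union of the records of its InfoProviders, and a query then aggregates by characteristic to put actual and plan side by side.

```python
# Hypothetical actual and plan InfoCubes, each keyed by material.
actual_cube = [
    {"0MATERIAL": "M1", "ACTUAL_SALES": 120},
    {"0MATERIAL": "M2", "ACTUAL_SALES": 80},
]
plan_cube = [
    {"0MATERIAL": "M1", "PLAN_SALES": 100},
    {"0MATERIAL": "M2", "PLAN_SALES": 90},
]

def multiprovider_union(*providers):
    """Union: every record of every InfoProvider is passed on; the
    MultiProvider itself holds no data of its own."""
    rows = []
    for provider in providers:
        for rec in provider:
            # A key figure a provider does not supply stays empty (0 here).
            rows.append({"0MATERIAL": rec["0MATERIAL"],
                         "ACTUAL_SALES": rec.get("ACTUAL_SALES", 0),
                         "PLAN_SALES": rec.get("PLAN_SALES", 0)})
    return rows

# A query on the MultiProvider aggregates by characteristic,
# so actual and plan appear in one result row per material.
by_material = {}
for rec in multiprovider_union(actual_cube, plan_cube):
    agg = by_material.setdefault(rec["0MATERIAL"],
                                 {"ACTUAL_SALES": 0, "PLAN_SALES": 0})
    agg["ACTUAL_SALES"] += rec["ACTUAL_SALES"]
    agg["PLAN_SALES"] += rec["PLAN_SALES"]

print(by_material)
```

The aggregation step is what the query does at report time; the union itself never merges records across providers.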
A union operation is used to combine the data from these objects in the MultiProvider: the system constructs the union of the data sets involved, so all values of these data sets are combined. By comparison, InfoSets are created using joins. Joins only combine values that appear in both tables; in contrast to a union, joins form the intersection of the tables (see InfoSet).
In a MultiProvider, every characteristic in each of the InfoProviders involved must correspond to exactly one characteristic or navigation attribute (where available). If the assignment is not unique, you have to specify at MultiProvider definition time which InfoObject the characteristic of the MultiProvider is assigned to. For example, the MultiProvider contains the characteristic 0COUNTRY, and an InfoProvider contains both the characteristic 0COUNTRY and the navigation attribute 0CUSTOMER__0COUNTRY; in this case, select exactly one of these InfoObjects in the assignment table.
A key figure contained in a MultiProvider must be selected from at least one of the InfoProviders involved. In general, exactly one InfoProvider provides the key figure, although there are cases where it is better to select it from more than one InfoProvider.
InfoSet
Definition: An InfoSet is a specific kind of InfoProvider: it describes data sources that are defined, as a rule, as joins of DataStore objects, standard InfoCubes, and/or InfoObjects (characteristics with master data). A time-dependent join, or temporal join, is a join that contains an InfoObject that is a time-dependent characteristic. An InfoSet is a semantic layer over the data sources.
Use: InfoSets allow you to report on several InfoProviders by using combinations of master-data-bearing characteristics, InfoCubes, and DataStore objects. The information is collected from the tables of the relevant InfoProviders. When an InfoSet is made up of several characteristics, you can map transitive attributes and report on this master data. For example, you create an InfoSet using characteristics such as Business Partner (0BPARTNER), Vendor (0VENDOR), and Business Name (0DBBUSNAME), and can then use the master data for reporting.
Structure: You can include every DataStore object, every InfoCube, and every InfoObject of the type Characteristic with Master Data in a join. A join can contain objects of the same object type or objects of different object types, and the individual objects can appear in a join any number of times. Join conditions (equal join conditions) connect the objects in a join to one another; a join condition determines the combination of records from the individual objects that are included in the resulting set.
Integration: InfoSet Maintenance in the Administrator Workbench.
VirtualProviders
Definition: VirtualProviders are InfoProviders with transaction data that is not stored in the object itself but is read directly at reporting time. The relevant data can come from the BI system or from other SAP or non-SAP systems. VirtualProviders only allow read access to data.
DataStore Objects
Definition: A DataStore object serves as a storage location for consolidated and cleansed transaction data. This data can be evaluated using a BEx query. A DataStore object contains key fields (for example, document number/item) and data fields that can contain key figures as well as character fields (for example, order status, customer). The data from a DataStore object can be updated with a delta update into InfoCubes and/or other DataStore objects or master data tables (attributes or texts) in the same system or across different systems. Unlike multidimensional data storage with InfoCubes, the data in DataStore objects is stored in transparent, flat database tables; no fact tables or dimension tables are created.
Use: The cumulative update of key figures is supported for DataStore objects, just as it is with InfoCubes, but with DataStore objects it is also possible to overwrite data fields. This is particularly important for document-related structures. If documents are changed in the source system, these changes include both numeric fields, such as the order quantity, and non-numeric fields, such as the ship-to party, status, and delivery date. To map these changes in the BI system, the relevant fields in the DataStore objects must be overwritten and set to the current value. Furthermore, a source can be made delta-capable by overwriting and using the existing change log: the delta that is updated further into the InfoCubes, for example, is calculated from two successive after-images. There are different DataStore object types:
Standard (see Standard DataStore Objects)
For direct update (see DataStore Objects for Direct Update)
Write-optimized (see Write-Optimized DataStore Objects)
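A minimal Python sketch (field and document names are hypothetical, not SAP APIs) of the two update modes for data fields, and of how a delta can be derived from two successive after-images of a changed document:

```python
def apply_update(active, record, mode):
    """Update the table of active data, keyed by document number and item."""
    key = (record["DOC_NO"], record["ITEM"])
    if key not in active:
        active[key] = dict(record)
    elif mode == "overwrite":
        active[key].update(record)           # non-key fields are replaced
    elif mode == "addition":
        for field, value in record.items():  # key figures are cumulated
            if field not in ("DOC_NO", "ITEM") and isinstance(value, (int, float)):
                active[key][field] += value

active = {}
# Initial load of a document item.
apply_update(active, {"DOC_NO": "4711", "ITEM": 10, "QTY": 5, "STATUS": "open"}, "overwrite")
# The document is changed in the source system: quantity and status change,
# so the fields must be overwritten and set to the current values.
apply_update(active, {"DOC_NO": "4711", "ITEM": 10, "QTY": 7, "STATUS": "shipped"}, "overwrite")

# Change-log idea: the delta that is updated further into InfoCubes is the
# difference between the two successive after-images of the key figure.
delta_qty = 7 - 5
print(active[("4711", 10)], "delta QTY:", delta_qty)
```

With "addition", the second record's QTY would have been added (giving 12) rather than replacing the old value; overwrite is what makes non-numeric fields like the status trackable.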
Differences between the DataStore Object Types:

Type               Structure                                                         Data Supply
Standard           Three tables: activation queue, table of active data, change log  From data transfer process
Write-optimized    Table of active data only                                         From data transfer process
For direct update  Table of active data only                                         From APIs

SID generation: Yes for standard DataStore objects; No for write-optimized DataStore objects and DataStore objects for direct update.
Integration: Integration with the Data Warehousing Workbench - Modeling.
Metadata: DataStore objects are fully integrated with BI metadata. They are transported just like InfoCubes and installed from BI Content (for more information, see Installing Business Content). DataStore objects are grouped with the InfoCubes in the InfoProvider view of the Data Warehousing Workbench - Modeling and are displayed in a tree; they also appear in the data flow display.
Update: The transformation rules define the rules that are used to write to a DataStore object. They are very similar to the transformation rules for InfoCubes; the main difference is the behavior of data fields in the update. When you update requests in a DataStore object, you have an overwrite option as well as an addition option (see also Transformation Type). The delta process, which is defined for the DataSource, also influences the update. When loading files, the user must select a suitable delta method so that the correct transformation type is used.
Scheduling and Monitoring: The processes for scheduling the data transfer process for updating into InfoCubes and DataStore objects are identical. Error handling is an exception: it cannot be used with DataStore objects. It is also possible to schedule the activation of DataStore object data and the update from the DataStore object into the related InfoCubes or DataStore objects. The individual steps, including the processing of the DataStore object, are logged in the Monitor. There is a separate detailed monitor for executed request operations (such as activation or rollback).
Loadable DataSources: In full-update mode, every transaction-data DataSource contained in a DataStore object is updated. In delta-update mode, only those DataSources that are flagged as DataStore-delta-compatible are updated.
FI:
Master data
SKA1   Accounts
BNKA   Bank master record
Accounting documents / indices
BKPF   Accounting documents
BSEG   Item level
BSID   Accounting: Secondary index for customers
BSIK   Accounting: Secondary index for vendors
BSIM   Secondary index, documents for material
BSIP   Index for vendor validation of double documents
BSIS   Accounting: Secondary index for G/L accounts
BSAD   Accounting: Index for customers (cleared items)
BSAK   Accounting: Index for vendors (cleared items)
BSAS   Accounting: Index for G/L accounts (cleared items)
Payment run
REGUH  Settlement data from payment program
REGUP  Processed items from payment program

CO:
TKA01  Controlling areas
TKA02  Controlling area assignment
KEKO   Product-costing header
KEPH   Cost components for cost of goods manufactured
KALO   Costing objects
KANZ   Sales order items - costing objects
Cost center master data
CSKS   Cost center master data
CSKT   Cost center texts
CRCO   Assignment of work center to cost center
Cost center accounting
COSP   CO object: Cost totals for external postings
COEP   CO object: Line items (by period)
COBK   CO object: Document header