Dylan's BI Notes
My notes about Business Intelligence, Data Warehousing, OLAP, and Master Data Management
I wrote the article RFM Analysis earlier. We recently posted a more detailed description of how Oracle BI Apps implements this concept in the product: Customer RFM Analysis. RFM-related customer attributes are good examples of the aggregated performance metrics described in this design tip from the Kimball Group: Design Tip #53: Dimension Embellishments. By putting this […]
I believe that conformed dimensions play a key role in data mining. Here is why: a conformed dimension can bring data together from different subject areas and, sometimes, from different source systems. The relevant data can thus be brought together. Data mining is a technique for finding patterns in historical data. […]
1. Market Segmentation in the academic world Market segmentation is part of the marketing process. It is described in Philip Kotler’s book as part of the step of defining the market strategy. The idea is to segment the consumer market by some variables and divide the market into different segments. Selecting the segments for your products is the result of the […]
Here are the various ways that we can use data mining inside BI. We can build advanced analytics applications. The scoring function can be called within an opaque view or with the EVALUATE function. The opaque view method may provide better flexibility, since multiple columns can be exposed. Here is an old Oracle white paper […]
Here is how I view data mining: the goal is to build a scoring engine. It accepts an input and produces an output. The development process can be separated into Requirement, Design, Coding, and Deploy phases, similar to typical software development.
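The "scoring engine accepts an input and produces an output" idea can be sketched as a thin wrapper around whatever function the training phase produced. This is a minimal illustration, not the BI Apps implementation; the class and field names here are hypothetical.

```python
# Minimal sketch of a scoring engine: it accepts an input record and
# produces a score. The actual scoring function is pluggable, so the
# Deploy phase just wires in whatever the training phase produced.
from typing import Callable, Dict


class ScoringEngine:
    """Wraps a trained scoring function behind a stable interface."""

    def __init__(self, score_fn: Callable[[Dict], float]):
        self.score_fn = score_fn

    def score(self, record: Dict) -> float:
        return self.score_fn(record)


# Hypothetical deployed model: flag customers with many returns.
engine = ScoringEngine(lambda r: 0.8 if r["num_returns"] > 3 else 0.2)
print(engine.score({"num_returns": 5}))  # → 0.8
```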
1. Use the data warehouse data as the training set Data mining requires training data to train the learning algorithm. The data warehousing processes provide the following services: Consolidate the data from different sources. Aggregate the data: for example, we have the order return transactions, but the training data can be the # of returns […]
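The aggregation step mentioned above, turning order return transactions into a "# of returns" attribute per customer, can be sketched in a few lines. The sample records are invented for illustration; a warehouse ETL would do the same grouping in SQL.

```python
# Sketch: derive a training attribute (# of returns per customer)
# from raw return transactions, as a warehouse aggregation would.
from collections import Counter

return_txns = [
    {"customer": "A", "order": 101},
    {"customer": "A", "order": 102},
    {"customer": "B", "order": 103},
]

# Count return transactions per customer.
num_returns = Counter(t["customer"] for t in return_txns)
print(dict(num_returns))  # → {'A': 2, 'B': 1}
```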
These videos show how the data are loaded into BI Cloud Service and modeled as dimensions and facts. We do not need to use the BI Admin Tool to create the model; for BICS, we can create the model in the browser.
Moved the content into a page – Dimensional Modeling
A typical data warehousing ETL process involves Extract, Transform, and Load. The concepts of Source Dependent Extract (SDE) and Source Independent Load (SIL) are a unique part of the BI Apps ETL, since BI Apps has a universal data warehouse. Since the staging schema is designed according to the universal data warehouse design, the logic of loading data […]
The BI Apps data warehouse design is based on the assumption that the data warehouse schema design is independent of the OLTP system. The staging schema is a universal staging schema and the data warehouse is a universal data warehouse. The assumption is that no matter what source system you are using, the business questions the BI […]
I think that it would be easier if cloud applications were aware of data integration needs and published their interfaces proactively. Here are some basic requirements for an application to be considered data integration friendly: 1. Publish the object data model. This is required for source analysis. For example, here is […]
I found that cloud-to-cloud ETL differs from regular ETL running in an on-premise environment in several ways: 1. Direct database access is not available 2. Complex / nested objects 3. Paging 4. Volume restriction 5. File-based approach 6. Data streaming 7. Data security / protection 8. Change data capture […]
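One of the differences listed above, paging, can be sketched as an extract loop: since direct database access is unavailable, the extractor calls the cloud API page by page until no next-page token remains. `fetch_page` here is a hypothetical stand-in for a real cloud API call.

```python
# Sketch of a paged extract: cloud APIs return data in pages, so the
# extract loops on a next-page token instead of running one big query.
def fetch_page(token):
    """Hypothetical cloud API call: returns (records, next_token)."""
    pages = {None: (["r1", "r2"], "t1"), "t1": (["r3"], None)}
    return pages[token]


def extract_all():
    records, token = [], None
    while True:
        batch, token = fetch_page(token)
        records.extend(batch)
        if token is None:  # no more pages
            return records


print(extract_all())  # → ['r1', 'r2', 'r3']
```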