Feed aggregator

Hierarchical query with count of leaf attributes

Tom Kyte - Fri, 2017-10-13 17:26
Hello Experts. I want to calculate the sum of the count of the leaves' attributes in a hierarchical query <code> create table hq_test (parent_id NUMBER, child_id NUMBER); INSERT INTO hq_test (parent_id, child_id) VALUES (25,26); INSERT INTO hq...
Categories: DBA Blogs

Bank statement reconciliation

OracleApps Epicenter - Fri, 2017-10-13 11:41
Reconciliation provides a means of determining that the balance on the bank accounts in Oracle Financials is equivalent to the balance shown on the bank statements. The high points of the reconciliation activity in Cash Management include: all transactions processed via Oracle Accounts Payable, General Ledger and Oracle Accounts Receivable are reconciled against transactions on […]
Categories: APPS Blogs

Vertex and Oracle Cloud

OracleApps Epicenter - Fri, 2017-10-13 09:28
Oracle ERP Cloud Transaction Tax Automation consists of: comprehensive tax partner offerings; a robust infrastructure for tax partner solutions; Tax Content and Tax Reporting services, providing a fast-ROI option for mid-market companies adopting Oracle ERP Cloud; and integrated cloud-to-cloud tax calculation services for enterprise-grade companies. The infrastructure has been optimized for ERP Cloud customers using end-to-end […]
Categories: APPS Blogs

Switchover and Switchback simplified in Oracle 12c

Syed Jaffar - Fri, 2017-10-13 07:51

Business continuity (disaster recovery) has become a critical factor for every business, especially in the financial sector. Most banks tend to run regular DR tests to meet central bank regulations on DR testing capabilities.

Very recently, one of our clients requested a reverse replication and rollback (i.e., switchover and switchback) between the HO and DR sites for one of their business-critical databases. Similar activities were performed with ease on pre-12c databases; however, this was my first experience with Oracle 12c. After spending a bit of time exploring what's new in 12c switchover, it was amazing to learn how much 12c simplified the procedure. So, I decided to write a post on my experience.

This post demonstrates how the Switchover and Switchback procedure is simplified in Oracle 12c.

The following is used in the scenario:

·        A two-instance Oracle 12c RAC primary database (IMMPRD)
·        A single-instance Oracle 12c standby database (IMMSDB)

Look at the current status of both databases:
-- Primary
IMMPRD> select status,instance_name,database_role from v$database,v$instance;

STATUS       INSTANCE_NAME    DATABASE_ROLE
------------ ---------------- ----------------
OPEN         IMMPRD1          PRIMARY

-- Standby
IMMSDB> select status,instance_name,database_role from v$database,v$instance;

STATUS       INSTANCE_NAME    DATABASE_ROLE
------------ ---------------- ----------------

Before getting into the real action, validate the following to avoid any failures during the course of role transition:

·        Ensure LOG_ARCHIVE_DEST_2 is configured on the PRIMARY and STANDBY databases
·        Ensure the Media Recovery Process (MRP) is active on the STANDBY and in sync with the PRIMARY database
·        Create STANDBY REDO logs on the PRIMARY, if they don't already exist
·        Ensure the FAL_CLIENT & FAL_SERVER parameters are set on both databases
·        Verify TEMP tablespaces on the STANDBY and add tempfiles if required, as TEMPFILES created after the STANDBY was built won't be propagated to the STANDBY site.
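These checks can be scripted. Below is a minimal sketch of the readiness queries; the object names follow this scenario, and you should adjust thread counts and destinations for your own environment:

```sql
-- On PRIMARY and STANDBY: confirm redo shipping and FAL parameters
show parameter log_archive_dest_2
show parameter fal_server
show parameter fal_client

-- On STANDBY: confirm MRP is running
select process, status, thread#, sequence#
from   v$managed_standby
where  process like 'MRP%';

-- On PRIMARY: confirm standby redo logs exist (recommended: one more
-- group per thread than the online redo logs)
select thread#, group#, bytes/1024/1024 mb, status from v$standby_log;

-- On STANDBY: confirm tempfiles are present for each TEMP tablespace
select ts.name tablespace_name, tf.name tempfile_name
from   v$tablespace ts, v$tempfile tf
where  ts.ts# = tf.ts#;
```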

Pre-Switchover in 12c

For a smooth role transition, it is important to have everything in place and in sync. Before Oracle 12c, a set of commands had to be run on the PRIMARY and STANDBY to validate the readiness of the systems. With Oracle 12c, this is simplified by the ALTER DATABASE SWITCHOVER VERIFY command, which performs the following actions:

·        Verifies the minimum Oracle version, i.e., Oracle 12.1
·        Verifies the MRP status on the STANDBY database

Let’s run the command on the primary database to validate if the environments are ready for the role transition.

IMMPRD>  alter database switchover to IMMSDB verify;
 alter database switchover to IMMSDB verify
ERROR at line 1:
ORA-16475: succeeded with warnings, check alert log for more details

When the command was executed, an ORA-16475 error was encountered. For more details, let's walk through the alert.log files of the PRIMARY and STANDBY databases, paying attention to the SWITCHOVER VERIFY WARNING.

--primary database alert.log

Fri Oct 13 11:16:00 2017
SWITCHOVER VERIFY: Send VERIFY request to switchover target IMMSDB
SWITCHOVER VERIFY WARNING: switchover target has no standby database defined in LOG_ARCHIVE_DEST_n parameter. If the switchover target is converted to a primary database, the new primary database will not be protected.

ORA-16475 signalled during:  alter database switchover to IMMSDB verify...

The LOG_ARCHIVE_DEST_2 parameter was not set on the STANDBY database, which caused the VERIFY command to produce the warning. After setting the parameter on the STANDBY, the verify command was re-run, and this time it completed without warnings.

IMMPRD> alter database switchover to IMMSDB verify;

Database altered.

The PRIMARY database alert.log confirms there are no WARNINGS:

alter database switchover to IMMSDB verify
Fri Oct 13 08:49:20 2017
SWITCHOVER VERIFY: Send VERIFY request to switchover target IMMSDB
Completed: alter database switchover to IMMSDB verify

Switchover in 12c 

After successful validation and confirmation of the databases' readiness for the role transition, execute the actual switchover command on the primary database (it is advisable to watch the alert.log files of the PRIMARY and STANDBY instances during the operation).

IMMPRD> alter database switchover to IMMSDB;

Database altered.

Let’s walk through the PRIMARY and STANDBY database alert.log files to review what Oracle has done internally.

--primary database alert.log

alter database switchover to IMMSDB
Fri Oct 13 08:50:21 2017
Starting switchover [Process ID: 302592]
Fri Oct 13 08:50:21 2017
Waiting for target standby to receive all redo
Fri Oct 13 08:50:21 2017
Waiting for all non-current ORLs to be archived...
Fri Oct 13 08:50:21 2017
All non-current ORLs have been archived.
Fri Oct 13 08:50:21 2017
Waiting for all FAL entries to be archived...
Fri Oct 13 08:50:21 2017
All FAL entries have been archived.
Fri Oct 13 08:50:21 2017
Waiting for dest_id 2 to become synchronized...
Fri Oct 13 08:50:22 2017
Active, synchronized Physical Standby switchover target has been identified
Preventing updates and queries at the Primary
Generating and shipping final logs to target standby
Switchover End-Of-Redo Log thread 1 sequence 24469 has been fixed
Switchover End-Of-Redo Log thread 2 sequence 23801 has been fixed
Switchover: Primary highest seen SCN set to 0x960.0x8bcd0f48
ARCH: Noswitch archival of thread 2, sequence 23801
ARCH: End-Of-Redo Branch archival of thread 2 sequence 23801
ARCH: LGWR is scheduled to archive destination LOG_ARCHIVE_DEST_2 after log switch
ARCH: Standby redo logfile selected for thread 2 sequence 23801 for destination LOG_ARCHIVE_DEST_2
ARCH: Noswitch archival of thread 1, sequence 24469
ARCH: End-Of-Redo Branch archival of thread 1 sequence 24469
ARCH: LGWR is scheduled to archive destination LOG_ARCHIVE_DEST_2 after log switch
ARCH: Standby redo logfile selected for thread 1 sequence 24469 for destination LOG_ARCHIVE_DEST_2
ARCH: Archiving is disabled due to current logfile archival
Primary will check for some target standby to have received all redo
Waiting for target standby to apply all redo
Backup controlfile written to trace file /u01/app/oracle/diag/rdbms/imprd/IMPRD1/trace/IMPRD1_ora_302592.trc
Converting the primary database to a new standby database
Clearing standby activation ID 627850507 (0x256c3d0b)
The primary database controlfile was created using the
'MAXLOGFILES 192' clause.
There is space for up to 186 standby redo logfiles
Use the following SQL commands on the standby database to create
standby redo logfiles that match the primary database:
Archivelog for thread 1 sequence 24469 required for standby recovery
Archivelog for thread 2 sequence 23801 required for standby recovery
Switchover: Primary controlfile converted to standby controlfile succesfully.
Switchover complete. Database shutdown required
USER (ospid: 302592): terminating the instance
Fri Oct 13 08:50:44 2017
Instance terminated by USER, pid = 302592
Completed: alter database switchover to IMMSDB
Shutting down instance (abort)

--standby database alert.log

SWITCHOVER: received request 'ALTER DATABASE COMMIT TO SWITCHOVER TO PRIMARY' from primary database.
Fri Oct 13 08:50:32 2017
Maximum wait for role transition is 15 minutes.
Switchover: Media recovery is still active
Role Change: Canceling MRP - no more redo to apply

SMON: disabling cache recovery
Fri Oct 13 08:50:41 2017
Backup controlfile written to trace file /u01/app/oracle/diag/rdbms/imsdb/IMMSDB1/trace/IMMSDB1_rmi_120912.trc
SwitchOver after complete recovery through change 10310266982216
Online logfile pre-clearing operation disabled by switchover
Online log +DATAC1/IMMSDB/ONLINELOG/group_1.3018.922980623: Thread 1 Group 1 was previously cleared
Standby became primary SCN: 10310266982214
Switchover: Complete - Database mounted as primary
SWITCHOVER: completed request from primary database.
Fri Oct 13 08:51:11 2017

At this point, the new PRIMARY database is in MOUNT state, so you need to OPEN it:

IMMSDB> alter database open;

Then start up the new STANDBY database and enable MRP (the command below enables real-time apply on an active standby database):

IMMPRD> startup
IMMPRD> recover managed standby database using current logfile disconnect from session;

Post-switchover, run through the following checks on the new PRIMARY database:

IMMSDB> alter system switch logfile;

IMMSDB> select dest_id,error,status from v$archive_dest where dest_id=2;

IMMSDB> select max(sequence#),thread# from v$log_history group by thread#;
IMMSDB> select max(sequence#)  from v$archived_log where applied='YES' and

On the new STANDBY database:

IMMPRD> select thread#,sequence#,process,status from gv$managed_standby;
-- in 12.2, use the gv$dataguard_process view instead of gv$managed_standby

IMMPRD> select max(sequence#),thread# from v$archived_log group by thread#;
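Taken together, these checks boil down to comparing the highest sequence archived on the new primary with the highest sequence applied on the new standby. A hedged sketch of that comparison (assuming dest_id 2 points at the standby):

```sql
-- On the new PRIMARY: highest archived sequence per thread
select thread#, max(sequence#) from v$archived_log group by thread#;

-- On the new STANDBY: highest applied sequence per thread
-- (should be equal to, or one behind, the primary's value)
select thread#, max(sequence#)
from   v$archived_log
where  applied = 'YES'
group  by thread#;
```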

You can also enable tracing on the primary and standby before performing the role transition, to analyze any failures during the procedure. Use the statements below on the PRIMARY database to enable and disable tracing:

SQL> alter system set log_archive_trace=8191;  -- enabling trace

SQL> alter system set log_archive_trace=0;      -- disabling trace


To revert (switch back) to the previous configuration, perform the same set of actions in the opposite direction. Remember that your primary is now your previous STANDBY, and your standby is your previous PRIMARY.
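In other words, the switchback is simply the same 12c procedure run from the new primary. A sketch using the database names from this scenario:

```sql
-- On IMMSDB (the current primary): verify, then switch over
IMMSDB> alter database switchover to IMMPRD verify;
IMMSDB> alter database switchover to IMMPRD;

-- Open the new primary
IMMPRD> alter database open;

-- Start the new standby and resume MRP
IMMSDB> startup
IMMSDB> recover managed standby database using current logfile disconnect from session;
```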


Reference: 12c Data Guard Switchover Best Practices using SQLPLUS (Doc ID 1578787.1)

Execute immediate and passing clob as a parameter is not working.

Tom Kyte - Thu, 2017-10-12 23:06
I have the following procedure <code> create or replace PROCEDURE bscs_rateplan_sync ( tmcode_list IN CLOB, project_id IN VARCHAR2, requestname IN VARCHAR2, v_errm OUT VARCHAR2 ) ...
Categories: DBA Blogs

Different Character set and NLS

Tom Kyte - Thu, 2017-10-12 23:06
Hi TOM, Some questions on character sets and NLS. 1) What is the difference between the US7ASCII and WE8ISO8859P1 character sets and NLS? 2) Is there any problem for DBs with two different character sets to interact, - trigger - dblink - e...
Categories: DBA Blogs

Auditing Operating system program name

Tom Kyte - Thu, 2017-10-12 23:06
Sir, i am developing a client/server program using VB & Oracle, it is in testing stage. i want to track whether the programmer is doing changes to the database using the application program(VB) or by logging into database(SQLPLUS) for this i us...
Categories: DBA Blogs

How do you manage your workspaces?

Tom Kyte - Thu, 2017-10-12 23:06
Greetings Team, I am very curious to know how you actually manage your workspaces. I mean, you face all kinds of questions all day long. How many laptops do you use (of course you are using VMs)? But my question is, if you guide someone fo...
Categories: DBA Blogs

SQL*Loader connection errors

Tom Kyte - Thu, 2017-10-12 23:06
To previous answer to MY SQL*Loader question I had asked, you said: Examples of SQL*Loader String <code>sqlldr.exe userid=my_user/my_password@some_string control=my_control.ctl sqlldr.exe userid=scott/tiger@//my_host:my_port/my_service_name con...
Categories: DBA Blogs

Oracle SOA Suite developer IDE: Jdeveloper 12c (download & installation)

Dietrich Schroff - Thu, 2017-10-12 15:40
After installing an Oracle 12c database on Oracle Linux, I want to run Oracle's SOA Suite. But before installing SOA Suite 12c including the application server, I decided to download JDeveloper to get a first impression of BPEL processes in version 12c (several years ago I did some projects with Oracle Fusion and Oracle SOA Suite 11g).

First try: get JDeveloper with the SOA Suite extensions (skip these 4 steps - this attempt misses the SOA Suite...):
  1. You have to download two files:
    -rw-rw-r-- 1   432104829 Okt  7 21:40 jdev_suite_122130_linux64-2.zip
    -rw-rw-r-- 1 2045438518 Okt  7 21:51 jdev_suite_122130_linux64.bin
  2. chmod u+x jdev_suite_122130_linux64.bin
  3. ./jdev_suite_122130_linux64.bin
  4. Start Jdeveloper:

But after all these steps, the SOA Suite is still missing.
At the JDeveloper download site you will not find even a word about the SOA Suite. Hmmm. Let's take a look at the SOA Suite webpages:

 "... it includes ... Oracle JDeveloper with SOA IDE extensions ...."

OK. Let's go:

  1. Download the SOA Suite quickstart zip files
  2. Extract them:
    schroff@zerberus:~/Downloads$ unzip fmw_12.
    Archive:  fmw_12.
      inflating: fmw_12. 
    schroff@zerberus:~/Downloads$ unzip fmw_12.
    Archive:  fmw_12.
      inflating: fmw_12.  
  3. Start the installation:
    /home/data/opt/jdk1.8.0_131/bin/java -jar fmw_12.

  4. Start Jdeveloper:

That was not as easy as expected...

Identifying Conditional Navigation Content References

Jim Marion - Thu, 2017-10-12 15:30

As PeopleSoft customers upgrade to Fluid-enabled applications, it is quite common to start with Fluid disabled, and then implement Fluid behavior post go-live. It is all about change management and an organization's ability to digest change. Even though Oracle has set retirement dates for certain Classic components, with the first wave retiring December 31, 2017, all Classic functionality is still supposed to be present and available. But if you have opened a recent PeopleSoft image or are in the middle of updating to a recent HCM build, you may be asking, "Where are those delivered and supported Classic components?" If you investigate the portal registry, you will see they exist; they just don't appear in any menus.

Many of these menu items, such as the Personal Details menu items, use a feature named Conditional Navigation. PeopleSoft uses Conditional Navigation to conditionally replace Classic menu items with their Fluid counterparts. You can read more about Conditional Navigation in the PeopleTools PeopleBooks entry Products > Development Tools > Portal Technology > Understanding Conditional Navigation.

A question I hear regularly is, "How do I temporarily enable Classic components in my PeopleTools 8.55+ environment?" The first step is to disable Conditional Navigation, and My Oracle Support document 2215964.1 describes how. Armed with this information, you now know what it is, how to configure it, and how to disable it, but how do you find it? I mean, what components are preconfigured for conditional navigation? Since Conditional Navigation uses CREF attributes, CREF identification requires a small SQL statement:
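The SQL itself did not survive the feed. It was presumably along these lines, querying the portal registry attribute table; the PSPRSMATTR table name comes from PeopleTools, but the 'CN%' attribute-name prefix used by Conditional Navigation is an assumption worth verifying in your own environment:

```sql
-- CREFs carrying Conditional Navigation attributes in the portal registry
SELECT PORTAL_NAME, PORTAL_REFTYPE, PORTAL_OBJNAME, PORTAL_ATTR_NAM
  FROM PSPRSMATTR
 WHERE PORTAL_ATTR_NAM LIKE 'CN%';
```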

That is nice, but the portal registry isn't structured that way. Next question: how do I find those CREFs in the portal registry? Here is the longer-form SQL statement; the results contain complete paths to conditionally configured items. Those of you who follow my blog may recognize this SQL as a derivative of Query for Component and/or CREF Navigation Take II.

JSON manipulation in Java 9 JShell

Amis Blog - Thu, 2017-10-12 09:38

In this article I will demonstrate how we can work with JSON-based data – for analysis, exploration, cleansing and processing – in JShell, much like we do in Python. I work with a JSON document containing entries for all sessions at the Oracle OpenWorld 2017 conference (https://raw.githubusercontent.com/lucasjellema/scrape-oow17/master/oow2017-sessions-catalog.json).

The Java 9 SE specification for the JDK does not include the JSON-P API and libraries for processing JSON. In order to work with JSON-P in JShell, we need to add the libraries – which we first need to find and download.

I have used a somewhat roundabout way to get hold of the required jar-files (but it works in a pretty straightforward manner):

1. Create a pom.xml file with dependencies on JSON-P
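The pom.xml content did not survive the feed. A minimal reconstruction with a JSON-P dependency would look something like this; the 1.1 artifact versions are assumptions, chosen to match the jar names used later in the article:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>nl.amis</groupId>
  <artifactId>jsonp-jshell</artifactId>
  <version>1.0</version>
  <dependencies>
    <!-- JSON-P API and its reference implementation -->
    <dependency>
      <groupId>javax.json</groupId>
      <artifactId>javax.json-api</artifactId>
      <version>1.1</version>
    </dependency>
    <dependency>
      <groupId>org.glassfish</groupId>
      <artifactId>javax.json</artifactId>
      <version>1.1</version>
    </dependency>
  </dependencies>
</project>
```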





2. Then run

mvn install dependency:copy-dependencies

as described in this article: https://technology.amis.nl/2017/02/09/download-all-directly-and-indirectly-required-jar-files-using-maven-install-dependencycopy-dependencies/

This will download the relevant JAR files to the subdirectory target/dependencies.


3. Copy JAR files to a directory – that can be accessed from within the Docker container that runs JShell – for me that is the local lib directory that is mapped by Vagrant and Docker to /var/www/lib inside the Docker container that runs JShell.


4. In the container that runs JShell:

Start JShell with this statement that makes the new httpclient module available, for when the JSON document is retrieved from an HTTP URL resource:

jshell --add-modules jdk.incubator.httpclient


5. Update classpath from within jshell

To process JSON in JShell – using JSON-P – we need to set the classpath to include the two jar files that were downloaded using Maven.

/env --class-path /var/www/lib/javax.json-1.1.jar:/var/www/lib/javax.json-api-1.1.jar

Then the classes in JSON-P can be imported:

import javax.json.*;

If we need to retrieve JSON data from a URL resource, we should also:

import jdk.incubator.http.*;


6. I have made the JSON document available on the file system.


It can be accessed as follows:

InputStream input = new FileInputStream("/var/www/oow2017-sessions-catalog.json");


7. Parse data from file into JSON Document, get the root object and retrieve the array of sessions:

JsonReader jsonReader = Json.createReader(input);

JsonObject rootJSON = jsonReader.readObject();

JsonArray sessions = rootJSON.getJsonArray("sessions");


8. Filter sessions with the term SQL in the title and print their title to the System output – using Streams:

sessions.stream().map( p -> (JsonObject)p).filter(s ->  s.getString("title").contains("SQL")) .forEach( s -> {System.out.println(s.getString("title"));})



One other example: show a list of all presentations for which a slide deck has been made available for download, along with the download URL:

sessions.stream()

.map( p -> (JsonObject)p)

.filter(s -> s.containsKey("files") && !s.isNull("files") && !(s.getJsonArray("files").isEmpty()))

.forEach( s -> {System.out.println(s.getString("title")+" url:"+s.getJsonArray("files").getJsonObject(0).getString("url"));})


Bonus: Do HTTP Request

As an aside some steps in jshell to execute an HTTP request:

jshell> HttpClient client = HttpClient.newHttpClient();
client ==> jdk.incubator.http.HttpClientImpl@4d339552

jshell> HttpRequest request = HttpRequest.newBuilder(URI.create("http://www.google.com")).GET().build();
request ==> http://www.google.com GET

jshell> HttpResponse response = client.send(request, HttpResponse.BodyHandler.asString())
response ==> jdk.incubator.http.HttpResponseImpl@147ed70f

jshell> System.out.println(response.body())
<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<H1>302 Moved</H1>
The document has moved
<A HREF="http://www.google.nl/?gfe_rd=cr&amp;dcr=0&amp;ei=S2XeWcbPFpah4gTH6Lb4Ag">here</A>.



The post JSON manipulation in Java 9 JShell appeared first on AMIS Oracle and Java Blog.

SQL Server Management Studio New Features

Yann Neuhaus - Thu, 2017-10-12 09:24

Microsoft regularly adds new features to its SQL Server tooling, most recently in SSMS 17.3.

In this post, I present a new feature: Import Flat File.

How does it work?

Open SSMS, and right click on a selected database


Choose Import Flat File



Click Next



Type the path of the file to import and the new target table's name


This is an overview of the file content.

Click Next


As you can see, a data type conversion is proposed; click Next



Click Finish and the import task starts


The import task is complete



To verify, go to your database and execute a select on the brand new table



This is a small tool that will make it easier for us to insert flat files quickly. The only drawback is that you cannot insert the data into an existing table; the tool will ask you to create a new table.
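If you do need the rows in an existing table, one workaround is to let the wizard import into its new staging table and then copy the rows across. A hedged T-SQL sketch (the table and column names are illustrative, not from the post):

```sql
-- Copy the rows imported by the wizard from its staging table
-- into the pre-existing target table, then drop the staging table.
INSERT INTO dbo.ExistingTable (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM dbo.ImportedFlatFile;

DROP TABLE dbo.ImportedFlatFile;
```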


The post SQL Server Management Studio New Features appeared first on the dbi services blog.

AUSOUG Connect 2017 – see you in Perth

Jeff Kemp - Thu, 2017-10-12 09:13

I’ll be presenting a talk about some APEX applications I’ve been building gradually over the past five years at the AUSOUG Connect 2017 conference in Perth, Western Australia on 29 November.

My hope is that you’ll get some ideas on how best to set up the overall architecture for your APEX applications that may need to serve more than one customer – whether different organisations or different business groups within your organisation.

Although it’s just one day, there’s a lot of good content packed in for developers, DBAs and eBusiness professionals so you really don’t want to miss it!

For more info

Oracle Utilities, Navigant Research Unveil Dual Innovation Blueprint Study for Utilities

Oracle Press Releases - Thu, 2017-10-12 06:45
Press Release
Oracle Utilities, Navigant Research Unveil Dual Innovation Blueprint Study for Utilities Utility executives worldwide share innovation best practices

Redwood Shores, Calif.—Oct 12, 2017

Oracle Utilities today unveiled a new research paper, titled ‘Utility Innovation Blueprint: How to Manage the Challenge of Dual Transformation,’ which explores how utilities can approach innovation across their people, processes and technology. Commissioned from Navigant Research, the paper combines short case studies with an action-oriented framework that offers a global perspective from executives interviewed in the United States, Europe, and Asia/Australia, across electric, gas, and water utilities. Participants included executives from Avista, Exelon, National Grid, UK Power Networks, Eesti Energia and Yarra Valley Water.

Historically, utilities have focused on core business objectives around safety, security, reliability and shareholder returns. However, massive disruption is currently occurring within the industry, with the advancement of distributed energy resources, IoT, big data and increasing customer expectations. These factors and more are signaling fundamental changes to business as usual. The new paper explains that simply knowing that disruption is coming is not enough. A blueprint for a new innovation path is required, one that needs to be agile and pragmatic.

The research takes a close look at a Dual Innovation Blueprint, a practice whereby a utility pursues two tracks simultaneously: one focuses on increasing the efficiency and resiliency of the core business, while the second centers around increasing agility to enable the iteration of new business models. 

Innovation Blueprint

While many industries have been faced with similar challenges, the approach detailed in the Navigant Research paper is a platform for innovation specific to the utilities industry, one that encompasses not only technology, but also the people and processes within the utility’s organization. According to the report, utilities looking to address innovation challenges should:

  • Adopt a set of processes that will optimize the success of innovation projects by evaluating accountability by project type, balancing finance with risk, and constantly monitoring progress.
  • Understand which technologies will enable innovation by digitizing then automating, deploying cloud analytics and iterating on new services.
  • Create a culture in which innovation can flourish by securing executive sponsorship, embracing new organizational perspectives and empowering employees.

“Changing business culture to incorporate innovation is one of the hardest things to do. Innovation teams must influence the business, but this can be very difficult to achieve. Increased communication around innovation is critical,” said Ian Cameron, Head of Innovation, UK Power Networks.

“Utilities are increasingly facing the challenge of how to maintain reliability while simultaneously operating in an agile manner conducive to innovation,” said Rodger Smith, senior vice president and general manager of Oracle Utilities. “We believe the framework for dual innovation described in this paper provides an actionable model for utilities to respond to disruption.”

The executives interviewed for this research represent 16 utilities on three continents, serving 42 million customers, with 162,000 employees, selling electricity, water and gas.

Contact Info
Valerie Beaudett
+1 650.400.7833
Christina McDonald
+1 212.614.4221
About Oracle

The Oracle Cloud offers complete SaaS application suites for ERP, HCM and CX, plus best-in-class database Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) from data centers throughout the Americas, Europe and Asia. For more information about Oracle (NYSE:ORCL), please visit us at www.oracle.com.

About Navigant Research

Navigant Research, the dedicated research arm of Navigant, provides market research and benchmarking services for rapidly changing and often highly regulated industries. In the energy sector, Navigant Research focuses on in-depth analysis and reporting about global clean technology markets. The team’s research methodology combines supply-side industry analysis, end-user primary research and demand assessment, and deep examination of technology trends to provide a comprehensive view of the Energy Technologies, Utility Transformations, Transportation Efficiencies, and Buildings Innovations sectors. Additional information about Navigant Research can be found at www.navigantresearch.com.


Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.


ORA-01722: invalid number error in decode

Tom Kyte - Thu, 2017-10-12 04:46
1. select decode('a',1,1,2) from dual --------> O/P Error saying Invalid Number 2. select decode(1,'a',1,2) from dual --------> O/P : 2 what is the difference in the above two cases and also please explain the execution process? Thanks I...
Categories: DBA Blogs

Using Analytical Functions to get Group Total

Tom Kyte - Thu, 2017-10-12 04:46
I need a GROUP TOTAL for EACH MONTH that SUMS up all the values in APPL column even when the query is for a particular APPL value. <code>create table test_grp_total (dttm_month date, appl varchar2(5), mins number) insert into test_grp_total va...
Categories: DBA Blogs

Incremental ETL : Streaming via Micro-Batch

Dylan's BI Notes - Wed, 2017-10-11 23:40
A modern analytic application takes the approach of streaming data to perform a process similar to traditional data warehousing incremental ETL. Actually, if we look into Spark Streaming in detail, the concepts of streaming in Spark and incremental ETL are the same: Spark Streaming is micro-batch based streaming. Each micro-batch is much like […]
Categories: BI & Warehousing

Incremental ETL – The last refresh date

Dylan's BI Notes - Wed, 2017-10-11 15:31
There are multiple ways to model the last refresh date. In OBIA, DAC and Informatica based ETL, the last refresh date is maintained within DAC. It is maintained at the level of the source tables that populates the data. Oracle BI DAC User Guide > About Refresh Dates and DAC’s Incremental Load Strategy In OBIA […]
Categories: BI & Warehousing


Subscribe to Oracle FAQ aggregator