Feed aggregator

Performance on using oracle Table of an object type

Tom Kyte - Fri, 2017-02-17 12:06
Hi I am working on a reporting tool that extracts data from Oracle Stored Procedures base and generates reports using Jasper. Please find below a sample of the procedure written. --Creating Object CREATE OR REPLACE TYPE my_row_type AS OBJECT ...
Categories: DBA Blogs

Apex idle user goes to login

Tom Kyte - Fri, 2017-02-17 12:06
Hello. When my users log in and are idle for about 15 minutes, some of them are logged out. How can I prevent that? A user should only be logged out when they do it themselves; it is very important. I hope you can help me with that. Kristjan
Categories: DBA Blogs

char (10) data type hosting numeric data for natural keys

Tom Kyte - Fri, 2017-02-17 12:06
Hi, Can you please provide some guidance on the possible performance issue one might have in creating a char(10) data type that hosts natural keys? Also, I'm baffled by the following results.. THIS WORKS FINE even char(10) blank pads the ...
Categories: DBA Blogs

Skip scripts based on a variable value during run

Tom Kyte - Fri, 2017-02-17 12:06
Hi, We run the same script package to each environment (production, acceptance test, etc.). We have some of these scripts which shouldn't run e.g. into PRO db. Is there any way to define which script can run in which environment if environment type ...
Categories: DBA Blogs

Bind variables in a View?

Tom Kyte - Fri, 2017-02-17 12:06
Is it possible to use bind variables in a view? Is it possible to replace the values 10, 11, 12 and VISOR by bind variables, or how can I optimize this view? CREATE OR REPLACE VIEW VIEW1 AS SELECT a.column1, a.column2 FROM TABLA1 a WHERE a...
Categories: DBA Blogs


Tom Kyte - Fri, 2017-02-17 12:06
Hi there! I want to know if it is possible to get information about queries in the database (like v$active_session_history contains) without the Diagnostics and Tuning packs. The data dictionary view v$active_session_history consists of x$kewash and x$ash fi...
Categories: DBA Blogs

How RowProcessed works and updated in GV$SQL

Tom Kyte - Fri, 2017-02-17 12:06
How do rows_processed, disk reads and buffer gets actually work while executing a query? I came across two scenarios while executing two different queries (my SQL array fetch size is 50). 1. Disk reads and buffer gets increased till the entire executio...
Categories: DBA Blogs

getting string

Tom Kyte - Fri, 2017-02-17 12:06
I have a table A and in it a column B. Column B has the data below. sdkjksdfFDOR:1234; wwerwerrFDOR:12344; How do I read the data after FDOR: and up to ';'? My output should be 1234 12344 Please help me. Regards, T.prasad
Categories: DBA Blogs

ADF Editable Table - Recommendation For Data Entry Optimization

Andrejus Baranovski - Fri, 2017-02-17 11:33
I will explain a data entry use case related to an ADF table. Specifically, I will talk about a slightly more complex case, when some columns in the table are set with AutoSubmit=true, to force values to be submitted to the server on change. This can be required when a validation rule must be processed on value change or there are dependent re-calculated fields in the same row.

If you are using AutoSubmit=true columns in an ADF table, it is easy to start losing values from columns with AutoSubmit=false. Really? Yes, but only if the table iterator is set with ChangeEventPolicy=ppr.

Let's do an experiment. First Name column field is set with AutoSubmit=true:

Iterator is set with ChangeEventPolicy = ppr:

Enter value for Last Name, field with AutoSubmit=false:

Change value for First Name, field with AutoSubmit=true and tab away:

The previously entered value for Last Name will be lost and focus will move to the table header. Two bad things happen at once. First Name is set with AutoSubmit=true, which means it sends the value from this field in a PPR request, and since the table iterator is set with ChangeEventPolicy=ppr, in response the table is refreshed with data from the server. Obviously the new Last Name value wasn't sent to the server yet (AutoSubmit=false), and ChangeEventPolicy=ppr reloads values on the client with whatever values are on the server. Technically this is not a bug, but from the user's perspective it is critical: losing data.

If you have AutoSubmit=true columns in the table, make sure to set ChangeEventPolicy=none for the iterator:
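As a sketch, the two settings involved look roughly like this in the page fragment and the page definition (the column name, binding names, and iterator name here are assumptions for illustration, not taken from the sample app):

```xml
<!-- Page fragment: the First Name column submits its value on change -->
<af:column headerText="First Name" id="c1">
  <af:inputText value="#{row.bindings.FirstName.inputValue}"
                autoSubmit="true" id="it1"/>
</af:column>

<!-- Page definition: ChangeEventPolicy=none so a PPR response does not
     refresh the table and wipe unsubmitted client-side values -->
<iterator Binds="EmployeesView" RangeSize="25"
          DataControl="AppModuleDataControl"
          id="EmployeesViewIterator" ChangeEventPolicy="none"/>
```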

This time, after changing a value with AutoSubmit=true, the other field values stay on the client and focus moves nicely to the next field:

When data is saved, changes from both fields are submitted to the DB:

Download sample application - GroovyADFApp_v3.zip.

Working with APEX 5 Dialogs

Andrew Tulley - Fri, 2017-02-17 09:48

APEX 5 introduces native dialogs, both modal and non-modal. This article discusses:

  1. How to enable dialogs in your APEX 5 application if they are not already enabled.
  2. How to open a dialog.
  3. How to close a dialog, returning values from that dialog.
  4. How to respond to the closing of a dialog.
How to enable dialogs in your APEX 5+ application if they are not already enabled.

To check if Dialogs are already usable within your application, start the Create New Page wizard and see if you can specify a Page Mode value of “Modal Dialog”. If you can, great: you don’t need to complete the steps in this section.

If you can’t, follow the steps below to enable the use of Dialogs.

NOTE: You won’t need to perform these steps if you’re using the Universal Theme. However, it is not necessary to be using the Universal Theme to use APEX’s native dialogs.


Above: If you can select Modal Dialog from the Page Mode select list then you do not need to complete the steps in this section.

Enabling APEX 5 native dialogs in your application

  1. Firstly, you need a Page Template which will be used as your default Page Template for dialogs. Within Application > Shared Components > Templates, locate a suitable Page Template which already exists. You want to choose one which contains none of the components that would be unsuitable for a dialog, e.g. tabs, nav bar, etc.
  2. Make a copy of this Page Template, calling the newly-created copy something like “Dialog Page”.
  3. Set the Template Type of this new Page Template to “Dialog Page”.
  4. When you try to save the new Page Template you may see errors such as the following. To rectify these, locate the three fields mentioned and paste in the default values shown for each.
  5. Navigate to Shared Components > Themes. Open the current theme for your application.
  6. Navigate to “Dialog Defaults” and select your newly-created Page Template from the Dialog Page select list. Click Apply Changes.
  7. You should now be able to create new pages specifying their Page Mode as “Modal Dialog” or “Non-Modal Dialog”. You should also be able to modify the Page Mode of existing pages to turn them into dialog pages.

For more information, see the Oracle documentation on Dialog Templates in APEX 5 (linked under Related articles at the end of this post).

How to open a Dialog

There are a couple of ways of opening a dialog page in APEX 5+.


The simplest way is to use any of APEX’s built-in methods of generating navigation.

For example, setting the action of a Region Button to “Redirect to Page in this Application” and choosing a Dialog Page.


Alternatively, if you need to open a dialog using Javascript, you can use the following method:

a) Create a hidden page item which is going to hold the prepared URL of the dialog page you want to open. e.g. P35_DIALOG_URL

b) Populate this item with an On Load PL/SQL page process:

:P35_DIALOG_URL := APEX_UTIL.prepare_url(
    p_url                => 'f?p=&APP_ID.:38:&SESSION.::NO:RP::'
  , p_triggering_element => '$(''#openDialogIcon'')'
  );

The first argument passed to APEX_UTIL.prepare_url here is easy: it’s the URL of the dialog page.

The second argument, p_triggering_element, is a string representing a jQuery object which will be identified as the Triggering Element which opened the dialog.

Note that you must use $('#someitemid') and NOT $("#someitemid") (i.e. single quotes instead of double quotes to surround the selector string).

The jQuery object you specify here is important as it’s the item against which the apexafterclosedialog Javascript Event will be registered. You can see an example of this later in this article.

This call to APEX_UTIL.prepare_url will return a string of the following form:

javascript:apex.navigation.dialog('f?p=400:38:995688545561::NO:RP::\u0026p_dialog_cs=BA6YtuVKy9_VwUU3hN7x1vcfehc',{title:'Select desired Item from List',height:'600',width:'700',maxWidth:'false',modal:true,dialog:null},'',$('#openDialogIcon'));

c) Create an On Page Load Dynamic Action with a Javascript action to wrap an <a> tag around the triggering element and assign this prepared URL to the tag's HREF attribute. E.g.:

// Wrap an <a> tag around the icon which opens the Select Item dialog.
// Set the href of this new <a> tag to be the URL
// javascript:apex.navigation.dialog(...) style value required to open
// the dialog.
$( "#openDialogIcon" ).wrap( '<a href="' + $( "#P35_DIALOG_URL" ).val() + '"></a>' );

How to close a dialog, returning values from that dialog.

1) You can close a Modal Dialog in one of three ways:

a) A Close Dialog Page Process (submit the page to fire this as you would any other Page Process).

b) A Close Dialog Dynamic Action

c) Call Javascript of this form:

apex.navigation.dialog.close( true, ["P38_ITEM_ID", "P38_ITEM_NAME"] );
Where the first parameter (true) indicates whether the dialog is modal or not and the second parameter is an array of items in the dialog whose values should be available in the parent once the dialog has closed.

See https://docs.oracle.com/cd/E59726_01/doc.50/e39149/javascript_api.htm#AEAPI30096 for more details on this Javascript function (i.e. apex.navigation.dialog.close).

2) The important concept, regardless of which of the above 3 methods you choose to close your Dialog, is the Items to Return value. This is the correct way to pass back items to the calling/parent page.

As John Snyders points out here (http://hardlikesoftware.com/weblog/2015/05/22/apex-5-0-dialogs/):

John Snyders on the philosophy behind dialogs:
“The normal pattern for using dialogs is that it is OK for the calling or parent page to know about the dialog page, but not the other way around. The dialog should not know the context from which it is used. If a dialog page knows about the page it is called from then it limits how the dialog can be reused. This is not APEX specific; it is a common UI framework pattern but is often ignored in web apps. This means that if you are trying to take data entered in a dialog and insert or use it in the parent page, this should be done from event handlers or dynamic actions on the parent page not from code on the dialog page. You should not try to use JavaScript on the dialog page to try to manipulate the parent page.”

For example, using the Close Dialog Page Process method (method “a”), you can set a comma-separated list of items to return:


You may think “that’s all well and good in theory but I’m going to ignore this ideology and, in direct contravention of the above advice from John Snyders, I’m going to use JavaScript on the dialog page to manipulate the parent page, just because it’s what I know and I’m not really bothered about reusing my dialog. I can just use something like parent.$('#P100_SOME_ITEM').val('some val from dialog') to reference the parent and the job’s done.”

If you’re only using one level of dialogs (i.e. not a nested dialog) you can do this if you really want to and are happy to ignore the point about reusability of the dialog. However, If you are within a nested dialog (i.e. a dialog opened from another dialog) you’ll have problems with this since parent  will refer to the top level page and NOT the parent dialog. Search the John Synders article referenced above for the term “top level” to see why it’s built this way i.e. why all dialogs, nested or otherwise, are created in the parent page.

My advice would be NOT to do this in any case. Stick to the principle that the Dialog should not know about its caller.

How to respond to the closing of a dialog

So, great, now we have a means to close a dialog and specify which of the items in the dialog we wish to be available in the parent/calling window. The question then is: How do we get access to those values in the calling/parent window?

Again, we have a couple of options.

OPTION 1: Use Dynamic Actions

Create a Dynamic Action with the “Dialog Closed” event as in the screenshot below. Note that the value for Selection Type is very important; if it is wrong, this will not work. It needs to be the item that triggered the opening of the dialog, i.e. the Triggering Element.


Add one or more true actions to it of type “Set Value”, choosing a Set Type of Dialog Return Item. (In our example Page 35 is the parent page and page 38 is the dialog):


Option 2: Do it all manually with Javascript (who doesn’t like to have more control?)

For this option just create an On Page Load Dynamic Action with a Javascript Action as follows:



// Bind a handler for the apexafterclosedialog event to the element
// that triggered the opening of the dialog (#openDialogIcon).
$( "#openDialogIcon" ).on( "apexafterclosedialog", function( event, data ) {

  // the data parameter will be set to an object containing the values
  // passed back from the closed dialog ("The Return Items"), e.g.:
  // { P38_ITEM_ID: "1234", P38_ITEM_NAME: "ACME WIDGET", dialogPageId: 38 }

  apex.item( "P35_ITEM_ID" ).setValue( data.P38_ITEM_ID );
  apex.item( "P35_ITEM_NAME" ).setValue( data.P38_ITEM_NAME );

});



Again, note that the value of #openDialogIcon is very important. To repeat: This is the element that was indicated as the trigger for the opening of the dialog.


Related articles

Details of apex.navigation Javascript API


John Snyders on Modal Dialogs in APEX 5


Dialog Templates in APEX 5 (Oracle Documentation)


Oracle 12c – Combining Flashback Drop and Flashback Query

Yann Neuhaus - Fri, 2017-02-17 09:09

If you think that the Flashback Drop feature just brings back your table, that is only half of the story. It does much more than that. Besides undropping the table, it also brings back your constraints, your indexes, your triggers, your grants and the statistics as well.

The ugly part is that Flashback Drop brings back some strange object names, e.g. indexes and constraints with names like “BIN$…” or alike. Maybe something you don’t want. So why not combine Flashback Drop with a Flashback Query on the dictionary to get the old constraint and index names back?

Let’s setup a few objects in the SCOTT schema. But before we do that, we need to grant the user SCOTT some extra privileges.

SQL> grant execute on dbms_flashback to scott;

Grant succeeded.

SQL> grant flashback on user_indexes to scott;

Grant succeeded.

SQL> grant flashback on user_constraints to scott;

Grant succeeded.

SQL> grant flashback on user_triggers to scott;

Grant succeeded.

Now we can set up our objects for this test. I will create two tables, a few grants, a trigger and statistics. The goal is that after the flashback to before drop, the table, the index, the constraints and the trigger all have exactly the same names as before.

SQL> connect scott/tiger

SQL> create table dbi_t
  2  ( x int, constraint t_pk primary key(x),
  3   y int, constraint check_x check(x>0)
  4  );

Table created.

SQL> insert into dbi_t values (1,1);

1 row created.

SQL> insert into dbi_t values (2,2);

1 row created.

SQL> insert into dbi_t values (3,3);

1 row created.


SQL> commit;

Commit complete.

SQL> create table dbi_audit
  2  (x int, x_before int, y int, y_before int, z varchar2(10));

Table created.

SQL> CREATE OR REPLACE TRIGGER dbi_after_update
  2     AFTER INSERT OR UPDATE
  3     ON DBI_T
  4     FOR EACH ROW
  5  DECLARE
  6     v_z varchar2(10);
  7  BEGIN
  8     SELECT user INTO v_z FROM dual;
  9     -- Insert record into audit table
 10     INSERT INTO dbi_audit
 11     ( x,
 12       x_before,
 13       y,
 14       y_before,
 15       z)
 16     VALUES
 17     ( :new.x,
 18       :old.x,
 19       :new.y,
 20       :old.y,
 21       v_z );
 22* END;

Trigger created.

SQL> insert into dbi_t values (4,4);

1 row created.

SQL> commit;

Commit complete.

SQL> insert into dbi_t values (5,5);

1 row created.

SQL> commit;

Commit complete.

SQL> update dbi_t set x=6 where y=5;

1 row updated.

SQL> commit;

Commit complete.

SQL> select * from dbi_t;

         X          Y
---------- ----------
         1          1
         2          2
         3          3
         4          4
         6          5

SQL> select * from dbi_audit;

         X   X_BEFORE          Y   Y_BEFORE Z
---------- ---------- ---------- ---------- ----------
         4                     4            SCOTT
         5                     5            SCOTT
         6          5          5          5 SCOTT


SQL> begin
  2  dbms_stats.gather_table_stats(
  3  ownname => '"SCOTT"',
  4  tabname => '"DBI_T"',
  5  estimate_percent => 100
  6  );
  7  end;
  8  /

PL/SQL procedure successfully completed.

SQL> begin
  2  dbms_stats.gather_table_stats(
  3  ownname => '"SCOTT"',
  4  tabname => '"DBI_AUDIT"',
  5  estimate_percent => 100
  6  );
  7  end;
  8  /

PL/SQL procedure successfully completed.

SQL> grant select on dbi_t to hr;

Grant succeeded.

SQL> grant select on dbi_audit to hr;

Grant succeeded.

Ok. So let’s take a look at how the current situation looks right now.

SQL> select table_name, last_analyzed
  2  from user_tables
  3  where TABLE_NAME in ('DBI_T','DBI_AUDIT');

TABLE_NAME   LAST_ANALYZED
------------ --------------------
DBI_AUDIT    17-FEB-17
DBI_T        17-FEB-17

SQL> select CONSTRAINT_NAME, CONSTRAINT_TYPE from user_constraints where table_name = 'DBI_T';

CONSTRAINT_NAME                      C
------------------------------------ -
CHECK_X                              C
T_PK                                 P

SQL> select index_name from user_indexes where table_name = 'DBI_T';

INDEX_NAME
------------------------------------
T_PK


SQL> select grantee, owner, table_name, grantor, privilege from user_tab_privs
  2  where table_name in ('DBI_T','DBI_AUDIT');

GRANTEE        OWNER          TABLE_NAME           GRANTOR        PRIVILEGE
-------------- -------------- -------------------- -------------- --------------------
HR             SCOTT          DBI_AUDIT            SCOTT          SELECT
HR             SCOTT          DBI_T                SCOTT          SELECT

SQL> select TRIGGER_NAME, TABLE_NAME, STATUS from user_triggers;

TRIGGER_NAME             TABLE_NAME   STATUS
------------------------ ------------ --------
DBI_AFTER_UPDATE         DBI_T        ENABLED

Everything looks good. Up-to-date statistics, the trigger is enabled and no objects with “BIN$xx” or similar names. The next step is quite an important one for this demo. I am just saving the SCN number before the “drop table” into a variable. In the real world, you need to find the SCN number yourself, e.g. with the TIMESTAMP_TO_SCN function.
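In case no SCN was recorded beforehand, a minimal sketch of deriving one from a timestamp with TIMESTAMP_TO_SCN (the five-minute offset here is just an assumed example; the timestamp must lie within your undo retention window):

```sql
-- Map a point in time shortly before the drop to an SCN
select timestamp_to_scn(systimestamp - interval '5' minute) as scn
  from dual;
```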

SQL> column SCN new_val S
SQL> select dbms_flashback.get_system_change_number SCN from dual;

       SCN
----------
   1056212


After we got the SCN, we can drop the table and undrop it afterwards.

SQL> drop table dbi_t;

Table dropped.

SQL> flashback table dbi_t to before drop;

Flashback complete.

Let’s take a look how our constraints and index names look right now. Exactly like expected. They have this ugly “BIN$xxx” names, but we want the old names back.

SQL> select CONSTRAINT_NAME, CONSTRAINT_TYPE from user_constraints where table_name = 'DBI_T';

CONSTRAINT_NAME                      C
------------------------------------ -
BIN$SLt7vMNFZNbgU8k4qMDm6g==$0       C
BIN$SLt7vMNGZNbgU8k4qMDm6g==$0       P

SQL> select index_name from user_indexes where table_name = 'DBI_T';

INDEX_NAME
------------------------------------
BIN$SLt7vMNHZNbgU8k4qMDm6g==$0


The trick is now to invoke a Flashback Query on the dictionary. Flashback Query on the dictionary is not 100% supported, but it works. I just save the current index name into the variable “I” and the old name into the variable “OI”.

SQL> column index_name new_val I
SQL> select index_name from user_indexes where table_name = 'DBI_T';

INDEX_NAME
------------------------------------
BIN$SLt7vMNHZNbgU8k4qMDm6g==$0


SQL> column index_name new_val OI
SQL> select index_name from user_indexes as of scn &S
  2  where table_name = 'DBI_T';
old   1: select index_name from user_indexes as of scn &S
new   1: select index_name from user_indexes as of scn    1056212

INDEX_NAME
------------------------------------
T_PK


After I have the current and the old name in place, I can do an alter index and get my old name back.

SQL> alter index "&I" rename to "&OI";
old   1: alter index "&I" rename to "&OI"
new   1: alter index "BIN$SLt7vMNHZNbgU8k4qMDm6g==$0" rename to "T_PK"

Index altered.

SQL> select index_name from user_indexes where table_name = 'DBI_T';

INDEX_NAME
------------------------------------
T_PK



I will do now exactly the same for the constraints and the trigger.

SQL> column constraint_name new_val CC
SQL> select constraint_name from user_constraints where table_name = 'DBI_T' and CONSTRAINT_TYPE = 'C';

CONSTRAINT_NAME
------------------------------------
BIN$SLt7vMNFZNbgU8k4qMDm6g==$0


SQL> column constraint_name new_val OCC
SQL> select constraint_name from user_constraints as of scn &S where table_name = 'DBI_T' and CONSTRAINT_TYPE = 'C';
old   1: select constraint_name from user_constraints as of scn &S where table_name = 'DBI_T' and CONSTRAINT_TYPE = 'C'
new   1: select constraint_name from user_constraints as of scn    1056212 where table_name = 'DBI_T' and CONSTRAINT_TYPE = 'C'

CONSTRAINT_NAME
------------------------------------
CHECK_X


SQL> alter table DBI_T RENAME CONSTRAINT "&CC" TO "&OCC";
old   1: alter table DBI_T RENAME CONSTRAINT "&CC" TO "&OCC"
new   1: alter table DBI_T RENAME CONSTRAINT "BIN$SLt7vMNFZNbgU8k4qMDm6g==$0" TO "CHECK_X"

Table altered.

SQL> column constraint_name new_val PC
SQL> select constraint_name from user_constraints where table_name = 'DBI_T' and CONSTRAINT_TYPE = 'P';

CONSTRAINT_NAME
------------------------------------
BIN$SLt7vMNGZNbgU8k4qMDm6g==$0


SQL> column constraint_name new_val OPC
SQL> select constraint_name from user_constraints as of scn &S where table_name = 'DBI_T' and CONSTRAINT_TYPE = 'P';
old   1: select constraint_name from user_constraints as of scn &S where table_name = 'DBI_T' and CONSTRAINT_TYPE = 'P'
new   1: select constraint_name from user_constraints as of scn    1056212 where table_name = 'DBI_T' and CONSTRAINT_TYPE = 'P'

CONSTRAINT_NAME
------------------------------------
T_PK


SQL> alter table DBI_T RENAME CONSTRAINT "&PC" TO "&OPC";
old   1: alter table DBI_T RENAME CONSTRAINT "&PC" TO "&OPC"
new   1: alter table DBI_T RENAME CONSTRAINT "BIN$SLt7vMNGZNbgU8k4qMDm6g==$0" TO "T_PK"

Table altered.

SQL> col CONSTRAINT_NAME format a36
SQL> select CONSTRAINT_NAME, CONSTRAINT_TYPE from user_constraints where table_name = 'DBI_T';

CONSTRAINT_NAME                      C
------------------------------------ -
CHECK_X                              C
T_PK                                 P

SQL> col INDEX_NAME format a36
SQL> select index_name from user_indexes where table_name = 'DBI_T';

INDEX_NAME
------------------------------------
T_PK


SQL> select TRIGGER_NAME, TABLE_NAME,STATUS from user_triggers;

TRIGGER_NAME                     TABLE_NAME                       STATUS
-------------------------------- -------------------------------- --------
BIN$SLt7vMNIZNbgU8k4qMDm6g==$0   DBI_T                            ENABLED

SQL> column trigger_name new_val T
SQL> select trigger_name from user_triggers where table_name = 'DBI_T';

TRIGGER_NAME
------------------------------------
BIN$SLt7vMNIZNbgU8k4qMDm6g==$0


SQL> column trigger_name new_val OT
SQL> select trigger_name from user_triggers as of scn &S where table_name = 'DBI_T';
old   1: select trigger_name from user_triggers as of scn &S where table_name = 'DBI_T'
new   1: select trigger_name from user_triggers as of scn    1056212 where table_name = 'DBI_T'

TRIGGER_NAME
------------------------------------
DBI_AFTER_UPDATE


SQL> alter trigger "&T" RENAME TO "&OT";
old   1: alter trigger "&T" RENAME TO "&OT"
new   1: alter trigger "BIN$SLt7vMNIZNbgU8k4qMDm6g==$0" RENAME TO "DBI_AFTER_UPDATE"

Trigger altered.

SQL> select TRIGGER_NAME, TABLE_NAME, STATUS from user_triggers;

TRIGGER_NAME             TABLE_NAME             STATUS
------------------------ ---------------------- --------
DBI_AFTER_UPDATE         DBI_T                  ENABLED

The stats and the grants do come back automatically.

SQL> select table_name, last_analyzed
  2  from user_tables
  3  where TABLE_NAME in ('DBI_T','DBI_AUDIT');

TABLE_NAME   LAST_ANALYZED
------------ --------------------
DBI_AUDIT    17-FEB-17
DBI_T        17-FEB-17

SQL> select grantee, owner, table_name, grantor, privilege from user_tab_privs
  2  where table_name in ('DBI_T','DBI_AUDIT');

GRANTEE        OWNER          TABLE_NAME           GRANTOR        PRIVILEGE
-------------- -------------- -------------------- -------------- --------------------
HR             SCOTT          DBI_AUDIT            SCOTT          SELECT
HR             SCOTT          DBI_T                SCOTT          SELECT



The Flashback Drop feature does not just bring back your table. It does much more: it brings back your grants, the trigger, the statistics, the indexes and the constraints as well. If you are lucky, you can even combine it with Flashback Query to retrieve the old names for your indexes, constraints and triggers.


Cet article Oracle 12c – Combining Flashback Drop and Flashback Query est apparu en premier sur Blog dbi services.

Dublin Tech Summit 2017: Running into the Oracle Experience

Usable Apps - Fri, 2017-02-17 05:54
In Dublin's Fair City

I've just been to the first Dublin Tech Summit (@DubTechSummit) where I was wowed by the awesome depth and breadth of the event, learned some new things, renewed old acquaintances, and forged new business relationships too!


Robo Thespian from Engineered Arts: "We Can't Unbite the Apple"

Dublin Tech Summit (DTS) offered a powerful schedule covering topics from Fashion and Healthcare to IoT and FinTech and a geansaí load more for an audience that ranged from storytellers to investors to coders, combined with workshops and a platform for startups and spearheaded by a top notch speaker line up. A powerful promotional campaign by the DTS organizers resulted in the Convention Centre Dublin (@TheCCD) packed with attendees from around the world, all greeted by the ever-engaging Robo Thespian in the CCD lobby.

FinTech, IoT, and More...

I was particularly impressed with the sessions on FinTech and IoT, the focus on happening technologies such as AI, VR and chatbots, and how millennials live and work in the digital age. Millennials, for example, would prefer to bank with a digital tech brand such as Amazon or Google than a more 'traditional' financial institution.


Accenture, Samsung, and others brought cutting edge AI, VR, and other happening tech to the event.

Oracle had a strong DTS presence in the exhibition area as well as great sessions on the Internet of Things (IoT) and business transformation, covering fascinating uses for linking devices and cloud data, such as parking management and factory monitoring in the U.S., e-scooters in Japan, data analytics for performance sailing in the America’s Cup, FinTech, and more.


Oracle had a strong floor presence for SaaS, IoT and Cloud Services, a fun Anki OVERDRIVE demo, and super IoT sessions by Oracle IoT strategist Joost Volker (@joostvolker) who also joined with other industry players to share insights into BigData, IoT, and digital transformation.

That Conversation About UI

An engaging session from Comtrade Digital on chatbots, or conversational computing, offered me insights into how chatbots have brought the Turing Test to life in real business as mobile messaging chat apps surpass social networks in usage. Chatbots offer a personal, fun, and free yet deeply contextual mobile user experience for the millennial generation, but also deliver a text-based conversational engagement that resonates with us 'perennials' too.


There was a super-engaging FinTech session on chatbot conversational computing from Comtrade Digital titled "When millennials literally can't even ..."

Would I Go Again? Yep! 

In all, this was one super event that did the Dublin tech ecosystem proud and it was great to see Oracle there in force with some great thought leadership on show. DTS was a great platform for demonstrating your goodness sure, but also a place for hands-on doing and learning, relationship building, and opening your mind to innovative possibilities. That DTS did my hometown of Dublin proud made it all a sweeter experience!

Dublin Tech Summit: Where you'll 'run into' all sorts of digitally-savvy people doing great things.


Here’s to DTS in 2018!

Where Will Oracle Applications User Experience Turn Up Next?

Check our website for news about upcoming events and about how to get involved in our activities. 

(All images in this post by Ultan O'Broin)

Coming Soon: Database Star Academy Membership

Complete IT Professional - Fri, 2017-02-17 05:00
The Database Star Academy membership will be launching very soon. Read on to find out more. What Is The Database Star Academy Membership? Over the last couple of years, I’ve developed several video courses and PDF guides to help database developers improve their SQL skills and their career. Up until now, I’ve made these available […]
Categories: Development

OEL 7 – How to disable IPv6 on Oracle Linux 7 – Follow Up

Yann Neuhaus - Fri, 2017-02-17 02:56

This is a follow-up to the blog post where I explained how to disable IPv6 on Oracle Linux 7.

If you have done all the steps that I explained here http://blog.dbi-services.com/oel-7-how-to-disable-ipv6-on-oracle-linux-7/ then you have already disabled IPv6 successfully. However, some tools require special attention afterwards if you want to avoid ugly warning or error messages. There are many tools that can use IPv4 and IPv6, and it is impossible to mention all of them. I will just dig a little deeper into the following four.

  • Postfix
  • Oracle
  • NFS
  • rsyslogd

Let’s start with Postfix. This might be one of the first warning messages you see after you have disabled IPv6 on your system. If you receive the following warning message when you try to send an email, then you need to adjust your /etc/postfix/main.cf file.

$ mailx -s "Test" xxx.xxx@xxx.com
send-mail: warning: inet_protocols: IPv6 support is disabled: Address family not supported by protocol
send-mail: warning: inet_protocols: configuring for IPv4 support only
postdrop: warning: inet_protocols: IPv6 support is disabled: Address family not supported by protocol
postdrop: warning: inet_protocols: configuring for IPv4 support only

The solution is to configure your /etc/postfix/main.cf file to allow only the ipv4 protocol.

[root@SVPCHODAC01 sbin]# /usr/sbin/postconf | grep inet_protocols
inet_protocols = all
/usr/sbin/postconf: warning: inet_protocols: IPv6 support is disabled: Address family not supported by protocol
/usr/sbin/postconf: warning: inet_protocols: configuring for IPv4 support only

[root@SVPCHODAC01 sbin]# cd /etc/postfix/
[root@SVPCHODAC01 postfix]# cp main.cf main.cf.20170203a
[root@SVPCHODAC01 postfix]# vi main.cf

Change “inet_protocols = all” to “inet_protocols = ipv4” and then restart Postfix.

[root@SVPCHODAC01 postfix]# /etc/init.d/postfix restart
Shutting down postfix: [ OK ]
Starting postfix: [ OK ]

[root@SVPCHODAC01 postfix]# /usr/sbin/postconf | grep inet_protocols
inet_protocols = ipv4

That’s it. Now the ugly Postfix warning messages disappear.


The next candidate is the Oracle Listener. In some situations,  you might see the following error message in your listener.log file when working with Cloud Control 12c.

TNS-01189: The listener could not authenticate the user

This is related to an Oracle bug; to be more precise, it is “BUG 16054202 – TNLIN EXTRACTS WRONG SUBNETMASK FOR IPV6 ADDRESSES”. The bug can be avoided by configuring the Oracle Listener to work with IPv4 only. This is done via the listener.ora IP parameter, which supports the following options.


FIRST
Listen on the first IP address returned by the DNS resolution of the host name.
If the user wants the listener to listen on the first IP to which the specified host name resolves,
then the address must be qualified with (IP=first).

V4_ONLY
Listen only on IPv4 addresses.

V6_ONLY
Listen only on IPv6 addresses.

Simply put (IP=V4_ONLY) after your PORT setting, and then restart the listener as shown in the following example.

-- listener.ora
      (ADDRESS = (PROTOCOL = TCP)(HOST = dbidg03)(PORT = 1521)(IP=V4_ONLY))

-- restart

$ lsnrctl stop LISTENER; lsnrctl start LISTENER

Now the messages “TNS-01189: The listener could not authenticate the user” in the listener.log should disappear.



Under normal circumstances, no changes should be required for NFS unless you had proto=tcp6 configured in your mount options. If so, your mount will not work anymore.

[root@dbidg02 etc]# mount /u99
mount.nfs: an incorrect mount option was specified

And you will see the following error in the /var/log/messages file.

Feb 14 10:26:48 dbidg02 kernel: NFS: server address does not match proto= option

Now you could either remove the proto option or change it to proto=tcp.

For NFS version 4 you have the following options:

proto=netid: The netid determines the transport that is used to communicate with the NFS server. Supported options are tcp, tcp6, and rdma. tcp6 uses IPv6 addresses and is only available if support for TI-RPC is built in. The other two use IPv4 addresses.

In my case, I have added the proto=tcp option to my NFS mount table in the /etc/fstab

#-- NFS mounts
dbidg03:/u99   /u99  nfs  vers=4.1,proto=tcp,rw,bg,hard,nointr,rsize=32768,wsize=32768,tcp,timeo=600    

-- And now the mount works perfectly again.

[root@dbidg02 etc]# mount /u99
[root@dbidg02 etc]#
[root@dbidg02 etc]# mount | grep nfs
sunrpc on /var/lib/nfs/rpc_pipefs type rpc_pipefs (rw,relatime)
nfsd on /proc/fs/nfsd type nfsd (rw,relatime)
dbidg03:/u99 on /u99 type nfs4 (rw,relatime,vers=4.1,rsize=32768,wsize=32768,namlen=255,hard,proto=tcp,port=0,timeo=600,retrans=2,sec=sys,clientaddr=,local_lock=none,addr=



Almost the same applies to rsyslogd. In case you have not specified "-6" in your syslogd options, you are fine. If you have, you need to either remove the option or replace it with "-4".

oracle@dbidg03:/etc/sysconfig/ [oms13c] rpm -qa  | grep rsyslog

-- from the doc
 -4  Causes rsyslogd to listen to IPv4 addresses only.  If neither -4 nor -6 is given, rsyslogd listens to all configured addresses of the system.
[root@dbidg03 sysconfig]# cat rsyslog
# Options for rsyslogd
# Syslogd options are deprecated since rsyslog v3.
# If you want to use them, switch to compatibility mode 2 by "-c 2"
# See rsyslogd(8) for more details

[root@dbidg03 sysconfig]# systemctl restart rsyslog
[root@dbidg03 sysconfig]#
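If you do want to pin rsyslogd to IPv4 explicitly rather than rely on the default, the sysconfig file would carry the option like this (a sketch, assuming the standard SYSLOGD_OPTIONS variable in /etc/sysconfig/rsyslog on RHEL/OL 7):

```shell
# /etc/sysconfig/rsyslog -- sketch: pin rsyslogd to IPv4 only
# (only needed if you previously forced "-6"; leaving SYSLOGD_OPTIONS
# empty lets rsyslogd listen on all configured addresses, the default)
SYSLOGD_OPTIONS="-4"
```

Remember to restart the service afterwards, exactly as shown above.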

There might be other tools on your system that require special attention after you have disabled IPv6.



The article OEL 7 – How to disable IPv6 on Oracle Linux 7 – Follow Up appeared first on Blog dbi services.

Webcast: "Ready or Not: Applying Secure Configuration to EBS"

Steven Chan - Fri, 2017-02-17 02:06

Oracle University has a wealth of free webcasts for Oracle E-Business Suite.  If you're looking for our latest updates on the EBS Security front, see:

  • Ready or Not: Applying Secure Configuration to Oracle E-Business Suite (26 minutes)

It's a new world - one where secure configuration is no longer optional and you must reduce your attack surface.  Eric Bing, Senior Director Product Development, shares that going forward, many Oracle E-Business Suite security features will now be turned on by default. To further assist you with deploying Oracle E-Business Suite securely, we are now providing a Secure Configuration Management console.  Under certain conditions, access to Oracle E-Business Suite will be limited until your Applications DBA or System Administrator corrects or acknowledges the errors and warnings in the console.  Come to this session to learn about the new Secure Configuration Management console and our guidelines for auditing, monitoring and securing your Oracle E-Business Suite environment and sensitive data. This material was presented at Oracle OpenWorld 2016.

Categories: APPS Blogs

Links for 2017-02-16 [del.icio.us]

Categories: DBA Blogs

Weekly Link Roundup – Feb 17, 2017

Complete IT Professional - Fri, 2017-02-17 00:39
Here’s a collection of interesting articles I’ve read this week. Articles I’ve Read Truncate 12c https://jonathanlewis.wordpress.com/2017/02/16/truncate-12c/ Jonathan Lewis writes an article about some improvements to the TRUNCATE statement in Oracle 12c, and how it relates to “on delete cascade”. There’s a small example and a good explanation of how it works.   Step by Step […]
Categories: Development

Analytics apps or embedded analytics

Dylan's BI Notes - Thu, 2017-02-16 21:02
An analytics app should be an application.   We should not see it as a report solution or just a way to view or download data. The best analytics app lets people forget that it is an “analytics” app. If you keep the end user in mind and drive the design of […]
Categories: BI & Warehousing

In Real Estate it's Location, Location, Location - In Startups it's all about Sales, Sales, Sales

Bradley Brown - Thu, 2017-02-16 18:32
Sure, you could argue that marketing is important or execution or support or product quality, but when it comes down to what you really need to focus on, it's all about sales.  You have to have a high quality product to sell, you certainly need to be able to support it, and it has to be scalable, but if you don't have customers, you won't have a business.

I've built a number of software as a service (SaaS) businesses from the ground up.  It's hard work!  One of my favorite best practices, like in the lean startup is getting to your MVP (minimum viable product) as quickly as you can.  It's great to hear prospects saying no to certain things and yes to others.  Once you have an MVP and your initial customers, you know exactly where to improve your product, what requires your support and ultimately how to sell to more customers as well as how to sell more to your existing customer base.

Drumfire Sales Search (www.drumfiress.com) solves the sales equation for you with process and people.  In this blog post, I've outlined 6 businesses that we've worked with and the approach we took from start to finish.  Generally speaking, our approach/process is outlined below.

Here's a short explainer video that walks through the process:

Step 1 is always the assessment.  It's important to build out the game plan, which often involves the following steps.  Your business might know and understand your value proposition.  You might know exactly who your ideal / target customer is too.  This assessment is key to your sales success.  Our typical steps to establish the game plan include:

1. Value proposition
2. Ideal/Targeted customer
3. Identify decision makers and method
4. Prospecting strategy
5. Go to market plan
6. Establish pricing
Tools are required to sell your product or service.  Once you know who you're selling to, you need to get your message down.  As Mark Twain said, it takes a lot of time to write a short letter, just as it takes a lot of time to get your elevator pitch down.  Summarizing what you do isn't easy.  How quickly can you summarize what you do?  In my days of owning an Oracle consulting (or professional services) firm, my early message was very complicated and confused my early prospects.  As time went on, we shortened our elevator pitch.  In the end, it was pretty simple - we were "Oracle Experts."  I would often say to the CEO of a company, "if you hear someone in your company mention Oracle and a challenge they are having, please call me."  That message worked.  Having the proper materials is important, as is having a CRM that allows you to track and monitor your sales activity.  Some of the tools we help you develop include:

1. Develop elevator speech
2. Marketing, messaging and presentations
3. Select and customize appropriate CRM
4. Identify individual sales tools
5. Sales training

At Drumfire, we have our own proprietary solution that will help seed your prospect base if your customers are businesses.  It's called NorthStar and it's evolved over the last 20 years.  Back in the early days of NorthStar, I used it to find companies who were running Oracle's web server.  Today it's used to find businesses in any business category and then to read through all of the websites pulling key information about competitors, specific terms, email addresses and more.  There will be more information on Northstar in the examples below.
Drumfire's origins revolve around the staffing industry.  You will have short and long term requirements around sales in your business.  You might need to establish regions, which we can help with.  Maybe you're taking your first products to market and you need to find those first team members that can sell your MVP.  Team makeups change over time.  Here are some of the steps we help you with around your team:

1. Determine long term hiring plan to include regions and levels
2. Identify candidate makeup
3. Job descriptions and comp plans 
4. Have quotas in place for every level
5. Go to market recruitment plan

Focus is your number one priority.  Scalable execution is key to your success.  Success can easily hide a lack of focus and execution.  In my Oracle consulting business, when we didn't know and understand our pipeline (in our early days), we got caught with a lack of backlog.  When the "cliff" hits your business, it can be a real surprise.  We help you understand the importance of having strong visibility into your pipeline.  We also feed your pipeline with prospects so you constantly have new prospects to call on.

1. Create pipeline report
2. Integrate and align sales objectives with corporate objectives
3. Present to potential clients
4. Negotiate
5. Deliver

As mentioned above, NorthStar is our software that finds businesses matching your target customer criteria.

Here are a couple of videos that explain NorthStar in a bit more detail:

Here's a specific example of NorthStar in action for a company with an irrigation solution like Nest:

And one last video, which explains it for an imaginary new pool product:

Let me explain the approach we take to help you find your customers.  I've provided examples for 6 different business segments:

1. Staffing Firms
    a. Healthcare
    b. Oil and Gas
2. Real Estate Infomercial
3. Cannabis Regulatory Compliance Business
4. Nest for Irrigation
5. Virtual Yoga Business
6. Rehab Marketing Firm

Sure, we have databases full of companies and email addresses, but we typically search the web "right now" to find your prospects and then we read through every single website looking for keywords, competitors, and more.  This takes time and resources.  It can take as little as a couple of days and as much as a month.  And...NorthStar continues to run regularly to find new prospects for you month after month.  You'll get access to your NorthStar dashboard and the full platform as you'll see below.

Staffing Firms

Let's say you own (or run sales/bus dev for) a staffing firm.  You probably know exactly what types of companies you're calling on every day.  How do you find/identify those companies?  Do you care about the industry they are in?  Revenue metrics?  Number of employees?  Most staffing firms specialize in a niche from IT to engineering to sales to you name it.  Let's look at 2 examples of companies that we used NorthStar to find potential clients for.  These prospects get fed right into the CRM and MAS systems.  What's most important to you?  Do you need a phone number?  An email address?  The head of HR?

Healthcare Staffing
How many companies in the US are based around healthcare?  What are the categories that they group themselves into?

We ran NorthStar looking for businesses in "healthcare." We find these businesses "now" (i.e. it's not a canned list of businesses - it changes every single day).  We harvest them by zip code.  Pulling the top 10% of the zip codes (which typically covers about 80% of the businesses), we found about 200,000 (195,670 to be exact) businesses around the US that identify themselves as being in healthcare.  They group themselves into a number of categories (many are grouped into multiple categories) as you see here:

CATEGORY: PROSPECTS
  • Doctor & Clinic: 24,575
  • Home Health Care Service & Supply: 13,820
  • General Practice Medicine;Doctor & Clinic;Neurology: 9,322
  • Doctor & Clinic;Neurology: 7,311
  • Car Insurance;Insurance;Life Insurance;Home Insurance: 5,554
  • Home Health Care Service & Supply;Doctor & Clinic: 4,876
  • Long-Term & Hospice Care: 4,826
  • Senior Service;Long-Term & Hospice Care: 2,624
  • Home Health Care Service & Supply;Long-Term & Hospice Care: 2,619
  • Home Health Care Service & Supply;Business Alarm Systems: 2,520

It's interesting to see that the top group is Doctors & Clinics - they make up just over 10% of the companies we found.  When you put this data onto a map, it's very interesting to see how your prospects are distributed across the US.  You might have preconceived notions about where your prospects are.  Are most of them in the states with the most people (NY, CA, IL), or are they not?  Here's the distribution by state - notice how low NY is!


Here's another look at the data on a map:

We are able to pull a lot of important information about each of your prospects, which includes:
  • Company Name
  • Full Address (Address, City, State, Zip)
  • Phone Number
  • Latitude and Longitude
  • Ratings and Reviews
  • Website

We use the website's URL to spider through their website to determine things such as:

  • Email Addresses on their website
  • Competitor Mentions
  • Specific Terms on their website
  • Social Media Platforms they are on (Facebook, Twitter, Pinterest, Google Plus, etc)
  • Is their website mobile enabled?
  • Do they have a "current" (HTML5) website?
  • Do they mention your company?
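The per-site checks listed above can be sketched in a few lines. This is purely a hypothetical illustration (the real NorthStar crawler is proprietary); the `scan_page` helper, the competitor list, and the search terms are all made up for the example:

```python
# Sketch of NorthStar-style per-page checks: emails, competitor mentions,
# specific terms, and a crude HTML5 test. Hypothetical helper, not the
# actual NorthStar code.
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+(?:\.[\w-]+)+")

def scan_page(html, competitors, terms):
    """Return the signals described above for one page of a prospect's site."""
    lower = html.lower()
    return {
        "emails": sorted(set(EMAIL_RE.findall(html))),
        "competitor_mentions": [c for c in competitors if c.lower() in lower],
        "term_hits": [t for t in terms if t.lower() in lower],
        "html5": "<!doctype html>" in lower,  # crude "current website" check
    }
```

For the revenue cycle management example below, a call would look like `scan_page(page_html, ["AcmeCo"], ["revenue cycle management", "back office"])`.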

There is SO much information that we can glean from a company's website!  Use your imagination and let NorthStar do the work!  This client asked us to get them a list of companies specializing in revenue cycle management.  That's an easy term to search for.  We added "back office" into our search terms too since that's sometimes used.  The billing category by itself provided just over 300 companies.  These search terms are even more effective at narrowing down your prospects!

Here's what their dashboard looks like:

Our clients (hopefully you're one of them) can drill into information any way you can dream of.  

Water, Oil and Gas Engineer Staffing

Another client asked us to find all of the power plants in the US that mention gas turbines, or specifically the M501J gas turbine, on their website.  That's getting pretty specific, isn't it?  If you want to know exactly who you should be calling on, based on talent you currently have available or on a demand surge, how else could you find this type of information?

Here's a look at their dashboard that we provided them access to:

You can see that Texas has the most power plants (over 2,800), Oklahoma is next with about 800, and then Colorado weighs in at 565.  We loaded the 22,083 power plants into their database (with over 2,400 email addresses and over 13,000 websites, of which over 2,600 were mobile or HTML5 websites and 1,700 had social media) and they were off to the races.  We found about 50 power plants that mentioned gas turbines, so they focused on those 50 first.  We only found 1 power plant that mentioned the M501J, but 1 is better than none!  Not every concept we come up with is going to pan out, but more pan out than not!

Real Estate Infomercial

The "Direct Response" (DR) world is an interesting world full of infomercials geared at getting to consumers, typically to buy their goods or services.  The real estate world has had a number of people who focused on the DR space - many of them "bad" (i.e. they just wanted your money, not to really help you).  A new entre into the DR space approached us about their business.  One side is the B2C (business to consumer) side, which infomercials and other solutions like StarStar are excellent at helping.  The other side of the business is that they plan to deliver their services through existing Realtors.  In fact, the Realtors could be customers.  Real estate agencies are registered businesses that we can identify with NorthStar.

We ran a test to see how many agents we would find in the US.  After processing about 25% of the zip codes (15,072 to be exact, out of about 45k zips), we found 225,379 companies.  177,979 of them have "Real Estate Agent" in the category.  Here are the top 10 categories:

CATEGORY: COUNT
  • Real Estate Agent: 71,837
  • Real Estate Agent;Commercial Property Management: 35,252
  • Real Estate Agent;Real Estate: 22,403
  • Real Estate: 6,501
  • Real Estate;Real Estate Agent: 4,169
  • Real Estate Agent;Property Management;Commercial Property Management: 3,741
  • Property Management;Commercial Property Management: 2,255
  • Real Estate Agent;Other Real Estate;Commercial Property Management: 2,100
  • Real Estate Agent;Property Management: 2,096
  • Real Estate Financing: 1,976

We figured that the Realtors of most interest to these guys are those who talk about REO, Fix and Flips, Short Sales, Investors, Buy Back, etc. on their website, so we searched for these terms.  Sure enough, 4,295 of those businesses mention these terms.  It's always important to be able to triage the businesses that you're going to call on first.  If you say they have to mention one of these terms AND they need to promote their social media, the list is reduced to 2,362 agents, of which only 1,925 have a mobile or current (HTML5) website.  Multiply that number by 4 (since we've only processed about 25% of the country) and they will end up with about 8k agents to call on.  That's nearly impossible without an autodialer like ConnectAndSell - after all, we have phone numbers for every single agent!
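The extrapolation in that last step is simple enough to sanity-check:

```python
# Sanity check on the agent extrapolation: ~25% of the country processed,
# so scale the qualifying-agent count by 4.
qualifying_agents = 1925          # mobile/HTML5 sites mentioning the terms
coverage_multiplier = 4           # only ~25% of zip codes processed so far
estimated_nationwide = qualifying_agents * coverage_multiplier
print(estimated_nationwide)       # 7700, i.e. "about 8k agents to call on"
```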

Here's what the distribution looks like across the US:

Cannabis Regulatory Compliance Business

One of my good friends is in this business.  They help the medical and recreational cannabis businesses keep current with their paperwork.  We searched 3 different categories (cannabis, alternative clinics, and marijuana) to find these businesses.  We then searched each site for terms such as cannabis and marijuana.  The most important piece of data for them at this point is an email address.  They also wanted to know how many shops we could find in the US, but they wanted to start in Colorado.  Overall we found 6,447 email addresses.  Limiting it down to Colorado, we came up with only 274 records.  They said that calling doesn't work (the workers are too stoned), so an email is critical for them.
Here's a breakdown of shops around the US:

Nest for Irrigation

One of the first clients to use NorthStar was a Denver-based company that has an irrigation (sprinkler system) timer like Nest (for the home thermostat).  They had a theory that California, Colorado, Arizona, and Illinois would have the most irrigation specialists (sprinkler guys).  Here's how it actually panned out:

So California, Florida and Texas are the biggest states.  Colorado, Arizona, Illinois, not so much.  Setting up sales people across the country would likely require at least one person in each of these states.

We also established a NorthStar scoring system from 1-16.  Points were established for their social media presence, mentioning their competitors, mentioning them, if they talk about "low voltage" and more.  Their dashboard allows them to move the dials to view the irrigation distribution across the country based on their NorthStar score.
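A NorthStar-style score could be sketched like this; the actual point weights and signal names are not described here, so the four equal 4-point signals below are assumptions made purely for illustration:

```python
# Hypothetical sketch of a NorthStar-style website score (max 16 points).
# The real weights are proprietary; equal 4-point weights are assumed.
SIGNALS = (
    "social_media",          # promotes Facebook/Twitter/etc.
    "mentions_competitor",   # names a known competitor
    "mentions_us",           # mentions the client's own brand
    "mentions_low_voltage",  # talks about "low voltage"
)

def northstar_score(site):
    """site: dict of boolean signals scraped from a prospect's website."""
    return sum(4 for signal in SIGNALS if site.get(signal))
```

With a function like this, "score of 8 or more" in the dashboard simply means at least two of the signals are present.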

Here's what the US looks like when we search out irrigation companies that have a NorthStar score of 8 or more that also mention "low voltage."  It looks like we just removed most of the west coast doesn't it!

If we reduce the minimum score to 1 and remove the low voltage requirement, the US is very well covered:

Virtual Yoga Business
As many of you know, prior to this venture, I was in the subscription and transactional video on demand business.  We had a number of clients in the fitness business.  Subscription businesses are recurring revenue businesses that keep on giving (i.e. generating revenue) once you get customers signed up.  In fact, if you have a $10/mo product and you add 10 new customers a day for a year, your ARR will be over $400k with a low churn rate.  That's powerful!

Yoga businesses are typically recurring revenue businesses - people pay month after month.  So why not create a subscription Yoga business that you can attend right from home?  Why build your own Yoga studio to do this?  Why not just put cameras in existing Yoga studios and allow them to post their live video sessions for anyone in the world (who has a subscription) to attend the class?
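That ARR claim checks out with simple arithmetic (a back-of-envelope sketch, assuming zero churn and full-month billing):

```python
# Back-of-envelope ARR: $10/mo product, 10 new customers a day for a year.
price_per_month = 10
new_customers_per_day = 10
subscribers_after_one_year = new_customers_per_day * 365   # 3,650 subscribers
arr = subscribers_after_one_year * price_per_month * 12    # annual recurring revenue
print(f"ARR after year one: ${arr:,}")                     # ARR after year one: $438,000
```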
Again, using NorthStar to find existing Yoga studios is easy enough!  Maybe they want to see if the studio already has a video or DVD for sale.  What else might they want to look for?  Hopefully you're getting the picture here.  NorthStar can help you triage your calling list to determine who to call and in what order. 

Rehab Marketing Firm

A marketing firm specializing in providing referrals for rehab centers asked us to find the "good guys" in this business.  Many "bad guys" enter this business and scam people for money.  It's always difficult to imagine that families are tricked in their worst hours of darkness.  It's easy enough to identify rehab centers around the US.  However, they decided that they know the top 10 firms in the country that they are going to focus on first, and then they will come back to us.  We appreciate a focused sales approach.

As I stated in the title, it's all about sales in the end.  The more sales you have, the more revenue you have and ultimately the more success you'll have.  In the services business, each person that's billable is additional revenue.  In the product business, each new sale is additional revenue.  Let's go get some business together!  Give us a call at 303.223.4304 or visit us on the web.  Or send us a note (through the website).  We would LOVE to work with you and be successful together!

Return blob from remote database

Tom Kyte - Thu, 2017-02-16 17:46
Hi, I have a question on how to get Blob from remote database with out using Temporary table and MVIEW. Can you please provide a way to deal with blob.
Categories: DBA Blogs

