Feed aggregator

Roles and Privileges

Tom Kyte - Thu, 2017-06-22 16:46
Hi, Good Day ! Today we have a structured approach to access control, based on use of schemas (to group together database objects), and assigning specific permissions to roles. 2 roles : - USER_ROLE = CONNECT + SELECT/INSERT/UPDATE/DELETE...
Categories: DBA Blogs

Unrolling loop speeds up program

Bobby Durrett's DBA Blog - Thu, 2017-06-22 15:55

This is a follow-up to my earlier post about the assembly language book that I am working through. I have struggled to speed up a program using something that the book recommends, unrolling a loop. I think I have finally found an example where unrolling a loop speeds up a program so I wanted to share it.

I am working on Chapter 17 Exercise 2 of the book, which asks you to write a program to find the longest common substring of two strings. I chose an inefficient and simple way to find the common substring and tried to speed the program up without changing the algorithm.

Here is the C version on GitHub: url

The core of the program is three loops. The outer loop tries each character in string 1 as the start of the substring. The middle loop tries each character in string 2 as the start of the substring. The inner loop advances through both strings until it finds the end of the common substring.
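The three loops can be sketched in Python; this is an illustrative reimplementation of the algorithm described above, not the book's C or assembly code:

```python
def longest_common_substring(s1, s2):
    """Brute-force longest common substring: try every start pair, extend the match."""
    best = ""
    for i in range(len(s1)):          # outer loop: start position in string 1
        for j in range(len(s2)):      # middle loop: start position in string 2
            k = 0                     # inner loop: advance while characters match
            while i + k < len(s1) and j + k < len(s2) and s1[i + k] == s2[j + k]:
                k += 1
            if k > len(best):
                best = s1[i:i + k]
    return best

print(longest_common_substring("interstellar", "stellarator"))  # -> stellar
```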

The C version ran in 27.2 seconds.

I built an assembly version that uses registers for most of the variables and it ran in about 11.7 seconds. It has the same three loops. Assembly register version: url

I tried to improve on the 11.7 seconds by unrolling each of the three loops. Unrolling the outer and inner loops resulted in no improvement in runtime. I was about to give up but finally decided to try unrolling the middle loop and it caused the program to run in 10.2 seconds. Very cool. Assembly unrolled middle loop version: url
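Unrolling the middle loop means doing two (or more) middle-loop iterations per pass and handling any leftover iteration separately. A Python sketch of the transformation (purely illustrative: unrolling only pays off in compiled code, not in an interpreted language):

```python
def middle_unrolled(s1, s2):
    """Brute-force search with the middle loop unrolled by a factor of 2.
    Returns the length of the longest common substring."""
    def extend(i, j):
        k = 0
        while i + k < len(s1) and j + k < len(s2) and s1[i + k] == s2[j + k]:
            k += 1
        return k

    best = 0
    for i in range(len(s1)):
        j = 0
        while j + 1 < len(s2):        # two middle-loop iterations per pass
            best = max(best, extend(i, j), extend(i, j + 1))
            j += 2
        if j < len(s2):               # leftover iteration when len(s2) is odd
            best = max(best, extend(i, j))
    return best

print(middle_unrolled("interstellar", "stellarator"))  # -> 7
```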

I had to figure out how to use %rep and %assign, and how to build a label from a nasm variable, such as .skipif %+ i.

Kind of fun. I realize that this is off topic for my “DBA Blog” but it is something I’m playing with so I thought I would throw it out there. It doesn’t hurt a DBA to know some low-level computer science, even if you are not using it directly in your job. Anyway, it was a bit of a struggle to come up with an example that was faster with the loop unrolling and I think that I have found one.


Categories: DBA Blogs

Getting Smarter in Renting with Tableau 10

Rittman Mead Consulting - Thu, 2017-06-22 03:26

Not long ago a friend of mine spent a significant amount of time trying to find a flat to rent. And according to him it wasn't an easy task. It took him considerable time and effort to find something that was big enough (but not too big), not too far from his workplace, had the required features and was affordable at the same time. As a specialist in data analysis, I prefer to think about this task as a data discovery one (yes, when you have a hammer everything looks like a nail). And I decided to see if a data analysis tool could help me understand the rental market better. I'm sure you've already read the name of this post, so I can't pretend I'm keeping up the intrigue. This tool is Tableau 10.3.

The Data

The friend I mentioned before was looking for a flat in Moscow, but I think that market is completely unknown to most readers. And I'd also have to spend half of my time translating everything into English, so for this exercise I took Brighton and Hove data from http://rightmove.co.uk and got a nice JSON Lines file. JSON Lines files are basically the same JSON we all know, but every file holds multiple JSON documents delimited by newlines.

{json line #1}
{json line #2}
{json line #n}

That could have been a real problem, but luckily Tableau introduced JSON support in Tableau 10.1, which means I don't have to transform my data into a set of flat tables. Thanks to the Tableau developers, we may simply open JSON Lines files without any transformation.
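If you want to inspect such a file outside Tableau, the format is trivial to parse; a minimal Python sketch (field names follow the property JSON in this post):

```python
import json

# Two sample lines in JSON Lines format: one independent JSON document per line.
jsonl = ('{"property_name": "2 bedroom apartment to rent", "property_price_month": "£1,250 pcm"}\n'
         '{"property_name": "3 bedroom bungalow to rent", "property_price_month": "£1,100 pcm"}\n')

properties = [json.loads(line) for line in jsonl.splitlines() if line.strip()]
print(len(properties))                        # -> 2
print(properties[0]["property_price_month"])  # -> £1,250 pcm
```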

A typical property description looks like this:


It has a few major blocks:

  • Property name - 2 bedroom apartment to rent
  • Monthly price - £1,250
  • Description tab:
    • Letting information - this part is more or less standard and has only a small number of possible values. This part has Property name: Property value structure ('Date available':'Now').
    • Key features - this part is an unformalized set of features. Every property may have its own unique features. And it is not a key-value list like Letting information, but a simple list of features.
    • Full description - simply a block of unstructured text.
  • Nearest stations - shows the three nearest train stations (there could be underground stations too, if Brighton had any).
  • School checker - shows the 10 closest primary and 10 closest secondary schools. For this, I found a kind of API which gave me a detailed description of every school.

And finally, the JSON for one property looks like the following. In reality it is one line, but I formatted it into a human-readable form. I also deleted most of the schools' info, as it is more huge than important.

Property JSON (heavily truncated; "..." marks omitted parts)

   {
      "property_price_week":"£254 pw",
      ...
            "station_dist":"(0.4 mi)",
            "station_dist":"(1.2 mi)",
            "station_dist":"(1.7 mi)",
      ...
      "letting_type":"Long term",
      ...
               "distance":"0.6 miles",
               "name":"Bishop Luffa School, Chichester",
      ...
      "agent":"On The Move, South",
      "full_description":"<p itemprop=\"description\">We are delighted to bring to market, this fabulous semi detached bungalow ... </p>",
      ...
               "distance":"0.3 miles",
               "name":"Fishbourne CofE Primary School",
      ...
      "property_address":[ "Mill Close, Chichester, West Sussex, PO19"],
      "property_name":"3 bedroom bungalow to rent",
      "date_added":"08 June 2017 (18 hours ago)",
      "property_price_month":"£1,100 pcm",
      ...
   }

The full version is 6,391 lines; I warned you. My dataset is relatively small: 1,114 such records, 117 MB in total.

Just a few things I'd like to highlight. Letting information has only a small number of fixed options, so I managed to parse them into fields like furnish, letting_type, etc. The Key Features list became just an array: we have thousands of various features here, and I can't put them into separate fields. The Nearest stations list became an array of name-value pairs. My first version of the scraper put them into a key-value list, like this:

      "Fishbourne": "(0.4 mi)",
      "Chichester": "(1.2 mi)",
      "Bosham": "(1.7 mi)"

but this didn't work as intended. I got around one hundred measures with names like Fishbourne, Chichester, Bosham, etc. Not what I need. That approach could work well if I had only a small number of important POIs (airports, for example) and wanted to know the distances to those points. So I changed it to this, and it worked well:

         "station_dist":"(0.4 mi)"
         "station_dist":"(1.2 mi)"
         "station_dist":"(1.7 mi)"
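In scraper terms, the change is from a dict keyed by station name to a list of uniform records; a small Python sketch (the station_name key is my assumption; station_dist comes from the JSON above):

```python
# First attempt: station names as keys -> Tableau turns every name into a measure.
stations_v1 = {"Fishbourne": "(0.4 mi)", "Chichester": "(1.2 mi)", "Bosham": "(1.7 mi)"}

# Second attempt: a list of uniform records -> a single station_dist field in Tableau.
stations_v2 = [{"station_name": name, "station_dist": dist}
               for name, dist in stations_v1.items()]

print(stations_v2[0])  # -> {'station_name': 'Fishbourne', 'station_dist': '(0.4 mi)'}
```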
Connect to the Data

When I started this study my knowledge of the UK property rent market was close to this:


And it's possible or even likely that some of my conclusions may be obvious for anyone who is deep in the topic. In this blog, I show how a complete newbie (me) can use Tableau and become less ignorant.

So my very first task was to understand what kind of objects are available for rent, what are their prices and so on. That is the typical task for any new subject area.

As I said before, Tableau 10 can work with JSON files natively, but the question was whether it could handle a JSON as complex as mine. I started a new project and opened my JSON file.


I expected that I would have to simplify it somehow. But in reality, after a few seconds of waiting Tableau displayed the full structure of my JSON, and all I had to do was select the branches I needed.


After a few more seconds I got a normal Tableau data source.


And this is how it looked in analysis mode:


First Look at the Data

OK, let's get started. The first question is obvious: "What types of property are available for rent?". Well, it seems that the name ('2 bedroom apartment to rent') is what I need. I created a table report for this field.


Well, it gives me a first impression of what objects are offered and what my next step should be. First of all, the names end with "to rent". This just makes the strings longer without adding any value. The word "bedroom" also doesn't look important. Ideally, I'd like to parse these strings into two fields: the number of bedrooms and the property type. The most obvious action is to try the Split function.


Well, it partially worked. The function is smart enough to remove the 'to rent' part, but beyond that it gave me nothing. On other datasets (other cities) it gave much better results, but it still wasn't able to read my mind and do what I wanted:


But I spent 15 seconds on this and lost nothing, and if it had worked I'd have saved a lot of time. Anyway, I'm too old to believe in magic, and this almost didn't hurt my feelings.

Some people, when confronted with a problem, think “I know, I'll use regular expressions.” Now they have two problems.

Yes, this string literally asks for some regular-expression wizardry.


I can easily use REGEXP_EXTRACT_NTH and get what I want. Group 1 is the number of bedrooms and Group 3 is the property type. Groups 2 and 4 are just constant words.


Explanation of my regular expression: I can describe most of the names as "digit bedroom property type to rent", and the rest as "property type to rent". So digit and bedroom are optional, while property type and to rent are mandatory. The expression is easy and obvious: ([0-9]*)( bedroom )*(.*)( to rent)
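The same pattern can be checked outside Tableau with Python's re module, which handles this pattern and its group numbering the same way:

```python
import re

# Group 1 = number of bedrooms (may be empty), group 3 = property type.
pattern = re.compile(r"([0-9]*)( bedroom )*(.*)( to rent)")

for name in ["2 bedroom apartment to rent", "3 bedroom bungalow to rent", "Garage to rent"]:
    m = pattern.match(name)
    print(repr(m.group(1)), "|", m.group(3))
# -> '2' | apartment
# -> '3' | bungalow
# -> '' | Garage
```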

Regular expressions are one of my favourite hammers, and they helped me a lot in this analysis. After all these manipulations, I got a much better view of the data (I skipped some obvious steps, like creating a crosstab or a count-distinct measure, to save space for something more interesting).


And while this result looks pretty simple, it gives me the first insight I can't get simply by browsing the site. The most-offered properties are 1 and 2 bedroom ones, especially flats and apartments. And if a family needs something bigger, with 4 or 5 bedrooms, well, I wish them good luck; there are not many offers to choose from. Also, if we talk about living property only, we should filter out things like GARAGE, PARKING or LAND.


I think both charts work pretty well. The first one presents a nice view of how flats and apartments outnumber all other types, and the second one gives a much better understanding of how many 2 bedroom properties are offered compared to all others.

And while I'm not a big fan of fancy visualisations, if you need something less formal and more eye-catching, try a Bubbles chart. It's not something I'd recommend for analysis, but it may work well for a presentation. Every bubble represents a particular property type, colour shows the number of bedrooms, and size shows the number of properties.


Going Deeper

The next obvious question is the price. How much do different properties cost? Is any particular one more expensive than average or less? What influences the price?

As a baseline, I'd like to know the average property price. And I obviously don't want just one city-wide figure; that would be meaningless. Let's start with a bar chart and see the range of prices.


Well, we have a lot of options. A flat share costs less than £700, or we may choose a barn for more than £3600. Again, a very simple result, but one I can't get directly from the site.

The next obvious question is how the number of bedrooms affects the price. Does the price skyrocket with every additional bedroom, or do more bedrooms mean smaller rooms and a price that doesn't increase too fast?


Well, this chart gives me the answer, but it looks bad. Mostly because many property types don't have enough variance in room number. Studio flats have only one bedroom by definition, and the only converted barn has 7 bedrooms. I'd like to remove types which don't have at least 3 options and see how the price changes. For this, I created a new computed field using the FIXED keyword. It counts the number of bedroom options by property type.


And then I used it in the filter 'Bedroom # variance' at least 3. Now I have a much cleaner view, and I can see that more bedrooms typically mean a significantly higher price, with a few exceptions. But these are not actual exceptions, just an artefact of a small dataset. I can say that an increase in the number of bedrooms certainly means a significant increase in price. And one more insight: going above 7 bedrooms may actually double the price.
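Outside Tableau, the FIXED calculation plus this filter boils down to a group-by over property types; a rough Python equivalent (sample data invented for illustration):

```python
from collections import defaultdict

# (property_type, bedrooms) pairs, one per listing -- invented sample data.
listings = [("flat", 1), ("flat", 2), ("flat", 3), ("studio", 1),
            ("house", 2), ("house", 3), ("house", 4), ("barn", 7)]

# Roughly { FIXED [Property type] : COUNTD([Bedrooms]) } in Tableau terms.
options = defaultdict(set)
for ptype, beds in listings:
    options[ptype].add(beds)

# Keep only types with at least 3 distinct bedroom counts.
kept = {t for t, beds in options.items() if len(beds) >= 3}
print(sorted(kept))  # -> ['flat', 'house']
```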


Averages are good, but they hide important information about how prices are distributed. For example, six properties priced at £1K and one at £200 give an average of £885, and looking at the average alone may make you think that with £900 you may choose one of 7 options. It's very easy to build a chart to check this. Just create a new calculation called Bins and use it in a chart.


With £100 bins I got the following chart. It shows how many properties have a price falling in a particular range. For example, the £1000 bin shows the number of properties priced £1000-£1100.
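The Bins calculation itself is just integer division on the price; a quick Python sketch:

```python
from collections import Counter

def price_bin(price, width=100):
    """Map a price to the lower edge of its £100-wide bin."""
    return (price // width) * width

prices = [650, 1020, 1050, 1180, 1195, 1420]     # invented sample prices
histogram = Counter(price_bin(p) for p in prices)
print(sorted(histogram.items()))  # -> [(600, 1), (1000, 2), (1100, 2), (1400, 1)]
```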


The distribution looks more or less as expected, but the most interesting thing here is that the £1000-£1100 interval seems to be very unpopular. Why? Let's add the number of bedrooms to this chart.


£1000 is too expensive for 1 bedroom flats and studios, but too cheap for two bedrooms. Simple. What else can we do here before moving on? Converting this chart to a running total gives a cool view.


What can this chart tell us? For example, if we look at the orange line (2 bedrooms), we find that with £1200 we may choose among 277 of 624 properties, and with a £1400 budget we have 486 of 624. A further £200 increase in budget won't significantly increase the number of possibilities: while the change from £1200 to £1400 almost doubled the number of options, the next £200 gives only 63 new ones. I don't have a ready-to-use insight here, but I got a way to estimate a budget for a particular type of property: with budget £X I will be able to choose one of N properties.

Why It Costs What It Costs

OK, now I know a lot of statistics about prices, and my next question is about the factors affecting the price. I'd like to understand whether a particular property is worth what it costs. Of course, I won't be able to determine the exact price, but even hints may be useful.

The first hypothesis I want to check is whether a nearby train station raises the price or doesn't matter at all. I made a chart very similar to the previous one, and it seems that the Pareto principle works perfectly here: 80% of properties are closer than 20% of the maximum distance to a station.


But this chart doesn't say anything about the price; it just shows how densely train stations are placed. I'd say that most of the properties have a station within 10-15 minutes' walk, and therefore this should not significantly affect the price. My next chart is a scatter plot of price and distance. Every point is a property, and its coordinates are determined by its price and distance to the nearest station. Colour shows the number of bedrooms.


I'd say this chart shows no clear correlation between price and distance, and a more classical line chart confirms that.


The maximum price slightly decreases with distance; the minimum price, on the contrary, increases. The average price is more or less constant. I think the hypothesis is busted: there is no clear correlation between the distance a tenant has to walk to a station and the price he has to pay. If you want to rent something and the landlord says the price is high because of a nearby train station, tell him that there are stations all around and he should find something more interesting.

What about furnishings? Is it cheaper to get an unfurnished property, or will a landlord be happy to meet someone who shares his taste?


An unfurnished property is definitely cheaper. And it's interesting that in some cases partly furnished is even cheaper than completely unfurnished. But at least for furnished/unfurnished we can see a clear correlation. When you see a furnished property for the price of an unfurnished one, it may be a good pennyworth.

Another thing I'd like to check: can we expect a lower price for a property not available immediately? Or, on the contrary, is the best price offered for already unoccupied properties?

As always, let's start with a general picture. What is the average time to availability by property type?


For the most popular types it is about one month, and if you have a house you typically publish it two or three months in advance. And what about the price? Here is one more chart that I like in Tableau. In a nutshell, it is a normal line chart showing the average price by days until a property becomes available, but the thickness of the line shows the number of properties at each point. So I can see not only the price but its reliability too. A thick line means it was formed by many properties; a thin line may be formed by a few properties and move up or down significantly when something changes. It would be very interesting to get historical data and see how long properties stay free or how long it takes before the price is reduced, but unfortunately I don't have this data.


And looking at this chart, I'd say that there is no statistically significant dependency between price and availability date. Renting a property available in the distant future won't save you money* (*=statistically).

And the last thing I'd like to investigate is the Key features. What do landlords put as the key features of their properties? How do they affect the price?

The list of popular Key features surprised me.


'Unfurnished' looks good to me; it is a really significant part of the deal. But 'Brighton'? For properties in Brighton? '1 Bedroom'? How many bedrooms can a '1 bedroom flat to rent' have? Oh, there is a key feature saying '1 bedroom'; now I know. But jokes aside, I had to do a lot of cleaning on this data before I could use it. There are six ways to write 'Modern kitchen'. Make everything upper case, remove quotes, strip spaces and tabs, remove noisy features like 'stylish 1 bedroom apartment', and so on. After this, I got a slightly better list with approximately 3500 features instead of 4500. Note how all the variants of 'GAS CENTRAL HEATING' are now combined into the single most popular feature. But there are still too many features; I'm sure there should be no more than a hundred of them. Even in this screenshot you may see both 'Unfurnished' and 'Unfurnished property' features.


When I need to visualise this many points, bar charts or tables won't work well. My weapon of choice is the scatter plot. Every point is a particular feature, the axes are its minimum and average prices, size is determined by the number of properties declaring this feature, and colour is the maximum price. So if a feature is located high on the plot, it is on average expensive to have; if it is at the same time located close to the left side, even cheap properties may have it. For example, if you want a swimming pool, be ready to pay at least £3000, and £7000 on average. And the minimum price for a tumble dryer is £3250, but the average is £3965. The cheapest property with a dryer is more expensive than the cheapest with a pool, but on average pools are more expensive. That is how this chart works.


The problems with this chart are obvious: it is littered with unique features. Only one property has 4 acres (the point in the top right corner), and actually not many swimming pools are available for rent in Brighton. I filtered it by "# of properties > 25", and here is how prices for the most popular features are distributed.


A central location will cost you at least £100, and £1195 on average, while for a great location be ready to pay at least £445, and £1013 on average. A great location seems to be less valuable than a central one.

And now I can see how a particular feature impacts prices. For example, 'GAS HEATING'. I made a set with all the variants of heating I could find ('GAS CENTRAL HEATING', 'GAS HEAT' and so on). Now I can analyse how this feature impacts properties, and here is how it impacts the price of flats. Blue circles are properties with gas heating and orange are without.


Very interesting, in my opinion. The minimum price of properties with gas heating (blue circles) is higher than without, which is expected. But the average price of properties without gas heating is higher.

And here are kitchen appliances. For 1 bedroom flats, they increase both minimum and average prices significantly. But for bigger flats, the minimum price with appliances is higher while the average price is lower. Possibly this option is important for relatively cheap properties, but its weight is not that big for bigger ones.




Categories: BI & Warehousing

Webcast: Testing Oracle E-Business Suite Best Practices

Steven Chan - Thu, 2017-06-22 02:00

Oracle University has a large collection of free recorded webcasts that EBS system administrators may find useful.  Here's a great webcast on testing E-Business Suite:

Prashanti Madireddi, Senior Director of Product Development, provides an overview of how the Quality Assurance team tests Oracle E-Business Suite. This session covers the main areas you should consider during functional testing, approaches for new-feature and regression testing, how to reduce test script generation and execution time, experiences capturing and presenting metrics to showcase the ROI of the testing investment, leveraging automation for testing Oracle E-Business Suite applications, and more. This material was presented at Oracle OpenWorld 2015.


Categories: APPS Blogs


Tom Kyte - Wed, 2017-06-21 22:26
Below is the the employee table, EID NAME SALARY COMM_PCT 100 Steven 24000 101 Neena 17000 0.6 102 Lex 17000 0.8 145 John 14000 0.4 If I do the below select to employee table I get the output as 3. SELECT DENSE_RANK(17000,0.6) W...
Categories: DBA Blogs

SQL execution plan - access predicates determination

Tom Kyte - Wed, 2017-06-21 22:26
Hello Tom, I know Tom has retired. But glad that a group of Oracle expertise help to keep operation of AskTom. It is indeed very helpful. I was facing a SQL plan issue in an SAP/Oracle Environment. The SQL is: <code> SELECT DISTINCT "M...
Categories: DBA Blogs

The VERSION column - An unsung treasure

Anthony Shorten - Wed, 2017-06-21 20:58

If you use an Oracle Utilities Application Framework based product you will notice the column VERSION exists on all objects in the product. There is a very important reason that this column exists on the tables.

One of the common scenarios in an online system is a problem called the lost update problem. Let me explain: say we have two users (there can be more), User A and User B.

  • User A reads Object A to edit it.
  • User B reads Object A as well to edit it at the same time.
  • User B saves the Object changes first.
  • User A saves the Object changes.

Now, without protection, the changes that User B made would be overridden by User A's changes. We have lost User B's changes. This is the lost update problem in a nutshell.

Now using the VERSION column changes the above scenario:

  • When User A and User B read the object, the current value of VERSION is noted.
  • Whenever the object is updated, the value of VERSION is checked. If it is the same as the value of VERSION when the record was read, the update proceeds and VERSION is incremented as part of the update.
  • If the value of VERSION does not match, the product issues a "Concurrency Error" and asks the user to retry the transaction (after reloading the changed object).

In our scenario, User A would receive the message because the value of VERSION has been incremented, and therefore differs from the value that user originally read.

VERSION is a standard column on all objects in the system and applies no matter what channel (online, web services or batch) updates the object.
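The pattern described above is classic optimistic locking, and it can be sketched against any database; a minimal Python/sqlite3 illustration (table and column names are mine, not the product's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE obj (id INTEGER PRIMARY KEY, payload TEXT, version INTEGER)")
conn.execute("INSERT INTO obj VALUES (1, 'original', 1)")

def save(conn, obj_id, new_payload, version_read):
    """Update only if VERSION is unchanged since the read; bump it atomically."""
    cur = conn.execute(
        "UPDATE obj SET payload = ?, version = version + 1 "
        "WHERE id = ? AND version = ?",
        (new_payload, obj_id, version_read))
    if cur.rowcount == 0:
        raise RuntimeError("Concurrency Error: reload the object and retry")

# Users A and B both read the object at version 1.
save(conn, 1, "B's change", 1)      # User B saves first -> version becomes 2
try:
    save(conn, 1, "A's change", 1)  # User A still holds version 1 -> rejected
except RuntimeError as e:
    print(e)  # -> Concurrency Error: reload the object and retry
```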

EM13cR2 Installation Failing on BI Publisher Configuration

Pythian Group - Wed, 2017-06-21 16:07

This is going to be a short post, as there wasn't much information on this type of failure when searching for it. Also, the IPs have been changed for security reasons.

I was working on an Oracle EM13cR2 installation, and when the installation reached 78%, it failed on the BI Publisher configuration. First, I looked at the CfmLogger log:

INFO: oracle.sysman.top.oms:WLST offline debugging is in the file: /u01/app/oracle/middleware/cfgtoollogs/bip/bipca_20170613152059.log.wlst_20170613152122.trc
INFO: oracle.sysman.top.oms:Securing BI Publisher to work with Enterprise Manager
INFO: oracle.sysman.top.oms:Locking Enterprise Manager ...
INFO: oracle.sysman.top.oms:OMS Console is locked. Access the console over HTTPS ports.
INFO: oracle.sysman.top.oms:BI Publisher is locked. Access BI Publisher over HTTPS ports.
INFO: oracle.sysman.top.oms:Restart OMS.
INFO: oracle.sysman.top.oms:Restarting Enterprise Manager
INFO: oracle.sysman.top.oms:Stopping Enterprise Manager, this can take some time ...
INFO: oracle.sysman.top.oms:Starting BI Publisher ...
INFO: oracle.sysman.top.oms:Failed to start Enterprise Manager. Diagnostic code 1.
INFO: oracle.sysman.top.oms:See log at /u01/app/oracle/middleware/cfgtoollogs/bip/bipca_20170613152059.log for details.
INFO: oracle.sysman.top.oms:Error extending domain
INFO: oracle.sysman.top.oms:See log at /u01/app/oracle/middleware/cfgtoollogs/bip/bipca_20170613152059.log for details.
INFO: oracle.sysman.top.oms:java.lang.Exception: See log at /u01/app/oracle/middleware/cfgtoollogs/bip/bipca_20170613152059.log for details.
INFO: oracle.sysman.top.oms: at oracle.sysman.sdkImpl.core.ip.model.bipca.BIPCA.extendDomainWithBIP(BIPCA.java:3401)
INFO: oracle.sysman.top.oms: at oracle.sysman.sdkImpl.core.ip.model.bipca.BIPCA.main(BIPCA.java:1973)

This led me to the bipca.log, which referenced the details below (and really didn't tell me much):

[2017-05-09T21:55:17.585-07:00] [sysman] [ERROR] [] [oracle.sysman.bipca] [host: OracleEntMgr] [nwaddr:] [tid: 1] [userId: oracle] [ecid: a4412d2a-def5-49d8-ac27-1046c7243de9-00000001,0] [SRC_CLASS: oracle.sysman.sdkImpl.core.ip.model.bipca.BIPCA] [SRC_METHOD: handleFatalError] Error extending domain
[2017-05-09T21:55:17.586-07:00] [sysman] [ERROR] [] [oracle.sysman.bipca] [host: OracleEntMgr] [nwaddr:] [tid: 1] [userId: oracle] [ecid: a4412d2a-def5-49d8-ac27-1046c7243de9-00000001,0] [SRC_CLASS: oracle.sysman.sdkImpl.core.ip.model.bipca.BIPCA] [SRC_METHOD: handleFatalError] Fatal error:[[
java.lang.Exception: See log at /u01/app/oracle/middleware/cfgtoollogs/bip/bipca_20170509214828.log for details.
at oracle.sysman.sdkImpl.core.ip.model.bipca.BIPCA.extendDomainWithBIP(BIPCA.java:3401)
at oracle.sysman.sdkImpl.core.ip.model.bipca.BIPCA.main(BIPCA.java:1973)

A review of the bipca trace file didn’t tell me much about why the configuration was failing:

2017-05-24 09:26:51,250 FINE [1] com.oracle.cie.domain.jdbc.DatasourceXBeanAspectHelper - Selected DB vendor: Oracle
2017-05-24 09:26:51,250 FINE [1] com.oracle.cie.domain.jdbc.DatasourceXBeanAspectHelper - adding normal datasource: emgc-mgmt_view-pool
2017-05-24 09:26:51,250 FINE [1] com.oracle.cie.domain.jdbc.DatasourceXBeanAspectHelper - datasource: emgc-mgmt_view-pool component name: null
2017-05-24 09:26:51,251 FINE [1] com.oracle.cie.domain.jdbc.DatasourceXBeanAspectHelper - Found normal datasource: emgc-sysman-pool
2017-05-24 09:26:51,253 FINE [1] com.oracle.cie.domain.jdbc.DatasourceXBeanAspectHelper - Decomposing url: jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=dbhost)(PORT=1521)))(CONNECT_DATA=(SID=emrep))) for driver: oracle.jdbc.OracleDriver
2017-05-24 09:26:51,253 FINE [1] com.oracle.cie.domain.jdbc.DatasourceXBeanAspectHelper - There is an unsupported jdbc URL or JDBC driver is detected. See Stack trace:
at com.oracle.cie.domain.jdbc.DatasourceXBeanAspectHelper.decomposeURL(DatasourceXBeanAspectHelper.java:316)
at java.lang.reflect.Method.invoke(Method.java:606)
at weblogic.WLST.main(WLST.java:29)
2017-05-24 09:26:51,253 FINE [1] com.oracle.cie.domain.jdbc.DatasourceXBeanAspectHelper - Found normal datasource: jvmd-ds
2017-05-24 09:26:51,255 FINE [1] com.oracle.cie.domain.jdbc.DatasourceXBeanAspectHelper - Decomposing url: jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=dbhost)(PORT=1521)))(CONNECT_DATA=(SID=emrep))) for driver: oracle.jdbc.OracleDriver
2017-05-24 09:26:51,255 FINE [1] com.oracle.cie.domain.jdbc.DatasourceXBeanAspectHelper - There is an unsupported jdbc URL or JDBC driver is detected. See Stack trace:
at com.oracle.cie.domain.jdbc.DatasourceXBeanAspectHelper.decomposeURL(DatasourceXBeanAspectHelper.java:316)
at org.python.pycode._pyx95.updateDomain$21(/tmp/WLSTOfflineIni822569357732826272.py:103)
at org.python.pycode._pyx95.call_function(/tmp/WLSTOfflineIni822569357732826272.py)
at org.python.core.PyTableCode.call(Unknown Source)
at org.python.core.PyTableCode.call(Unknown Source)
at org.python.core.PyFunction.__call__(Unknown Source)

After reading for a while and working back and forth with Oracle, we identified the problem: the host wasn't resolving via nslookup. We were able to ping it, but nslookup queries a DNS nameserver to look up the host's IP address, and that lookup was failing, so what we had to do was fix this:

[oracle@dbhost ~]# nslookup dbhost

Name: dbhost

Once we fixed this, we were able to install EM13cR2 successfully.

Note: This was originally published on rene-ace.com

Categories: DBA Blogs

Securing the Oracle Cloud

Oracle Security Team - Wed, 2017-06-21 15:25

Technology safeguards, fewer risks, and unparalleled security motivate CIOs to embrace cloud computing.

If one thing is constant in the IT world, it's change. Consider the age-old dilemma of security versus innovation. Just a few years ago, concerns about data security and privacy prevented some organizations from adopting cloud-based business models. Today, many of these concerns have been alleviated. IT leaders are migrating their applications and data to the cloud in order to benefit from security features offered by some cloud providers. The key is to choose the right technology—one that is designed to protect users, enhance safeguarding of data, and better address requirements under privacy laws. Find out why millions of users rely on advanced and complete cloud services to transform fundamental business processes more quickly and confidently than ever before.

The Evolving Security Landscape

Mitigating the Risk of Data Loss with Cloud Technology

The IT security practices of many organizations that manage their own systems may not be strong enough to resist complex threats from malware, phishing schemes, and advanced persistent threats unleashed by malicious users, cybercriminal organizations, and state actors. The perimeter-based security controls typically implemented by organizations that manage their own security (firewalls, intrusion detection systems, and antivirus software packages) are arguably no longer sufficient to prevent these threats.

It's time to look further. It's time to look to the cloud. Thousands of organizations and millions of users obtain a better security position using a tier 1 public cloud provider than they can obtain in their own data centers. A full 78 percent of businesses surveyed say the cloud can improve both their security and their agility.5 Consider the facts: Most of today's security budgets are used to protect the network, with less than a third used to protect data and intellectual property that resides inside the organization.6 Network security is important, but it's not enough.

Building Oracle's Defense-in-Depth Strategy

Oracle Cloud is built around multiple layers of security and multiple levels of defense throughout the technology stack. Redundant controls provide exceptional resiliency, so if a vulnerability is discovered and exploited in one layer, the unauthorized user will be confronted with another security control in the next layer.

But having some of the world's best security technology is only part of the story. Oracle aligns people, processes, and technology to offer an integrated defense-in-depth platform:

  • Preventive controls designed to mitigate unauthorized access to sensitive systems and data
  • Detective controls designed to reveal unauthorized system and data changes through auditing, monitoring, and reporting
  • Administrative measures to address security policies, practices, and procedures

Gaining an Edge with Cloud Security

In the Digital Age, companies depend on their information systems to connect with customers, sell products, operate equipment, maintain inventory, and carry out a wide range of other business processes. If your data is compromised, IT assets quickly become liabilities. A 2016 Ponemon Institute study found that the average cost of a data breach continues to rise each year, with each lost or stolen record that contains confidential information representing US$158 in costs or penalties.8 In response, more and more organizations are transitioning their information systems to the cloud to achieve better security for sensitive data and critical business processes.

Security used to be an inhibitor to moving to the cloud. Now it's an enabler to get you where you need to go. Oracle helps you embrace the cloud quickly, and with confidence.

Learn more about Oracle security cloud services, read the paper "Oracle Infrastructure and Platform Cloud Services Security", and try Oracle Cloud today.

1 QuinStreet Enterprise, "2015 Security Outlook: Meeting Today's Evolving Cyber-Threats."
2 Ponemon Institute, "The Cost of Malware Containment," 2015.
3 Leviathan Security Group, "Quantifying the Cost of Cloud Security."
4 Crowd Research Partners, "Cloud Security: 2016 Spotlight Report."
5 Coleman Parkes Research, "A Secure Path to Digital Transformation."
6 CSO Market Pulse, "An Inside-Out Approach to Enterprise Security."
7 Jeff Kauflin, "The Fast-Growing Job with a Huge Skills Gap: Cyber Security," Forbes, March 16, 2017.
8 IBM Security, "2016 Ponemon Cost of Data Breach Study."

(bash): The most useless commands (5)

Dietrich Schroff - Wed, 2017-06-21 15:17
In a comment, Albert Godfrind pointed out that the commands I was talking about, like
  1. rev
  2. sl
  3. cowsay
are not really bash commands (in the sense that they are installed as part of the bash shell itself). They are external commands that can be used from any shell. So my headline is a little bit misleading - thanks for this notice.

Just for clarification here a list with all bash builtin commands:
  • alias
  • bind
  • builtin
  • caller
  • command
  • declare
  • echo
  • enable
  • help
  • let
  • local
  • logout
  • mapfile
  • printf
  • read
  • readarray
  • source
  • type
  • typeset
  • ulimit
  • unalias
and the bourne shell builtins:

  • break
  • cd
  • continue
  • eval
  • exec
  • export
  • getopts
  • hash
  • pwd
  • readonly
  • return
  • shift
  • test
  • times
  • trap
  • umask
  • unset
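Whether a given name resolves to a builtin, a keyword, or an external program can be checked directly. Here is a small sketch using bash's `type -t` (the `-t` option is a bash extension, so run it under bash):

```shell
#!/bin/bash
# Classify names with bash's `type -t`:
#   "builtin" = shell builtin, "keyword" = reserved word,
#   "file"    = external program found on PATH.
for cmd in cd echo pwd if rev cowsay; do
  kind=$(type -t "$cmd") || kind="not found"
  printf '%-8s -> %s\n' "$cmd" "$kind"
done
```

On a typical system, `rev` and `cowsay` (when installed) report `file`, confirming they are external programs rather than bash builtins.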
Still missing:

Q4 FY17 GAAP EPS UP 15% TO $0.76 and NON-GAAP EPS UP 10% TO $0.89

Oracle Press Releases - Wed, 2017-06-21 15:00
Press Release
Q4 FY17 GAAP EPS UP 15% TO $0.76 and NON-GAAP EPS UP 10% TO $0.89 Total Cloud Revenues Up 58% to $1.4 Billion, Total Revenue Up 3% to $10.9 Billion

Redwood Shores, Calif.—Jun 21, 2017

Oracle Corporation (NYSE: ORCL) today announced fiscal 2017 Q4 results and fiscal 2017 full year results. Comparing Q4 2017 to Q4 last year, SaaS (Software as a Service) cloud revenues were up 67% to $964 million, and non-GAAP SaaS revenues were up 75% to $1.0 billion. Cloud PaaS (Platform as a Service) plus IaaS (Infrastructure as a Service) revenues were up 40% to $397 million, and non-GAAP PaaS plus IaaS revenues were up 42% to $403 million. Total cloud revenues were up 58% to $1.4 billion, and non-GAAP total cloud revenues were up 64% to $1.4 billion. Cloud plus on-premise software revenues were up 5% to $8.9 billion, and non-GAAP cloud and on-premise software revenues were up 6% to $8.9 billion. Total revenues were up 3% to $10.9 billion.

Operating Income was up 3% to $4.1 billion, and the operating margin was 37%. Non-GAAP Operating Income was up 5% to $5.0 billion, and the non-GAAP operating margin was 46%. Net Income was $3.2 billion, and non-GAAP Net Income was $3.8 billion. Earnings Per Share was $0.76, while non-GAAP Earnings Per Share was $0.89. Without the impact of the U.S. dollar strengthening compared to foreign currencies, Oracle’s reported GAAP Earnings Per Share would have been 2 cents higher, and non-GAAP Earnings Per Share would have been 1 cent higher.

Short-term deferred revenues were up 8% to $8.2 billion compared with a year ago. Operating cash flow on a trailing twelve-month basis was up 3% to $14.1 billion.

For fiscal 2017, Cloud SaaS revenues were up 61% to $3.2 billion compared to fiscal 2016. Non-GAAP SaaS revenues were up 68% to $3.4 billion. Cloud PaaS and IaaS revenues were up 60% to $1.4 billion for both GAAP and Non-GAAP. Total cloud revenues were up 60% to $4.6 billion. Non-GAAP cloud revenues were up 66% to $4.7 billion. Cloud and on-premise software revenues were up 4% to $30.2 billion. Non-GAAP cloud and on-premise software revenues were up 5% to $30.4 billion. Total Revenues were up 2% to $37.7 billion. Non-GAAP Total Revenues were up 2% to $37.9 billion.

Operating Income was $12.7 billion, and operating margin was 34%. Non-GAAP Operating Income was $16.2 billion and non-GAAP operating margin was 43%. Net Income was $9.3 billion while non-GAAP Net Income was $11.6 billion. Earnings Per Share was $2.21, while Non-GAAP Earnings Per Share was $2.74. Without the impact of the U.S. dollar strengthening compared to foreign currencies, Oracle’s reported GAAP Earnings Per Share would have been 4 cents higher and non-GAAP Earnings Per Share would have been 3 cents higher.

“Our fourth quarter results were very strong as revenue growth and earnings per share both substantially exceeded the high end of guidance,” said Oracle CEO, Safra Catz. “We continue to experience rapid adoption of the Oracle Cloud led by the 75% growth in our SaaS business in Q4. This cloud hyper-growth is expanding our operating margins, and we expect earnings per share growth to accelerate in fiscal 2018.”

“We sold $855 million of new annually recurring cloud revenue (ARR) in Q4, putting us over our $2 billion ARR bookings goal for fiscal year 2017,” said Oracle CEO, Mark Hurd. “We also delivered over $1 billion in quarterly SaaS revenue for the first time. Next year is going to be even better. We expect to sell a lot more than $2 billion in new cloud ARR in fiscal year 2018.”

“AT&T has agreed to migrate thousands of existing Oracle databases containing petabytes of data plus their associated applications workloads to the Oracle Cloud,” said Oracle Chairman and CTO, Larry Ellison. “In the coming year, I expect more of our big customers to migrate their Oracle databases and database applications to the Oracle Cloud. These large-scale migrations will dramatically increase the size of both our PaaS and IaaS cloud businesses.”

The Board of Directors also declared a quarterly cash dividend of $0.19 per share of outstanding common stock. This dividend will be paid to stockholders of record as of the close of business on July 19, 2017, with a payment date of August 2, 2017.

Q4 Fiscal 2017 Earnings Conference Call and Webcast

Oracle will hold a conference call and webcast today to discuss these results at 2:00 p.m. Pacific. You may listen to the call by dialing (816) 287-5563, Passcode: 425392. To access the live webcast of this event, please visit the Oracle Investor Relations website at http://www.oracle.com/investor. In addition, Oracle’s Q4 results and fiscal 2017 financial tables are available on the Oracle Investor Relations website.

A replay of the conference call will also be available by dialing (855) 859-2056 or (404) 537-3406, Pass Code: 34656498.

Contact Info
Ken Bond
Oracle Investor Relations
Deborah Hellinger
Oracle Corporate Communications
About Oracle

Oracle offers a comprehensive and fully integrated stack of cloud applications and platform services. For more information about Oracle (NYSE: ORCL), visit www.oracle.com or contact Investor Relations at investor_us@oracle.com or (650) 506-4073.


Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

"Safe Harbor" Statement

Statements in this press release relating to Oracle's future plans, expectations, beliefs, intentions and prospects, including statements regarding our future sales and non-GAAP financial results, expectations of future customer migrations to the Oracle Cloud and growth in the size of our PaaS and IaaS cloud business, are "forward-looking statements" and are subject to material risks and uncertainties. Many factors could affect our current expectations and our actual results, and could cause actual results to differ materially. We presently consider the following to be among the important factors that could cause actual results to differ materially from expectations: (1) Our cloud computing strategy, including our Oracle Cloud SaaS, PaaS, IaaS and data as a service offerings, may not be successful. (2) If we are unable to develop new or sufficiently differentiated products and services, or to enhance and improve our products and support services in a timely manner or to position and/or price our products and services to meet market demand, customers may not buy new software licenses, cloud software subscriptions or hardware systems products or purchase or renew support contracts. (3) If the security measures for our products and services are compromised or if our products and services contain significant coding, manufacturing or configuration errors, we may experience reputational harm, legal claims and reduced sales. (4) We may fail to achieve our financial forecasts due to such factors as delays or size reductions in transactions, fewer large transactions in a particular quarter, fluctuations in currency exchange rates, delays in delivery of new products or releases or a decline in our renewal rates for support contracts. (5) Our international sales and operations subject us to additional risks that can adversely affect our operating results, including risks relating to foreign currency gains and losses. 
(6) Economic, geopolitical and market conditions can adversely affect our business, results of operations and financial condition, including our revenue growth and profitability, which in turn could adversely affect our stock price. (7) We have an active acquisition program and our acquisitions may not be successful, may involve unanticipated costs or other integration issues or may disrupt our existing operations. A detailed discussion of these factors and other risks that affect our business is contained in our SEC filings, including our most recent reports on Form 10-K and Form 10-Q, particularly under the heading "Risk Factors." Copies of these filings are available online from the SEC or by contacting Oracle Corporation's Investor Relations Department at (650) 506-4073 or by clicking on SEC Filings on Oracle’s Investor Relations website at http://www.oracle.com/investor. All information set forth in this press release is current as of June 21, 2017. Oracle undertakes no duty to update any statement in light of new information or future events. 

Talk to a Press Contact

Ken Bond

  • +1.650.607.0349

Deborah Hellinger

  • +1.212.508.7935

New Window Bookmarklet

Jim Marion - Wed, 2017-06-21 12:05

I am a "New Window" link junkie. I use that link ALL THE TIME! If it were possible to wear it out, mine would be worn out. I wish all PeopleSoft pages had the "New Window" link. For some reason, however, certain developers chose to remove it from specific PeopleSoft pages (such as Structure and Content). I'm sure there is a good reason... there just has to be. So seeing it missing from Fluid has been a significant struggle for me. I'm thankful for Sasank's Fluid UI - New Window Feature - Workaround customization. For quick access to a new window without customization, I have a Bookmarklet, which is a JavaScript fragment masquerading as a favorite (or bookmark). Here is the JavaScript:

(function() {
var parts = window.location.href.match(/(.+?\/ps[pc])\/(.+?)(?:_\d+?)*?\/(.*)/);
window.open(parts[1] + '/' + parts[2] + '_newwin/' + parts[3], '_blank');
})();

To add it to your bookmark toolbar, drag the following link into your link toolbar:

PS New Window

This solution is simple, but may not satisfy your requirements. This bookmarklet assumes you want to open a new window to the URL displayed in the address bar. That URL may or may not match the actual transaction. If you want a bookmarklet that opens a new window specifically targeting the current transaction, then try this bookmarklet:

(function() {
var href = window.location.href;
var parts = (!!frames["TargetContent"] ? !!frames["TargetContent"].strCurrUrl ? frames["TargetContent"].strCurrUrl : href : href).match(/(.+?\/ps[pc])\/(.+?)(?:_\d+?)*?\/(.*)/);
window.open(parts[1] + '/' + parts[2] + '_newwin/' + parts[3], '_blank');
})();

To use it, drag the following link into your bookmark toolbar:

PS New Window
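Under the hood, both bookmarklets do the same thing: insert `_newwin` after the site name in the `/psp/` or `/psc/` portion of the URL, which tells PeopleSoft to treat the request as a separate window. The rewrite can be sketched outside the browser with a simplified pattern (the URL below is hypothetical, and the sketch ignores any existing `_NN` suffix that the JavaScript also strips):

```shell
# Hypothetical PeopleSoft URL; the site name is the segment after /psp/.
url='https://hr.example.com/psp/hrprod/EMPLOYEE/HRMS/c/MAINTAIN_PAGE.GBL'

# Insert "_newwin" after the site name.
echo "$url" | sed -E 's#(/ps[pc]/[^/]+)/#\1_newwin/#'
# prints https://hr.example.com/psp/hrprod_newwin/EMPLOYEE/HRMS/c/MAINTAIN_PAGE.GBL
```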

Special shout out to David Wiggins, who posted a similar bookmarklet on my Where is My New Window Link? post as I was writing this blog post.

Oracle Ksplice on Oracle Linux in Bare Metal Cloud

Wim Coekaerts - Wed, 2017-06-21 09:58

One of the great advantages of using Oracle Cloud is the fact that it includes full Oracle Linux support. All the services that you get with Oracle Linux Premier support are included without additional cost when you use Oracle Cloud.

Oracle Ksplice is such a service. (see: http://www.ksplice.com/ ). In order to use Oracle Ksplice outside of Oracle Cloud you configure it at install time when registering your Oracle Linux server with ULN (http://linux.oracle.com ) and you then use the generated access key to configure the uptrack tools.

With Oracle Cloud, both Oracle Public Cloud and Oracle Bare Metal Cloud Services ( http://cloud.oracle.com ), we have made it very easy. Any instance that runs inside our infrastructure has immediate access to the ksplice servers.

For customers or users with existing Oracle Linux instances in BMCS, a few simple steps are needed to enable Ksplice. We are in the process of adding the uptrack tools to the image by default, so soon you won't have to do any configuration at all.

Enable Ksplice today:

Log into your Oracle Linux instance as user opc (or as root)

# sudo bash

Download the uptrack client:

# wget -N https://www.ksplice.com/uptrack/install-uptrack

or if you prefer to use curl

# curl -O https://www.ksplice.com/uptrack/install-uptrack

Install the client. Make sure you use this exact key; it is a generic identifier and will only work inside BMCS.

# sh install-uptrack dfc21b3ced9af52f6a8760c1b1860f928ba240970a3612bb354c84bb0ce5903e --autoinstall

This command runs the downloaded script and installs the uptrack utilities (the Ksplice client tools). Ignore the connection error; you still need the configuration step below.

One more step. In order for the above key to work, you have to point the uptrack tools to a specific update server.

edit /etc/uptrack/uptrack.conf:

# The location of the Uptrack updates repository.


and that's it.

# uptrack-upgrade
Nothing to be done.
Your kernel is fully up to date.
Effective kernel version is 4.1.12-94.3.6.el6uek


For Bring Your Own instances, we will automate the above steps as well. But at least this gets you going right away.


Storage Server: Firmware

Frank van Bortel - Wed, 2017-06-21 09:42
Firmware The first thing to do, in order to get any software RAID program to run, is to flash the controller out of RAID mode. Only then will all of the disks be seen as just a bunch of disks - nothing else. JBOD, for short. The board I have comes with an LSI SAS2308 controller, to which I want to connect 12 SATA drives using three SAS-to-SATA breakout cables. Drivers There are

Podcast Show Notes: Zombie Devices and the Moons of Jupiter

OTN TechBlog - Wed, 2017-06-21 09:38

Over the past several weeks I’ve had the good fortune to conduct video interviews with various session presenters at the Oracle Code events in New York City, Washington DC, Toronto, and Atlanta, GA. Those interviews, as well as interviews conducted by my OTN colleagues at other Oracle Code events around the globe, are available on the Oracle Developer YouTube channel.

Among the interviews I conducted, two stood out for me because the topics covered, while rooted in the real world, looked forward into a world that borders on science fiction.

So this program will depart from the usual panel discussion format to bring you audio excerpts from those two interviews, in a format that allows you to consume the content while driving, walking, or doing other activities that require your eyes to be on the road rather than staring at a screen. Listen!

Up first is a bit of my March 21st conversation in NYC with Maurice Naftalin. Maurice is a developer, researcher, and trainer with forty years' experience in computing. He is the author of Mastering Lambdas: Java Programming in a Multicore World (2014, Oracle Press), and co-author of Java Generics and Collections: Speed Up the Java Development Process (2006, O’Reilly). Maurice is a Java Champion, and a three-time JavaOne Rock Star award winner.

Maurice presented the session Open Sesame! Conversations With My Front Door at the Oracle Code event in New York City on March 21, 2017. In this segment of the podcast Maurice talks about the Raspberry Pi experiment that was the basis for his session, and shares his thoughts on how the Internet of Things, chatbots, and other technologies are invading our homes.

Up next is Sean Phillips. Sean is a Principal Software Engineer with a.i. Solutions in Washington DC, where he specializes in Java and JavaFX development and rich-client programming using the NetBeans Platform. Sean currently serves as the lead software engineer for the NASA James Webb Space Telescope Flight Dynamics Ground System.

If you are reading this after 10:00am Eastern Daylight Time on June 22, 2017, Sean will have presented his keynote session, Deep Space Trajectory Design Software For Ocean World Orbiters and Human Space Flight, at the Oracle Code event in Atlanta, GA. However, I recorded my conversation with Sean at the Oracle Code event in Washington DC on March 27, 2017. In this segment of the podcast Sean talks about his work with JavaFX and the development of the Deep Space Trajectory Explorer (DSTE) software designed for NASA.

Additional Resources

Rittman Mead at Kscope 2017

Rittman Mead Consulting - Wed, 2017-06-21 07:45
Rittman Mead at Kscope 2017

Rittman Mead will be well represented in San Antonio, Texas next week for Kscope 17 with some of our best from both sides of the Atlantic! Our very own Francesco Tisiot and Jordan Meyer will present various topics as well as participate in the conference events. Also, the newly named ODTUG BI Community Lead, Rittman Mead's Becky Wagner, will be on hand and leading a lot of activities throughout. See details below and we hope to see you in Texas.


Oracle Big Data Spatial and Graph enables the analysis of data sets beyond what is practical with standard relational analytics. Through graph technology, relationships can be identified that might not otherwise have been found. This has practical uses including product recommendations, social network analysis, and fraud detection.

In this presentation we will see a practical demonstration of Oracle Big Data Spatial and Graph to load and analyze the "Panama Papers" data set. Graph algorithms will be utilized to identify key actors and organizations within the data, and patterns of relationships shown. This practical example of using the tool will give attendees a clear idea of the functionality of the tool and how it could be used within their own organization.

When: Jun 27, 2017, Tuesday Session 7 , 11:15 am - 12:15 pm
Room: Magnolia


OBIEE 12c is the latest generation of Oracle's Enterprise analytics and reporting tool, bringing with it many powerful new features. Many users are still on earlier releases of OBIEE 11g or even 10g, and are looking to understand how they can move to OBIEE 12c to benefit from its new capabilities.

Liberty Global is a global telecommunications company, with a long history with OBIEE going back to 10g. They wanted to move to OBIEE 12c in order to use the new Advanced Analytics options, and used Rittman Mead to support them with the full scope of the upgrade.

In this presentation, we will see what a highly successful OBIEE 12c migration looks like. We will cover clear details of all the steps required, and discuss some of the problems encountered. Regression testing is a crucial step in any upgrade and we will show how we did this efficiently and accurately with the provided Baseline Validation Tool. This presentation will assist all attendees who are considering, or in the process of, an OBIEE 12c upgrade.

When: Jun 26, 2017, Monday Session 5 , 4:45 pm - 5:45 pm
Room: Wisteria/Sunflower


As a DBA or sysadmin responsible for OBIEE, how do you really dig into the guts of OBIEE, look at intra-component communication between the system components, and examine the apparently un-examinable? What do you do when you need to trace activity beyond what is in the log files? How do you work with log files to extract precise, low-level information? What information can be gleaned, by hook or by crook, from OBIEE?

OBIEE provides a set of systems management and diagnostic tools, but these only take you so far. Join me in this presentation to dive deeper with OBIEE. We will take a look at a bag of tricks including undocumented configuration options, flame graphs, system call tracing, discovering undocumented REST APIs, and more! This is not just a geek-out - this is real-life examples of where client OBIEE projects have required that next level of diagnostic techniques and tools. Don your beanie hat and beard as we go deep!

When: Jun 28, 2017, Wednesday Session 12 , 9:45 am - 10:45 am
Room: Wisteria/Sunflower


Becky Wagner is the new ODTUG BI Community Lead. You will find her at:

Monday Community Lunch | 12:45 – 2:00 PM | Grand Oaks K-S

Monday evening BI Community Night | 8:00 - 10:00 PM | Grand Oaks H http://kscope17.com/events/community-nigh-events

She will be doing the 5K Fun Run http://kscope17.com/events/kscope17-5k on Tuesday morning

Women in Technology Lunch | 12:15– 1:45 PM | Cibolo Canyon 6 on Wednesday https://form.jotformpro.com/71134693041955

Navigating the Oracle Business Analytics Frontier Panel
9:00 AM - 11:00 AM, Cibolo Canyon 8/9/10

Categories: BI & Warehousing

Alert message on Listener Log growth

Tom Kyte - Wed, 2017-06-21 04:06
Hi, I need a suggestion for getting an alert message on listener log growth; at the moment each file grows to 10M. We want an alert message once the combined listener log files exceed 1GB. Regards Binoj
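One possible approach (a sketch only, not an official answer: the log directory below is an assumption, so point it at wherever your listener actually writes its logs, e.g. under the ADR trace tree) is a small shell job that sums the log file sizes and prints an alert past the limit:

```shell
#!/bin/sh
# Sketch: alert when the combined size of listener *.log files crosses a limit.
# The default log directory below is hypothetical; adjust it to your environment.

listener_log_kb() {
  # Sum the sizes (in KB) of all *.log files under directory $1.
  find "$1" -name '*.log' -type f -exec du -k {} + 2>/dev/null |
    awk '{sum += $1} END {print sum + 0}'
}

check_listener_logs() {
  # $1 = log directory, $2 = threshold in KB (1 GB = 1048576 KB)
  total=$(listener_log_kb "$1")
  if [ "$total" -gt "$2" ]; then
    echo "ALERT: listener logs total ${total} KB (limit $2 KB)"
  else
    echo "OK: listener logs total ${total} KB"
  fi
}

# Hypothetical location; 1048576 KB = 1 GB.
check_listener_logs "${ORACLE_BASE:-/u01/app/oracle}/diag/tnslsnr" 1048576
```

Scheduled from cron (or wrapped as an OEM metric extension), the ALERT line can be piped to mail to produce the notification.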
Categories: DBA Blogs

Oracle Linux 7.3 Production Server setup for OEM 13c and repository database.

Tom Kyte - Wed, 2017-06-21 04:06
I am planning to install Oracle Linux 7.3 on VMware. This server will be used to host our Oracle Enterprise Manager 13c and its repository database. I have no experience as a server administrator. Currently I am in a situation where I have to setu...
Categories: DBA Blogs

How do I find out what username the database is passing through when connecting using Windows Authentication?

Tom Kyte - Wed, 2017-06-21 04:06
Hi, I'm having trouble using Windows authentication with my database accounts. I have a front-end application that creates a user account based on what their OS username is, OPS$Username, disregarding the domain name. Logging into the application ...
Categories: DBA Blogs

DBMS_JOB Poll Interval

Tom Kyte - Wed, 2017-06-21 04:06
We recently upgraded databases from to With that upgrade, we noticed that after submitting jobs via dbms_job, the procedures can take anywhere between 5-30 seconds to actually start executing after committing. A sample scr...
Categories: DBA Blogs


Subscribe to Oracle FAQ aggregator