
Feed aggregator

Adaptive Query Optimization in Oracle 12c : Ongoing Updates

Tim Hall - Thu, 2015-08-27 12:09

I’ve said a number of times that the process of writing articles is part of an ongoing learning experience for me. A few days ago my personal tech editor (Jonathan Lewis) asked about a statement I made in the SQL Plan Directive article. On further investigation it turned out the sentence was a complete work of fiction on my part, based on my misunderstanding of something I read in the manual, as well as the assumption that everything that happens must be the result of a new feature. :)

Anyway, the offending statement has been altered, but the conversation this generated resulted in a new article about Automatic Column Group Detection.
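If you want to see the moving parts for yourself, here is a minimal sketch of driving column group detection manually with DBMS_STATS (assuming the SCOTT.EMP demo schema; the article linked above covers the automatic 12c variant):

-- 1) Monitor column usage for the next 300 seconds.
exec dbms_stats.seed_col_usage(null, null, 300)

-- 2) Run a representative query with a multi-column predicate.
select count(*) from scott.emp where deptno = 30 and job = 'CLERK';

-- 3) Report the captured column usage, including any suggested column groups.
select dbms_stats.report_col_usage('SCOTT', 'EMP') from dual;

-- 4) Create the column group as an extended statistic.
select dbms_stats.create_extended_stats('SCOTT', 'EMP', '(DEPTNO,JOB)') from dual;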

The process also highlighted how difficult it is, at least for me, to know what is going on in the optimizer now. It wasn’t always straightforward before, but now, with the assorted new optimizations, some beating others to the punch, it is even more difficult. There are also a number of timing issues involved. If a statement runs twice in quick succession, you might get a different sequence of events compared to having a longer gap between the first and second run of the statement. It’s maddening at times. I’m hoping Jonathan will put pen to paper about this, because I think he will do a better job than I can of explaining the issues around the inter-dependencies.

Anyway, I will be doing another pass through this stuff over the coming days/weeks/months/years to make sure it is consistent with “my current understanding”. :)

Fun, fun, fun…

Cheers

Tim…


Red Samurai ADF Performance Audit Tool v 4.0 - Web Client Request Monitoring and Complete Query Analysis

Andrejus Baranovski - Thu, 2015-08-27 03:12
I'm excited to announce that we have released a new version of our RSA audit tool. This is a major update; the previous version, released in February 2015, was Red Samurai ADF Performance Audit Tool v 3.4 - ADF Task Flow Statistics with Oracle DMS Servlet Integration.

It has already been three years since the initial version - Red Samurai Performance Audit Tool - Runtime Diagnosis for ADF Applications. We are using it for many of our customers to monitor ADF performance in both test and production environments. Many new features have been added over these years, with more to come.

RSA Audit v4.0 New Features

1. The RSA Audit v4.0 dashboard supports ADF 12c and the Alta UI look


2. Web Client Request Time monitoring, supported with ADF 11g and 12c. A generic method tracks request time for all ADF UI components. Logged data can be analysed through the ADF UI dashboard or directly in the DB. Request time represents the complete time from the user action in the browser until the request is completed. This covers the real user experience - browser processing time, network time, server-side time, and ADF BC/DB processing times. Runs in VERBOSE logging mode


3. Detailed information about the ADF fragment, button, or other ADF UI component involved in the request is logged together with the request processing time and is accessible from the audit dashboard. This helps to identify slow actions spanning from the Web Client to the DB


4. Information about each request is grouped, which allows you to compare differences between multiple requests and identify bottlenecks in application performance


5. Duplicate Queries. Allows you to track all executed VOs - very helpful for identifying redundant VO executions. Groups VO executions per ECID, which helps to identify VOs re-executed multiple times during the same request (see the sketch after this list). Runs in MEDIUM logging mode


6. VOs executed from the same ECID are automatically highlighted - this simplifies redundant query analysis


7. The number of duplicate executions of VOs per ECID is calculated and presented in the table and sunburst chart


8. We calculate the top VOs per AM. This helps to set priorities for SQL tuning and to understand heavily used VOs


9. A sunburst chart displays a visual representation of duplicate and top VOs per AM
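As a rough illustration of the kind of analysis point 5 describes, here is a hypothetical sketch of counting duplicate VO executions per ECID. The table and column names (RSA_AUDIT_LOG, ECID, VO_NAME) are invented for this example and are not the tool's actual schema:

-- Hypothetical audit log; names are illustrative only.
select ecid,
       vo_name,
       count(*) as executions
  from rsa_audit_log
 group by ecid, vo_name
having count(*) > 1   -- same VO executed more than once in one request
 order by executions desc;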

Page Borders and Title Underlines

Tim Dexter - Wed, 2015-08-26 15:32

I have taken to recording screen grabs to help some folks out with 'how do I' scenarios. Sometimes a three-minute video saves a couple of thousand words and several screenshots.

So, perchance you need to know:

1. How to add a page border to your output and/or

2. How to add an underline that runs across the page

Watch this!   https://www.youtube.com/watch?v=3UcXHeSF0BM

If you need the template, sample data and output, get them here.

I'm taking requests if you have them.

Categories: BI & Warehousing

The Fraught Interaction Design of Personalized Learning Products

Michael Feldstein - Wed, 2015-08-26 12:49

By Michael Feldstein

David Wiley has a really interesting post up about Lumen Learning’s new personalized learning platform. Here’s an excerpt:

A typical high-level approach to personalization might include:

  • building up an internal model of what a student knows and can do,
  • algorithmically interrogating that model, and
  • providing the learner with a unique set of learning experiences based on the system’s analysis of the student model

Our thinking about personalization started here. But as we spoke to faculty and students, and pondered what we heard from them and what we have read in the literature, we began to see several problems with this approach. One in particular stood out:

There is no active role for the learner in this “personalized” experience. These systems reduce all the richness and complexity of deciding what a learner should be doing to – sometimes literally – a “Next” button. As these systems painstakingly work to learn how each student learns, the individual students lose out on the opportunity to learn this for themselves. Continued use of a system like this seems likely to create dependency in learners, as they stop stretching their metacognitive muscles and defer all decisions about what, when, and how long to study to The Machine.

Instructure’s Jared Stein really likes Lumen’s approach, writing,

So much work in predictive analytics and adaptive learning seeks to relieve people from the time-consuming work of individual diagnosis and remediation — that’s a two-edged sword: Using technology to increase efficiency can too easily sacrifice humanness — if you’re not deliberate in the design and usage of the technology. This topic came up quickly amongst the #DigPedNetwork group when Jim Groom and I chatted about closed/open learning environments earlier this month, suggesting that we haven’t fully explored this dilemma as educators or educational technologist.

I would add that I have seen very little evidence that either instructors or students place a high value on the adaptivity of these products. Phil and I have talked to a wide range of folks using these products, both in our work on the e-Literate TV case studies and in our general work as analysts. There is a lot of interest in the kind of meta-cognitive dashboarding that David is describing. There is little interest in, and in some cases active hostility toward, adaptivity. For example, Essex County College is using McGraw Hill’s ALEKS, which has one of the more sophisticated adaptive learning approaches on the market. But when we talked to faculty and staff there, the aspects of the program that they highlighted as most useful were a lot more mundane, e.g.,

It’s important for students to spend the time, right? I mean learning takes time, and it’s hard work. Asking students to keep time diaries is a very difficult ask, but when they’re working in an online platform, the platform keeps track of their time. So, on the first class day of the week, that’s goal-setting day. How many hours are you going to spend working on your math? How many topics are you planning to master? How many classes are you not going to be absent from?

I mean these are pretty simple goals, and then we give them a couple goals that they can just write whatever they feel like. And I’ve had students write, “I want to come to class with more energy,” and other such goals. And then, because we’ve got technology as our content delivery system, at the end of the week I can tell them, in a very efficient fashion that doesn’t take up a lot of my time, “You met your time goal, you met your topic goal,” or, “You approached it,” or, “You didn’t.”

So one of the most valuable functions of this system in this context is to reflect back to the students what they have done in terms that make sense to them and are relevant to the students’ self-selected learning goals. The measures are fairly crude—time on task, number of topics covered, and so on—and there is no adaptivity necessary at all.

But I also think that David’s post hints at some of the complexity of the design challenges with these products.

You can think of the family of personalized learning products as having potentially two components: diagnostic and prescriptive. Everybody who likes personalized learning products in any form likes the diagnostic component. The foundational value proposition for personalization (which should not in any way be confused with “personal”) is having the system provide feedback to students and teachers about what the student does well and where the student is struggling. Furthermore, the perceived value of the product is directly related to the confidence that students and teachers have that the product is rendering an accurate diagnosis. That’s why I think products that provide black box diagnoses are doomed to market failure in the long term. As the market matures, students and teachers are going to want to know not only what the diagnosis is but what the basis of the diagnosis is, so that they can judge for themselves whether they think the machine is correct.

Once the system has diagnosed the student’s knowledge or skill gaps—and it is worth calling out that many of these personalized learning systems work on a deficit model, where the goal is to get students to fill in gaps—the next step is to prescribe actions that will help students to address those gaps. Here again we get into the issue of transparency. As David points out, some vendors hide the rationale for their prescriptions, even going so far as to remove user choice and just hide the adaptivity behind the “next” button. Note that the problem isn’t so much with providing a prescription as it is with the way in which it is provided. The other end of the spectrum, as David argues, is to make recommendations. The full set of statements from a well-behaved personalized learning product to a student or teacher might be something like the following:

  1. This is where I think you have skill or knowledge gaps.
  2. This is the evidence and reasoning for my diagnosis.
  3. This is my suggestion for what you might want to do next.
  4. This is my reasoning for why I think it might help you.

It sounds verbose, but it can be done in fairly compact ways. Netflix’s “based on your liking Movie X and Movie Y, we think you would give Movie Z 3.5 stars” is one example of a compact explanation that provides at least some of this information. There are lots of ways that a thoughtful user interface designer can think about progressively revealing some of this information and providing “nudges” that encourage students on certain paths while still giving them the knowledge and freedom they need to make choices for themselves. The degree to which the system should be heavy-handed in its prescription probably depends in part on the pedagogical model. I can see something closer to “here, do this next” feeling appropriate in a self-paced CBE course than in a typical instructor-facilitated course. But even there, I think the Lumen folks are 100% right that the first responsibility of the adaptive learning system should be to help the learner understand what the system is suggesting and why so that the learner can gain better meta-cognitive understanding.

None of which is to say that the fancy adaptive learning algorithms themselves are useless. To the contrary. In an ideal world, the system will be looking at a wide range of evidence to provide more sophisticated evidence-based suggestions to the students. But the key word here is “suggestions.” Both because a critical part of any education is teaching students to be more self-aware of their learning processes and because faulty prescriptions in an educational setting can have serious consequences, personalized learning products need to evolve out of the black box phase as quickly as possible.



Inside View Of Blackboard’s Moodle Strategy In Latin America

Michael Feldstein - Wed, 2015-08-26 11:45

By Phil Hill

One year ago Blackboard’s strategy for Moodle was floundering. After the 2012 acquisition of Moodlerooms and Netspot, Blackboard had kept its promises of supporting the open source community – and in fact, Blackboard pays much more than 50% of the total revenue going to Moodle HQ[1] – but that does not mean they had a strategy. Key Moodlerooms employees were leaving, and the management was frustrated. Last fall the remaining Moodlerooms management put together an emerging strategy to invest in (through corporate M&A) and grow the Moodle business, mostly outside of the US.

In just the past twelve months, Blackboard has acquired three Moodle-based companies – Remote-Learner UK (Moodle Partner in the UK), X-Ray Analytics (learning analytics for Moodle), and Nivel Siete (Moodle Partner in Colombia). When you add organic growth to these acquisitions, Blackboard has added ~450 new clients using Moodle in this same time period, reaching a current total of ~1400.

This is a change worth exploring. To paraphrase Michael’s statements to me and in his recent BbWorld coverage:

If you want to understand Blackboard and their future, you have to understand what they’re doing internationally. If you want to understand what they’re doing internationally, you have to understand what they’re doing with Moodle.

Based on this perspective, I accepted an invitation from Blackboard to visit Nivel Siete last week and get a first-hand view of what this acquisition means. I also attended the MoodleMoot Colombia #mootco15 conference and talked directly to Moodle customers in Latin America. Let’s first unpack that last phrase.

  • Note that due to the nature of this trip, I “talked directly” with Blackboard employees, Nivel Siete employees, Blackboard resellers, and Nivel Siete customers. They did give me free access to talk privately with whoever I wanted to, but treat this post as somewhat of an inside view rather than one that also includes perspectives from competitors.
  • “Moodle” is very significant in Latin America. It is the default LMS and dominates learning environments. The competition, or alternative solution, there is Blackboard Learn or . . . another route to get Moodle. In this market D2L and Canvas have virtually no presence – each company has just a couple of clients in Latin America and is not currently a factor in LMS decision-making. Schoology has one very large customer in Uruguay serving hundreds of thousands of students. Blackboard Learn serves the top of the market – e.g. the top 10% in terms of revenue of Colombian institutions, where it already serves the majority of that sub-market, according to the people I talked to. For the remaining 90%, it is pretty much Moodle, Moodle, alternate applications that are not LMSs, or nothing.[2]
  • I chose “customers” instead of “schools” or “institutions” for a reason. What is not understood in much of the education community is that Moodle has a large footprint outside of the higher ed and K-12 markets. Approximately 2/3 of Nivel Siete’s clients are in corporate learning, and several others are government. And this situation is quite common for Moodle. In the US, more than 1/3 of Moodlerooms’ and approximately 1/2 of Remote-Learner’s customers are corporate learning. Phill Miller, the VP of International for Moodlerooms, said that most of the Moodle hosting and service providers he has met also serve corporate clients in similar numbers to education clients.
  • I chose “Latin America” instead of “Colombia” for a reason. While all but ~12 of Nivel Siete’s existing clients are in Colombia, Blackboard bought the company to act as a center of excellence or support service company for most of Latin America – Colombia, Mexico, Brazil, and Peru in particular. Cognos Online, their current local reseller for Latin America for core Blackboard products (Learn, Collaborate, etc.), will become the reseller for their Moodle customers as well. Nivel Siete will support a broader set of clients. In other words, this is not a simple acquisition of customers – it is an expansion of international presence.

And while we’re at it, the conference reception included a great opera mini flash mob (make sure to watch past 0:37):

Nivel Siete

Nivel Siete (meaning Level 7, a reference from two of the founders’ college days, when a professor talked about the need to understand deeper levels of the technology stack than just the top-level applications that customers see) is a company of just over 20 employees in Bogota. They have 237+ clients, and that number is growing – during the three days I was there, they signed several new contracts. They offer Moodle hosting and service in a cloud environment based on Amazon Web Services (AWS) – not true SaaS, as they allow multiple software versions in production and have not automated all provisioning or upgrade processes. What they primarily offer, according to the founders, is a culture of how to service and support using cloud services, along with specific marketing and sales techniques.

In Latin America, most customers care more about the local sales and support company than they do about the core software. As one person put it, they believe in skin-to-skin sales, where clients have relationships they trust as long as solutions are provided. Most LMS customers in Latin America do not care as much about the components of that solution as they do about relationships, service, and price. And yet, due to open source software and lightweight infrastructure needs, Moodle is dominant, as noted above. The Moodle brand, code base, and code licensing do not matter as much as the Moodle culture and ecosystem. From a commercial standpoint, Nivel Siete’s competitors include a myriad of non Moodle Partner hosting providers – telcos bundling in hosting, mom-and-pop providers, self-hosting – or non-consumption. For a subset of the market, Nivel Siete has competed with Blackboard Learn.

Beyond Cognos Online, Blackboard has another ~9 resellers in Latin America, and Nivel Siete (or whatever they decide to name the new unit) will support all of these resellers. This was actually the biggest motivation, other than cash, for the company to sell – they were seeking ways to extend their influence, and this opportunity made the most sense.

Blackboard Learn and Ultra

What about that Learn sub-market? Most clients and sales people (resellers as well as Blackboard channel managers) are aware of Learn Ultra, but the market seems to understand already that Ultra is not for them . . . yet. They appear to be taking a ‘talk to me when it’s done and done in Spanish’ approach and are not basing current decisions on Ultra. In this sense, the timing for Ultra does not matter all that much, as the market is not waiting on it. Once Ultra is ready for Latin America, Blackboard sales (channel managers and resellers) expect the switchover to be quicker than in the US, as major LMS upgrades (involving major UI and UX changes) or adoptions tend to take weeks or months instead of the year or more we often see in the States. At least in the near term, Learn Ultra is not a big factor in this market.

What Blackboard is best known for in this market is the large SENA contract running on Learn. SENA (National Service for Learning) is a government organization that runs the majority of all vocational colleges – providing certificates and 2-year vocational degrees mostly for lower-income students, a real rising middle class move that is important in developing countries. Blackboard describes SENA as having 6+ million total enrollment, with ~80% in classrooms and ~20% in distance learning.

Integration

The challenge Blackboard faces is integrating its Learn and Moodle operations through the same groups – Nivel Siete internal group, Cognos Online and other resellers serving both lines – without muddling the message and go-to-market approach. Currently Learn is marketed and sold through traditional enterprise sales methods – multiple meetings, sales calls, large bids – while Nivel Siete’s offering of Moodle is marketed and sold with more of a subscription-based mentality. As described by ForceManagement:

A customer who has moved to a subscription-based model of consumption has completely different expectations about how companies are going to interact with them.

How you market to them, how you sell to them, how you bill them, how you nurture the relationship – it’s all affected by the Subscription Economy. The customer’s idea of value has changed. And, if the customer’s idea of value has changed, your value proposition should be aligned accordingly. [snip]

The subscription-based sales process relies less on the closing of a sale and more on the nurturing of a long-term relationship to create lifetime customer value.

One of Nivel Siete’s most effective techniques is their e-Learner Magazine, which highlights customers telling their own stories and lessons in a quasi-independent fashion. The company has relied on inbound calls and quick signups and service startups. There is quite a cultural difference between enterprise software and subscription-based approaches. While Blackboard itself is facing such changes due to Ultra and newly-offered SaaS models, the group in Latin America faces the challenge of two different cultures served by the same organizations today.

To help address this challenge, Cognos Online is planning to have two separate teams selling and servicing mainline Blackboard products and Moodle products. But even then, CEO Fernery Morales said that their biggest risk is muddling the message and integrating appropriately.

Moodle Strategy and Risk

At the same time, this strategy and growth come at a time when the Moodle community at large appears to be at an inflection point. As I see it, this inflection point comes from a variety of triggers:

  • Blackboard acquisitions causing concerns about commercialization among Moodle HQ, other Moodle Partners, and some subset of users;
  • Creation of the Moodle Association as well as Moodle Cloud services as alternate paths to Moodle Partners for revenue and setup; and
  • Remote-Learner leaving the Moodle Partner program and planning to join the Moodle Association, with the associated lost revenue and public questioning of the program’s value.

I don’t have time to fully describe these changes here, but Moodle itself is both an opportunity and a risk, mostly based on its own success globally. More on that in a future post.

What Does This Mean Beyond Latin America?

It’s too early to fully know, but here are a few notes.

  • Despite the positioning in the US media, there is no “international” market. There are multiple local or regional markets outside of the US that have tremendous growth opportunities for US and other companies outside of those immediate markets. Addressing these markets puts a high premium on localization – having feet on the ground with people who know the culture and can be trusted in the region, and including product customizations meant for those markets. Much of the ed tech investment boom is built on expectations of international growth, but how many ed tech companies actually know how to address local or regional non-US markets? This focus on localizing for international markets is one of Blackboard’s greatest strengths.
  • Based on the above, at least in Latin America Blackboard is building itself up as the status quo before other learning platforms really get a chance to strategically enter the market. For example, Instructure has clearly not chosen to go after non English-speaking international markets yet, but by the time they do push Canvas into Latin America, and if Blackboard is successful in integrating Nivel Siete, it is likely Instructure will face an entrenched competitor and potential clients who by default assume Moodle or Learn as solutions.
  • Blackboard as a company has one big growth opportunity right now – the collection of non-US “international” markets that represent just under 1/4 of the company’s revenue. Domestic higher ed is not growing, K-12 is actually decreasing, but international is growing. These growing markets need Moodle and traditional Learn 9.1 much more than Ultra. I suspect that this growing importance is creating more and more tension internal to Blackboard, as the company needs to balance Ultra with traditional Learn and Moodle development.
  • While I strongly believe in the mission of US community colleges and low-cost 4-year institutions, in Latin America the importance of education in building up an emerging middle class is much greater than in the US. We hear “importance of education” and “building of the middle class” used in generic terms regarding ed tech potential, but seeing this connection more closely by being in country is inspiring. This is a real global need that can and should drive future investment in people and technology.
  1. This information is based on a tweet last spring showing that Moodlerooms + Netspot combined were more than 50% of revenue, and that the next largest Moodle Partner, Remote-Learner, has left the program. Since last year I have confirmed this information through multiple sources.
  2. Again, much of this information is from people related to Blackboard, but it also matches my investigation of press releases and public statements about specific customers of D2L and Instructure.


DELETE is faster than TRUNCATE

Laurent Schneider - Wed, 2015-08-26 07:18

Truncate is useful in some serial batch processing, but it breaks read-write consistency, generates strange errors and results for running selects, and it needs DROP ANY TABLE when run on a table that you do not own.

But also, DELETE is faster in the following test case.

In 12c, you could have over one million partitions in a table, but for the sake of the universe, I’ll try with 10000.


SQL> create table scott.t(x) 
  partition by range(x) 
  interval(1) 
  (partition values less than (0)) 
  as 
  select rownum 
  from dual 
  connect by level<10001;
SQL> select count(*) from scott.t;

  COUNT(*)
----------
     10000

The 10K-row table is created; each row is in its own partition.


SQL> delete scott.t;

10000 rows deleted.

Elapsed: 00:00:04.02
SQL> rollback;

Rollback complete.

Not tuned or parallelized or anything. It took 4 seconds for 10’000 rows. If you have one billion rows, it is doable in a few hours. But you had better do it in chunks then (see the sketch below).
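A minimal sketch of what deleting in chunks might look like, committing after every batch of at most 10’000 rows (the batch size and the scott.t table are just for illustration):

begin
  loop
    delete from scott.t
    where rownum <= 10000;          -- remove at most one chunk per pass
    exit when sql%rowcount = 0;
    commit;                         -- release undo and locks between chunks
  end loop;
  commit;
end;
/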

Anyway, let’s truncate


SQL> truncate table scott.t;

Table truncated.

Elapsed: 00:05:19.24

Five minutes!!! To truncate that tiny table.

If you have one million partitions and underlying indexes and lobs, it will probably fail with out-of-memory errors after hours, with a large impact on the dictionary, SYSAUX, and undo.

The dictionary changes here are very slow.

Protect Your APEX Application PL/SQL Source Code

Pete Finnigan - Wed, 2015-08-26 04:35

Oracle Application Express is a great rapid application development tool where you can write your applications functionality in PL/SQL and create the interface easily in the APEX UI using all of the tools available to create forms and reports and....[Read More]

Posted by Pete On 21/07/15 At 04:27 PM

Categories: Security Blogs

Oracle Security and Electronics

Pete Finnigan - Wed, 2015-08-26 04:35

How do Oracle Security and Electronics mix together? - Well, I started my working life in 1979 as an apprentice electrician in a factory here in York, England, where I live. The factory designed and built trains for the national....[Read More]

Posted by Pete On 09/07/15 At 11:24 AM

Categories: Security Blogs

New Conference Speaking Dates Added

Pete Finnigan - Wed, 2015-08-26 04:35

In the last few years I have not done as many conference speaking dates as I used to. This is simply because when offered they usually clashed with pre-booked work. I spoke for the UKOUG in Dublin last year and....[Read More]

Posted by Pete On 06/07/15 At 09:40 AM

Categories: Security Blogs

Happy 10th Belated Birthday to My Oracle Security Blog

Pete Finnigan - Wed, 2015-08-26 04:35

Make a Sad Face..:-( I seem to have missed my blog’s tenth birthday, which happened on the 20th September 2014. My last post, until very recently, was on July 23rd 2014; so actually it’s been a big gap....[Read More]

Posted by Pete On 03/07/15 At 11:28 AM

Categories: Security Blogs

Oracle Database Vault 12c Paper by Pete Finnigan

Pete Finnigan - Wed, 2015-08-26 04:35

I wrote a paper about Oracle Database Vault in 12c for SANS last year and this was published in January 2015 by SANS on their website. I also prepared and did a webinar about this paper with SANS. The Paper....[Read More]

Posted by Pete On 30/06/15 At 05:38 PM

Categories: Security Blogs

Unique Oracle Security Trainings In York, England, September 2015

Pete Finnigan - Wed, 2015-08-26 04:35

I have just updated all of our Oracle Security training offerings on our company website. I have revamped all class pages and added two-page pdf flyers for each of our four training classes. I have also updated the list....[Read More]

Posted by Pete On 25/06/15 At 04:36 PM

Categories: Security Blogs

Coding in PL/SQL in C style, UKOUG, OUG Ireland and more

Pete Finnigan - Wed, 2015-08-26 04:35

My favourite language is hard to pinpoint; is it C or is it PL/SQL? My first language was C and I love the elegance and expression of C. Our product PFCLScan has its main functionality written in C. The....[Read More]

Posted by Pete On 23/07/14 At 08:44 PM

Categories: Security Blogs

Integrating PFCLScan and Creating SQL Reports

Pete Finnigan - Wed, 2015-08-26 04:35

We were asked by a customer whether PFCLScan can generate SQL reports instead of the normal HTML, PDF, MS Word reports so that they could potentially scan all of the databases in their estate and then insert either high level....[Read More]

Posted by Pete On 25/06/14 At 09:41 AM

Categories: Security Blogs

On docker, Ubuntu and Oracle RDBMS

Marcelo Ochoa - Tue, 2015-08-25 19:17
I have had Oracle RDBMS working on Ubuntu for a long time (12.04 and 14.04; RDBMS versions 10g, 11g and 12c) with some tweaks to get it working.
Apart from the effort to get it working (it sometimes requires Makefile modifications), these configurations are not supported by Oracle, and for sure you couldn't report any bugs.
To solve this, it is easy to get VirtualBox working and download a pre-built VM, but that requires a lot of hardware resources :(
Happily, Docker comes into action. I followed this great post by Frits Hoogland for Installing Oracle Database in Docker.
First we need Docker running on Ubuntu; there are many guides about that, or you can simply install it from the apt-get repository, as in my case.
Following Frits's guide, I changed the default repository destination, using a USB external disk formatted with btrfs:
# df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/sda1        15G  7,7G  6,5G  55% /
none            4,0K     0  4,0K   0% /sys/fs/cgroup
udev            3,9G   12K  3,9G   1% /dev
tmpfs           789M  1,4M  788M   1% /run
none            5,0M     0  5,0M   0% /run/lock
none            3,9G  344M  3,6G   9% /run/shm
none            100M   56K  100M   1% /run/user
/dev/sda6       442G  384G   36G  92% /home
/dev/sdb        597G   18G  577G   3% /var/lib/docker

I also created the directory for the Docker template, but with some changes. Here are the files:

root@local:/# cd /var/lib/docker/dockerfiles/build-oracle-12102/
root@local:/var/lib/docker/dockerfiles/build-oracle-12102# ls -l
total 2625148
-rw-r--r-- 1 root root      10976 ago 23 10:07 db_install.dbt
-rw-r--r-- 1 root root      10931 ago 25 16:30 db_install-full.dbt
-rw-r--r-- 1 root root      10972 ago 25 16:30 db_install-simple.dbt
-rw-r--r-- 1 root root       1168 ago 25 11:09 Dockerfile
-rw-r--r-- 1 root root 1673544724 ago 22 20:36 linuxamd64_12102_database_1of2.zip
-rw-r--r-- 1 root root 1014530602 ago 22 20:36 linuxamd64_12102_database_2of2.zip
-rwxr-xr-x 1 root root       1729 ago 25 10:11 manage-oracle.sh
-rw-r--r-- 1 root root      24542 ago 24 20:42 responsefile_oracle12102.rsp

The content of the Dockerfile is:

FROM    oraclelinux:6
MAINTAINER marcelo.ochoa@gmail.com
RUN groupadd -g 54321 oinstall
RUN groupadd -g 54322 dba
RUN useradd -m -g oinstall -G oinstall,dba -u 54321 oracle
RUN yum -y install oracle-rdbms-server-12cR1-preinstall perl wget unzip
RUN mkdir /u01
RUN chown oracle:oinstall /u01
USER    oracle
WORKDIR /home/oracle
COPY linuxamd64_12102_database_1of2.zip /home/oracle/
COPY linuxamd64_12102_database_2of2.zip /home/oracle/
COPY responsefile_oracle12102.rsp /home/oracle/
RUN unzip linuxamd64_12102_database_1of2.zip
RUN unzip linuxamd64_12102_database_2of2.zip
RUN rm linuxamd64_12102_database_1of2.zip linuxamd64_12102_database_2of2.zip
RUN /home/oracle/database/runInstaller -silent -force -waitforcompletion -responsefile /home/oracle/responsefile_oracle12102.rsp -ignoresysprereqs -ignoreprereq
USER    root
RUN /u01/app/oraInventory/orainstRoot.sh
RUN /u01/app/oracle/product/12.1.0.2/dbhome_1/root.sh -silent
RUN rm -rf /home/oracle/responsefile_oracle12102.rsp /home/oracle/database
USER    oracle
WORKDIR /home/oracle
RUN     mkdir -p /u01/app/oracle/data
COPY    manage-oracle.sh /home/oracle/
EXPOSE  1521
CMD sh -c /home/oracle/manage-oracle.sh

The remarked lines differ from the original post by Frits: instead of downloading everything from the web using wget, I downloaded the 12c binary distribution from the OTN web site and copied the two zip files into the directory where the Dockerfile resides. I also downloaded Frits's files responsefile_oracle12102.rsp, manage-oracle.sh and db_install.dbt.

The file responsible for creating/starting/stopping the DB was also modified, to use db_install.dbt from the host machine. Here is the modified version of manage-oracle.sh:

#!/bin/bash
PERSISTENT_DATA=/u01/app/oracle/data
export ORACLE_HOME=/u01/app/oracle/product/12.1.0.2/dbhome_1
export ORACLE_SID=$(hostname)
stop_database() {
$ORACLE_HOME/bin/sqlplus / as sysdba << EOF
shutdown abort
exit
EOF
exit
}
start_database() {
$ORACLE_HOME/bin/sqlplus / as sysdba << EOF
startup
exit
EOF
}
create_pfile() {
$ORACLE_HOME/bin/sqlplus -S / as sysdba << EOF
set echo off pages 0 lines 200 feed off head off sqlblanklines off trimspool on trimout on
spool $PERSISTENT_DATA/init_$(hostname).ora
select 'spfile="'||value||'"' from v\$parameter where name = 'spfile';
spool off
exit
EOF
}
trap stop_database SIGTERM
printf "LISTENER=(DESCRIPTION_LIST=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=$(hostname))(PORT=1521))(ADDRESS=(PROTOCOL=IPC)(KEY=EXTPROC1521))))\n" > $ORACLE_HOME/network/admin/listener.ora
$ORACLE_HOME/bin/lsnrctl start
if [ ! -f ${PERSISTENT_DATA}/DATABASE_IS_SETUP ]; then
sed -i "s/{{ db_create_file_dest }}/\/u01\/app\/oracle\/data\/$(hostname)/" $PERSISTENT_DATA/db_install.dbt
sed -i "s/{{ oracle_base }}/\/u01\/app\/oracle/" $PERSISTENT_DATA/db_install.dbt
sed -i "s/{{ database_name }}/$(hostname)/" $PERSISTENT_DATA/db_install.dbt
$ORACLE_HOME/bin/dbca -silent -createdatabase -templatename $PERSISTENT_DATA/db_install.dbt -gdbname $(hostname) -sid $(hostname) -syspassword oracle -systempassword oracle -dbsnmppassword oracle
create_pfile
if [ $? -eq 0 ]; then
touch ${PERSISTENT_DATA}/DATABASE_IS_SETUP
fi
else
mkdir -p /u01/app/oracle/admin/$(hostname)/adump
$ORACLE_HOME/bin/sqlplus / as sysdba << EOF
startup pfile=$PERSISTENT_DATA/init_$(hostname).ora
exit
EOF
fi
tail -f /u01/app/oracle/diag/rdbms/$(hostname)/*/trace/alert_$(hostname).log &
wait

I am doing that to quickly create different databases with some Oracle options enabled or not. For example, the db_install.dbt file for OLS searching will have these options enabled:

option name="JSERVER" value="true"
option name="SPATIAL" value="true"
option name="IMEDIA" value="true"
option name="XDB_PROTOCOLS" value="true"
option name="ORACLE_TEXT" value="true"
.....

Creating the Docker template is similar to the original post:

# cd /var/lib/docker/dockerfiles/build-oracle-12102
# docker build -t "oracle-12102" .
# docker images
REPOSITORY          TAG                 IMAGE ID            CREATED             VIRTUAL SIZE
oracle-12102        latest              24687eeab73c        8 hours ago         12.26 GB
oraclelinux         6                   cfc75fa9f295        3 weeks ago         156.2 MB

Finally, to create a full-featured Java-enabled 12c database, I created a directory with the following content:

# mkdir -p /var/lib/docker/db/ols
# cp /var/lib/docker/dockerfiles/build-oracle-12102/db_install-full.dbt  /var/lib/docker/db/ols/db_install.dbt
# chown -R 54321:54321 /var/lib/docker/db/ols

and executed:

# docker run --ipc=host --volume=/var/lib/docker/db/ols:/u01/app/oracle/data --name ols --hostname ols --detach=true oracle-12102
25efb5d26aad31e7b06a8e2707af7c25943e2e42ec5c432dc9fa55f0da0bdaef

The container is started and the database creation works perfectly. Here is the output:

# docker logs -f ols
LSNRCTL for Linux: Version 12.1.0.2.0 - Production on 25-AUG-2015 16:35:18
Copyright (c) 1991, 2014, Oracle.  All rights reserved.
Starting /u01/app/oracle/product/12.1.0.2/dbhome_1/bin/tnslsnr: please wait...
TNSLSNR for Linux: Version 12.1.0.2.0 - Production
System parameter file is /u01/app/oracle/product/12.1.0.2/dbhome_1/network/admin/listener.ora
Log messages written to /u01/app/oracle/diag/tnslsnr/ols/listener/alert/log.xml
Listening on: (DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=ols)(PORT=1521)))
Listening on: (DESCRIPTION=(ADDRESS=(PROTOCOL=ipc)(KEY=EXTPROC1521)))
Connecting to (DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=ols)(PORT=1521)))
STATUS of the LISTENER
------------------------
Alias                     LISTENER
Version                   TNSLSNR for Linux: Version 12.1.0.2.0 - Production
Start Date                25-AUG-2015 16:35:19
Uptime                    0 days 0 hr. 0 min. 1 sec
Trace Level               off
Security                  ON: Local OS Authentication
SNMP                      OFF
Listener Parameter File   /u01/app/oracle/product/12.1.0.2/dbhome_1/network/admin/listener.ora
Listener Log File         /u01/app/oracle/diag/tnslsnr/ols/listener/alert/log.xml
Listening Endpoints Summary...
  (DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=ols)(PORT=1521)))
  (DESCRIPTION=(ADDRESS=(PROTOCOL=ipc)(KEY=EXTPROC1521)))
The listener supports no services
The command completed successfully
Creating and starting Oracle instance
1% complete
.....
25% complete
Adding Oracle JVM
32% complete
....
Adding Oracle Text
50% complete
....
Adding Oracle Multimedia
55% complete
....
Adding Oracle Spatial
69% complete
....
Adding Oracle Application Express
82% complete
87% complete
Completing Database Creation

Next steps: installing Scotas OLS for testing, and happy Docker/Oracle combination :)
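As a quick sanity check (my own addition, not from the original post), you can connect to the new database with the passwords set by the dbca call above, for example sqlplus system/oracle@//<container-ip>:1521/ols, and confirm which options were installed:

-- JServer, Oracle Text, Oracle Multimedia and Spatial should show as VALID
SQL> select comp_name, status from dba_registry order by comp_name;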

Oracle Sales Cloud is a Dynamic Force at Oracle OpenWorld 2015

Linda Fishman Hoyle - Tue, 2015-08-25 16:29

A Guest Post by Michael Richter (pictured left), Director of Product Management, Oracle

OpenWorld is returning to San Francisco–October 25-29, 2015!

Customer Experience (CX) will take over Moscone West 2nd floor this year, and Oracle Sales Cloud is excited to present this year’s lineup of customer experience sessions and demonstrations for sales professionals.

Learn about the newest enhancements or get the latest product demonstrations from the product management and sales consulting teams.

Visit Sales―CX Central@OpenWorld for details. All Oracle Sales Cloud sessions will take place in rooms 2003 or 2004 in Moscone West on the 2nd floor. The Oracle Sales Cloud demo zone is also situated on the 2nd floor. Dates, times, and room numbers will be published at the link above in early September.

What’s New and Different?

  • Release essentials and roadmap sessions for Oracle Sales Cloud and Configure, Price, and Quote
  • Pre-configured industry solution session and product demonstrations
  • Partner-led interactive workshops and hands-on labs
  • Sessions showing new ways to migrate or integrate with existing applications

Guest Customer and Partner Appearances

Deloitte Digital, Panasonic Manufacturing (UK), Wilsonart, Accenture, Schneider Electric, Accuride, Config Consultants, DB Schenker, Swiss Post, KEC International, Batesville, Serene, ustudio, Hitachi Consultants, KPN, Oceaneering, TMEIC, TH March, GPUK (Global Pay), Prisio Technologies, Perficient, Infosys, Apex IT, e-Verge, General Electric, Tuff Shed, and more!

Kick-Off with the Sales General Session!

Deloitte Digital teams up with GVP Bill Hunt, Oracle Sales Cloud Development, to kick off the CX-Sales track (GEN4525) at 1:00 – 2:15 p.m. in Room 2003. Hear about Oracle’s product strategy and what Deloitte Digital is doing to meet today’s customer experience challenges.

The Release Essentials and Roadmap Conference Sessions

Oracle Sales Cloud will host more than 30 conference sessions this year. The first in a series of seven Release Essentials and Roadmap sessions begins at 2:45 p.m. on Monday, October 26. These sessions are led by Oracle Sales Cloud product management team members and include customer and partner case studies.

  • Oracle Sales Cloud Release 11 Essentials and Roadmap for Analytics [CON4508]
  • Oracle Sales Cloud Release 11 Essentials and Roadmap for Sales Performance Management [CON7951]
  • Oracle Sales Cloud Release 11 Essentials and Roadmap for PRM: High Tech and Manufacturing [CON4529]
  • Oracle Sales Cloud Release 11 Essentials and Roadmap for Configuring and Customizing [CON7952]
  • Oracle Sales Cloud Release 11 Essentials and Roadmap for Outlook and IBM Notes [CON7950]
  • Oracle Sales Cloud Release 11 Essentials and Roadmap for Oracle Customer Data Hub and DaaS [CON7958]
  • Oracle Configure, Price, and Quote Cloud Essentials and Roadmap [CON4548]

Sessions of Special Interest

  • Oracle Sales Cloud: A Temperature Check for Customers by Nucleus Research [CON8191]
  • Oracle Sales Cloud: the Path to the Cloud for Siebel Customers [CON4537]
  • Oracle Configure, Price, and Quote Cloud: Driving Sales Productivity at Schneider Electric [CON8187]
  • Oracle Sales Cloud Hands-on Lab: Oracle Sales Cloud Mobile [HOL8735]
  • Oracle Sales Cloud Hands-on Lab: Easy to Use—Anytime, Anywhere [HOL8665]

Customer Panels

We have scheduled two popular customer panels:

  • Oracle Sales Cloud Analytics [CON9140] with Batesville and Fike
  • Oracle Sales Cloud Customer Panel: the Challenges of Digital Transformation [CON7954] with KEC, Tuff Shed, and TH March

Partner Panels

Hear from the experts at the partner panels.

  • Oracle Sales Cloud: Strategies for Successful CRM Implementations [CON8196] with panelists Config Consultants, Infosys, Apex IT, and e-Verge
  • Oracle Sales Cloud: Delivering Key Capabilities of Modern Selling [CON9660] with Accenture, Perficient, Hitachi Consulting, and Serene

Sales Demo Zone

Take part in Oracle Sales Cloud product demonstrations led by members of the Oracle Sales Cloud product management and sales consulting teams. The Buzz House is located next to the CX-Sales demo zone, so you can relax with a cup of coffee or a snack.

  • New enhancements for core sales force automation and mobile solutions
  • How Oracle ensures data integrity with Customer Data Management
  • The latest developments for analytics
  • Streamlining with Configure, Price, and Quote
  • Oracle Sales Cloud integration with MS Outlook and IBM Notes
  • Incentive Compensation, Territory Management, and Quota Management
  • How to configure and customize and the tools available to you
  • Learn about integrations and migration processes and tools
  • Learn what’s in store for midsize companies

At a Glance

Visit Sales―CX Central@OpenWorld for full details on speakers, conference sessions, exhibits, and entertainment, and experience all that Oracle OpenWorld has to offer. Dates, times, and room numbers will be published at the link above in early September.

Latest Oracle Service Cloud Product Release Powers Impactful Community Self-Service Experiences

Linda Fishman Hoyle - Tue, 2015-08-25 16:20

A Guest Post by David Vap (pictured left), Group Vice President, Product Development, Oracle

Today more than one in three customers prefers to contact brands through social channels rather than by phone or email (Nielsen), and the distinction between social and traditional channels is eroding. To deliver the best possible customer experience across traditional and digital channels, customer care organizations need to provide a positive and unified experience where and when customers want, whether they are on Twitter, Facebook, peer-to-peer communities, or other social networks.

Following Oracle’s recent Twitter-enriched social customer service announcement, the latest release of Oracle Service Cloud and Oracle Social Cloud continues to power positive and differentiated customer experiences. The new functionality includes:

New Community Self-Service solution to help streamline the customer journey

  • New approach to web self-service brings community functionality directly into core Service Cloud multi-channel web experience
  • Service Cloud now enables organizations to deliver a seamless experience between web self-service and community interactions, leveraging power of customer knowledge to improve service operations
  • A customer no longer needs to separately navigate self-service and community sites to find an answer, instead discovering and interacting with both formal knowledge (knowledge base) and informal knowledge (community answers) in a single experience

Enhanced social service and incident routing

  • New workflow capabilities between Social Cloud and Service Cloud enable businesses to leverage power of social insights and engagements
  • Business users can now attach contextual attributes and notes from posts or incidents identified by Social Cloud directly to Service Cloud to improve service quality and efficiency by providing more customer information and context

Extended social listening and analytics capabilities to private data sources

  • Enhanced connectivity between Social Cloud and Service Cloud has also extended social listening and analytics to enterprise private-data sources, such as the new community self-service capability, survey data, and chat and call logs.
  • Organizations can now listen and analyze unstructured data and gain insights with terms, themes, sentiment, and customer metrics, and view private and public data side by side in the Oracle SRM.

According to Gartner, investment in peer-to-peer communities drives support costs down and boosts profits. In fact, in a December 2014 Gartner research note entitled “Nine CRM Projects to Do Right Now for Customer Service,” Michael Maoz, Vice President, Distinguished Analyst, Gartner, writes, “Gartner clients who are successful in this space are still seeing on average a 20% reduction in the creation of support tickets following the introduction of peer-to-peer communities.” Maoz goes on to say, “Clients are seeing other business benefits as well. By enabling community-based support, clients have been able to recognize new sales opportunities and increase existing customer satisfaction, resulting in increased revenue in several of these cases.”

For more information about this leading social customer service product, read the news release and check out the VentureBeat profile!

Truncate – 2

Jonathan Lewis - Tue, 2015-08-25 11:25

Following on from my earlier comments about how a truncate works in Oracle, the second oldest question about truncate (and other DDL) appeared on the OTN database forum: “Why isn’t a commit required for DDL?”

Sometimes the answer to “Why” is simply “that’s just the way it is” – and that’s what it is in this case, I think.  There may have been some historic reason why Oracle Corp. implemented DDL the way they did (commit any existing transaction the session is running, then auto-commit when complete), but once the code has been around for a few years – and accumulated lots of variations – it can be very difficult to change a historic decision, no matter how silly it may now seem.

This posting isn’t about answering the question “why”, though; it’s about a little script I wrote in 2003 in response to a complaint from someone who wanted to truncate a table in the middle of a transaction without committing the transaction. Don’t ask why – you really shouldn’t be executing DDL as part of a transactional process (though tasks like dropping and recreating indexes as part of a batch process is a reasonable strategy).

So if DDL always commits the current transaction, how do you truncate a table without committing? Easy – use an autonomous transaction. First a couple of tables with a little data, then a little procedure to do my truncate:


create table t1 (n1 number);
insert into t1 values(1);

create table t2 (n1 number);
insert into t2 values(1);

create or replace procedure truncate_t1
as
        pragma autonomous_transaction;
begin
        execute immediate 'truncate table t1';
end;
/

Then the code to demonstrate the effect:


prompt  ======================================
prompt  In this example we end up with no rows
prompt  in t1 and only the original row in t2,
prompt  the truncate didn't commit the insert.
prompt  ======================================

insert into t2 values(2);

execute truncate_t1;
rollback;

select * from t1;
select * from t2;


According to my notes, the last time I ran this code was on 9.2.0.3 but I’ve just tested it on 12.1.0.2 and it behaves in exactly the same way.

I’ve only tested the approach with “truncate” and “create table”, apparently, and I haven’t made any attempt to see if it’s possible to cause major disruption with cunningly timed concurrent activity; but if you want to experiment you have a mechanism which Oracle could have used to avoid committing the current transaction – and you may be able to find out why it doesn’t, and why DDL is best “auto-committed”.
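For contrast, here’s a minimal sketch of the default behaviour the autonomous transaction avoids – calling truncate directly, so the DDL implicitly commits the insert and the rollback has nothing to undo:

insert into t2 values(3);
truncate table t1;        -- plain DDL: implicitly commits the insert
rollback;                 -- too late, there is nothing to roll back
select * from t2;         -- the row with n1 = 3 is still there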


Autonomous transaction to the rescue

Patrick Barel - Tue, 2015-08-25 10:10

Today, at my current project, I came across an issue where autonomous transactions came in handy.

The situation: I need to create a query to perform an export. A couple of the fields to be selected come from a global temporary table – nothing fancy so far, except that this global temporary table is filled by a (rather complex) procedure. Another problem is that this table is emptied for every row, i.e. it will contain only one row at a time. ‘Just build a wrapper table function for this procedure and have that function call the procedure’ was my first idea.

I created a script that shows the situation:

CREATE GLOBAL TEMPORARY TABLE empdate
(
  empno NUMBER(4)
, hiredate DATE
)
ON COMMIT DELETE ROWS
/
CREATE OR REPLACE PROCEDURE getthehiredate(empno_in IN NUMBER) IS
BEGIN
  DELETE FROM empdate;
  INSERT INTO empdate
    (empno
    ,hiredate)
    (SELECT empno
           ,hiredate
       FROM emp
      WHERE empno = empno_in);
END getthehiredate;
/

Then I set out to build a pipelined table function that accepts a cursor as one of its parameters. This function loops over all the values in the cursor, calls the procedure, reads the data from the global temporary table, and pipes out the resulting record – nothing really fancy so far.

CREATE TYPE empdate_t AS OBJECT
(
  empno    NUMBER(4),
  hiredate DATE
)
/
CREATE TYPE empdate_tab IS TABLE OF empdate_t
/
CREATE OR REPLACE FUNCTION getallhiredates(empnos_in IN SYS_REFCURSOR) RETURN empdate_tab
  PIPELINED IS
  l_empno       NUMBER(4);
  l_returnvalue empdate_t;
BEGIN
  FETCH empnos_in
    INTO l_empno;
  WHILE empnos_in%FOUND LOOP
    getthehiredate(empno_in => l_empno);
    SELECT empdate_t(ed.empno, ed.hiredate)
      INTO l_returnvalue
      FROM empdate ed
     WHERE 1 = 1
       AND ed.empno = l_empno;
    PIPE ROW(l_returnvalue);
    FETCH empnos_in
      INTO l_empno;
  END LOOP;
  RETURN;
END getallhiredates;
/

But when I ran a query against this function:

SELECT *
FROM TABLE(getallhiredates(CURSOR (SELECT empno
FROM emp)))
/

I ran into an error:

ORA-14551: cannot perform a DML operation inside a query 

So, had all the work I had done so far been for nothing? Time wasted? I don’t think so. If there is anything I have learned over the years, it is that Oracle tries to stop you from doing certain things but at the same time supplies you with the tools to create a workaround.

There is something like an autonomous transaction that might help me in this case, so I changed the code for the function a bit:


CREATE OR REPLACE FUNCTION getallhiredates(empnos_in IN SYS_REFCURSOR) RETURN empdate_tab
  PIPELINED IS
  PRAGMA AUTONOMOUS_TRANSACTION;
  l_empno       NUMBER(4);
  l_returnvalue empdate_t;
BEGIN
  FETCH empnos_in
    INTO l_empno;
  WHILE empnos_in%FOUND LOOP
    getthehiredate(empno_in => l_empno);
    SELECT empdate_t(ed.empno, ed.hiredate)
      INTO l_returnvalue
      FROM empdate ed
     WHERE 1 = 1
       AND ed.empno = l_empno;
    PIPE ROW(l_returnvalue);
    FETCH empnos_in
      INTO l_empno;
  END LOOP;
  COMMIT;
  RETURN;
END getallhiredates;
/

But when I ran the query:

SELECT *
FROM TABLE(getallhiredates(CURSOR (SELECT empno
FROM emp)))
/

I ran into a different error:

ORA-06519: active autonomous transaction detected and rolled back

So this doesn’t work, or does it? Pipelined table functions have to ‘exit’ the function multiple times: whenever a row is piped out. So I tried to put the COMMIT just before the PIPE ROW command:


CREATE OR REPLACE FUNCTION getallhiredates(empnos_in IN SYS_REFCURSOR) RETURN empdate_tab
  PIPELINED IS
  PRAGMA AUTONOMOUS_TRANSACTION;
  l_empno       NUMBER(4);
  l_returnvalue empdate_t;
BEGIN
  FETCH empnos_in
    INTO l_empno;
  WHILE empnos_in%FOUND LOOP
    getthehiredate(empno_in => l_empno);
    SELECT empdate_t(ed.empno, ed.hiredate)
      INTO l_returnvalue
      FROM empdate ed
     WHERE 1 = 1
       AND ed.empno = l_empno;
    COMMIT;
    PIPE ROW(l_returnvalue);
    FETCH empnos_in
      INTO l_empno;
  END LOOP;
  RETURN;
END getallhiredates;
/

And when I ran my statement again:

SELECT *
FROM TABLE(getallhiredates(CURSOR (SELECT empno
FROM emp)))
/

It worked as I had hoped.

As you can see, I have tried to mimic the situation using the EMP and DEPT tables. I think this is a nice little trick, but it should be used with caution. It is not without reason that Oracle prevents you from running DML inside a query, but in this case I can bypass the restriction.



.code, .code pre { font-size: small; color: black; font-family: consolas, "Courier New", courier, monospace; background-color: #ffffff; /*white-space: pre;*/ } .code pre { margin: 0em; } .code .rem { color: #ff0000; } .code .kwrd { color: #008080; } .code .str { color: #0000ff; } .code .op { color: #0000c0; } .code .preproc { color: #cc6633; } .code .asp { background-color: [...]