
Feed aggregator

Retailers use business intelligence to optimize supply chain

Chris Foot - Wed, 2014-04-16 10:07

Due to the prevalence of omnichannel retail, merchandisers must satisfy the inventory fulfillment requirements of both brick-and-mortar stores and online consumers. Instead of relying on human staff to scrutinize the distribution process, commodity-based companies are hiring database experts to implement business intelligence tools that provide actionable information about the supply chain.

What's redefining modern delivery systems?
E-commerce has allowed corporations to deliver their products to consumers residing in various parts of the country, creating variable demand for particular items. To anticipate customer desires, data analytics tools are being used to chart regional weather conditions, interpret online surveys and monitor the distribution of materials. Jim Rice, a contributor to Supply Chain 24/7, stated that while transportation and storage processes cannot change the specifications of an item, they can revolutionize the way in which that product is delivered to a customer.

For example, a customized, direct-to-order method can transform consumer expectations. People don't want to wait for their purchases: an unspoken promise of prompt delivery is established the minute they finalize an order on a website. Therefore, database administration personnel employ programs that scrutinize which areas of the supply chain can be optimized to ensure that products are delivered as quickly as possible. The patterns these software solutions recognize are often overlooked by human eyes.

Enhancing global sourcing
Database engineering company Oracle recently announced the introduction of Oracle Global Trade Intelligence, a global commerce analytics application that provides organizations with the ability to leverage worldwide sourcing and distribution data to measure, predict and optimize the performance of their supply chains. Released in February, the program contains modifiable dashboards that enable enterprises to construct user-defined trade performance measurements that scrutinize import and export activities throughout the world.

Oracle experts and sourcing professionals are thrilled with the release, which also offers executives the chance to streamline communications between overseas departments. This process is expected to ensure that all materials are properly tracked, significantly reducing the chance of losing vital products. In addition, the program gives strategists the ability to anticipate the actions of both foreign and domestic competitors.

"Organizations are moving beyond automation of their global trade processes and are seeking ways to leverage their global trade data to make better business decisions," said Derek Gittoes, Oracle's Vice President of Value Chain Execution Product Strategy.

In the age of global commerce, it's imperative that companies possess programs akin to Oracle Global Trade Intelligence in order to expedite the shipment of goods and reduce the cost of those products on the consumer's end.

Announcement: OVCA X4-2

Jean-Philippe Pinte - Wed, 2014-04-16 09:08
Oracle has announced the Oracle Virtual Compute Appliance X4-2.

More information:

Monitoring Oracle Golden Gate from SQL Developer

DBASolved - Wed, 2014-04-16 07:16

Last week I was at Collaborate 14 speaking in two sessions; one of the sessions I had done a couple of times before.  The other session was about the different ways of monitoring Oracle GoldenGate (if you are curious about the presentation, it can be found here).  While at the conference I ran the idea of monitoring GoldenGate from SQL Developer past a few peers, and there seemed to be interest.  As for Oracle, this approach to monitoring GoldenGate is not on Oracle's road map for SQL Developer.

To achieve this goal, SQL Developer's XML extension mechanism is needed.  Using XML extensions, I've been able to turn my SQL-based GoldenGate monitoring queries into a working extension.  The extension is not perfect and still needs some work.  As you can see in image 1, I can get the status of a GoldenGate process and its associated stats.
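For anyone who wants to experiment before trying the extension itself, the general shape of a SQL Developer XML extension is sketched below. The structure (displays/display/query/sql) follows SQL Developer's user-defined report XML format; the query is only a placeholder, since the actual GoldenGate status SQL depends on whatever monitoring or heartbeat tables you have in place, and the table name here is hypothetical.

```xml
<displays>
  <display type="" style="Table" enable="true">
    <name><![CDATA[GoldenGate Process Status (example)]]></name>
    <description><![CDATA[Placeholder query -- point it at your own GoldenGate monitoring/heartbeat tables.]]></description>
    <query>
      <sql><![CDATA[
        select group_name, process_type, status, lag_seconds
        from   my_gg_status_table   -- hypothetical table name
        order  by group_name
      ]]></sql>
    </query>
  </display>
</displays>
```

Files like this can be registered under Tools > Preferences > Database > User Defined Extensions.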

Image 1: GoldenGate process status and associated stats in SQL Developer (screenshot)

The SQL Developer extension for Oracle GoldenGate is available for whoever would like to use it and extend it.  The extension is included with my other GoldenGate monitoring scripts, located here and on my scripts page.

Note: at some point I will hopefully get this extension uploaded to a GitHub repository for community digestion.

This extension is meant to help DBAs monitor their GoldenGate environments without needing to go directly to the server. For now it just gives up/down status and operation stats.  Hopefully, as it matures (as I and others work on it), it will become a robust extension for all monitoring of Oracle GoldenGate.


twitter: @dbasolved


Filed under: Golden Gate, Replication
Categories: DBA Blogs

Choice Hotels Reduces Time to Market with Oracle WebCenter

WebCenter Team - Wed, 2014-04-16 07:00
Choice Hotels is a global hotel franchise with 6,000+ franchisees. Choice is in the process of completely rebuilding their distribution platform for the next generation.  They currently process and distribute over $6B worth of reservation activity to the franchisees of Choice Hotels. To continue to grow and support the strategic business initiatives of the company, they needed to address the primary vehicle for delivering business to their customers.  Their current platform was an amalgam of 25 years of development on disparate systems that offered limited interoperability and scalability.  A strategic decision was made to adopt the Oracle technology stack to build out this new platform.

They introduced more self-service opportunities to their business to create greater empowerment and flexibility, thus reducing the dependency on IT to introduce new features to their production systems. They are using WebCenter Sites as their Web Experience Management platform, deployed for their international websites. Standardizing on WebCenter Sites has allowed them to consolidate four different Content Management System platforms and create a Content Center of Excellence, making it easier for any of their business users to take advantage of the WYSIWYG interface.

Perhaps the biggest win at the enterprise level is standardizing on the Oracle platform. They have seen tremendous benefits by being standardized on a focused set of tools and applications.  Their development, testing, administration, and database engineers now have greater flexibility and mobility throughout their enterprise.  Since they all have the same relative skill set, they can flex on certain projects in order to get them to market quicker.  No more Sys Admins having to learn the ins and outs of a particular system and its idiosyncrasies because everything is now a standard Oracle deployment.

To learn more about delivering the best digital experience today, join us for the Oracle WebCenter Digital Business Thought Leadership Series.

This week:

Delivering Moments of Engagement Across the Enterprise
Five Steps for Mobilizing Digital Experiences

Geoffrey Bock, Principal, Bock & Company
Michael Snow, Product Marketing Director, Oracle WebCenter

HGV Levy

Tony Andrews - Wed, 2014-04-16 04:45
The UK government has introduced a new service for foreign lorry drivers to pay a levy to use UK roads here: It was built by my current employer, Northgate Information Solutions. Guess what technology it runs on? We had a lot of interesting challenges when building this:

  • Compliance with UK Government styling and standards
  • Responsive design to work on …

Read Oracle Linux 7 Beta 1

Surachart Opun - Wed, 2014-04-16 03:53
It might be a bit late to post about Oracle Linux 7 (Beta 1); I just came back from a long holiday in Thailand. I think it's very interesting to learn what's new in OL7. Users can download it using an OTN account.
Release Note.

After installing it, I tested a few things.
[root@ol7beta ~]# cat /etc/oracle-release
Oracle Linux Everything release 7.0 Beta
[root@ol7beta ~]# uname -r
Users can choose to start with RHCK or UEK3.

Oracle Linux 7 provides the temporary file system (tmpfs), which is configured in volatile memory and whose contents do not persist after a system reboot.
[root@ol7beta ~]# df
Filesystem          1K-blocks   Used Available Use% Mounted on
/dev/mapper/ol-root  49747968 962512  48785456   2% /
devtmpfs               886508      0    886508   0% /dev
tmpfs                  893876      0    893876   0% /dev/shm
tmpfs                  893876   2212    891664   1% /run
tmpfs                  893876      0    893876   0% /sys/fs/cgroup
/dev/sda1              487652  91380    366576  20% /boot
[root@ol7beta ~]# systemctl status  tmp.mount
tmp.mount - Temporary Directory
   Loaded: loaded (/usr/lib/systemd/system/tmp.mount; disabled)
   Active: inactive (dead)
    Where: /tmp
     What: tmpfs
     Docs: man:hier(7)
[root@ol7beta ~]# systemctl start  tmp.mount
[root@ol7beta ~]# systemctl status  tmp.mount
tmp.mount - Temporary Directory
   Loaded: loaded (/usr/lib/systemd/system/tmp.mount; disabled)
   Active: active (mounted) since Wed 2014-04-16 05:33:32 ICT; 1s ago
    Where: /tmp
     What: tmpfs
     Docs: man:hier(7)
  Process: 16209 ExecMount=/bin/mount tmpfs /tmp -t tmpfs -o mode=1777,strictati                                                                                        me (code=exited, status=0/SUCCESS)
Apr 16 05:33:32 ol7beta systemd[1]: Mounting Temporary Directory...
Apr 16 05:33:32 ol7beta systemd[1]: tmp.mount: Directory /tmp to mount over...y.
Apr 16 05:33:32 ol7beta systemd[1]: Mounted Temporary Directory.
Hint: Some lines were ellipsized, use -l to show in full.
[root@ol7beta ~]# df
Filesystem          1K-blocks   Used Available Use% Mounted on
/dev/mapper/ol-root  49747968 962344  48785624   2% /
devtmpfs               886508      0    886508   0% /dev
tmpfs                  893876      0    893876   0% /dev/shm
tmpfs                  893876   2292    891584   1% /run
tmpfs                  893876      0    893876   0% /sys/fs/cgroup
/dev/sda1              487652  91380    366576  20% /boot
tmpfs                  893876      0    893876   0% /tmp

Note: after installation, the "ifconfig" command was not found.
[root@ol7beta ~]# ip addr
1: lo: mtu 65536 qdisc noqueue state UNKNOWN
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet scope host lo
    inet6 ::1/128 scope host
       valid_lft forever preferred_lft forever
2: p2p1: mtu 1500 qdisc pfifo_fast state UP qlen 1000
    link/ether 08:00:27:48:ff:7f brd ff:ff:ff:ff:ff:ff
    inet scope global p2p1
    inet6 fe80::a00:27ff:fe48:ff7f/64 scope link
       valid_lft forever preferred_lft forever

The "ip addr" output format differs from the familiar ifconfig output. So I set up a local yum repository from the install media, installed the ifconfig command, and tested it.
[root@ol7beta ~]# rpm -qa |grep createrepo
[root@ol7beta ~]# mount /dev/cdrom /mnt
mount: /dev/sr0 is write-protected, mounting read-only
[root@ol7beta ~]# cd /mnt/Packages/
[root@ol7beta Packages]# rpm -ivh createrepo-0.9.9-21.el7.noarch.rpm
warning: createrepo-0.9.9-21.el7.noarch.rpm: Header V3 RSA/SHA256 Signature, key ID ec551f03: NOKEY
error: Failed dependencies:
        deltarpm is needed by createrepo-0.9.9-21.el7.noarch
        libxml2-python is needed by createrepo-0.9.9-21.el7.noarch
        python-deltarpm is needed by createrepo-0.9.9-21.el7.noarch
[root@ol7beta Packages]# cd /mnt
[root@ol7beta mnt]# createrepo .
-bash: createrepo: command not found
[root@ol7beta mnt]#
[root@ol7beta mnt]#
[root@ol7beta mnt]#  cd /mnt/Packages/
[root@ol7beta Packages]# rpm -ivh createrepo-0.9.9-21.el7.noarch.rpm deltarpm-3.6-1.el7.x86_64.rpm  libxml2-python-2.9.1-2.0.1.el7.x86_64.rpm  python-deltarpm-3.6-1.el7.x86_64.rpm
warning: createrepo-0.9.9-21.el7.noarch.rpm: Header V3 RSA/SHA256 Signature, key ID ec551f03: NOKEY
Preparing...                          ################################# [100%]
Updating / installing...
   1:deltarpm-3.6-1.el7               ################################# [ 25%]
   2:python-deltarpm-3.6-1.el7        ################################# [ 50%]
   3:libxml2-python-2.9.1-2.0.1.el7   ################################# [ 75%]
   4:createrepo-0.9.9-21.el7          ################################# [100%]
[root@ol7beta Packages]# cd /mnt
[root@ol7beta mnt]# yum clean all
There are no enabled repos.
 Run "yum repolist all" to see the repos you have.
 You can enable repos with yum-config-manager --enable
[root@ol7beta mnt]# yum repolist all
repolist: 0
[root@ol7beta mnt]# vi /etc/yum.repos.d/iso.repo
[root@ol7beta mnt]# cat /etc/yum.repos.d/iso.repo
name=Local CD Repo
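The cat output above lost everything except the name= line, so here is a guess at what a complete iso.repo for the mounted DVD would look like. The [local] section header matches the repo id shown by yum below; the baseurl and gpgkey paths are assumptions based on the /mnt mount point. (The sketch writes the file to the current directory; on the real system it belongs in /etc/yum.repos.d/.)

```shell
# Hypothetical reconstruction of /etc/yum.repos.d/iso.repo -- only the
# name= line survived in the transcript; baseurl/gpgkey are assumptions
# based on the DVD being mounted at /mnt.
cat > iso.repo <<'EOF'
[local]
name=Local CD Repo
baseurl=file:///mnt
gpgcheck=1
gpgkey=file:///mnt/RPM-GPG-KEY
enabled=1
EOF
```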
[root@ol7beta mnt]# yum clean all
Cleaning repos: local
Cleaning up everything
[root@ol7beta mnt]# yum repolist all
local                                                                                                                                            | 3.6 kB  00:00:00
(1/2): local/group_gz                                                                                                                            | 112 kB  00:00:00
(2/2): local/primary_db                                                                                                                          | 4.0 MB  00:00:00
repo id                                                                   repo name                                                                       status
local                                                                     Local CD Repo                                                                   enabled: 4,628
repolist: 4,628
[root@ol7beta mnt]# yum provides */ifconfig
local/filelists_db                                                                                                                               | 3.5 MB  00:00:00
net-tools-2.0-0.13.20131004git.el7.x86_64 : Basic networking tools
Repo        : local
Matched from:
Filename    : /sbin/ifconfig
[root@ol7beta mnt]# yum install net-tools-2.0-0.13.20131004git.el7.x86_64
Resolving Dependencies
--> Running transaction check
---> Package net-tools.x86_64 0:2.0-0.13.20131004git.el7 will be installed
--> Finished Dependency Resolution
Dependencies Resolved
 Package                               Arch                               Version                                               Repository                         Size
 net-tools                             x86_64                             2.0-0.13.20131004git.el7                              local                             303 k
Transaction Summary
Install  1 Package
Total download size: 303 k
Installed size: 917 k
Is this ok [y/d/N]: y
Downloading packages:
warning: /mnt/Packages/net-tools-2.0-0.13.20131004git.el7.x86_64.rpm: Header V3 RSA/SHA256 Signature, key ID ec551f03: NOKEY
Public key for net-tools-2.0-0.13.20131004git.el7.x86_64.rpm is not installed
Retrieving key from file:///mnt/RPM-GPG-KEY
Importing GPG key 0xEC551F03:
 Userid     : "Oracle OSS group (Open Source Software group) "
 Fingerprint: 4214 4123 fecf c55b 9086 313d 72f9 7b74 ec55 1f03
 From       : /mnt/RPM-GPG-KEY
Is this ok [y/N]: y
Running transaction check
Running transaction test
Transaction test succeeded
Running transaction
Warning: RPMDB altered outside of yum.
  Installing : net-tools-2.0-0.13.20131004git.el7.x86_64                                                                                                            1/1
  Verifying  : net-tools-2.0-0.13.20131004git.el7.x86_64                                                                                                            1/1
  net-tools.x86_64 0:2.0-0.13.20131004git.el7
[root@ol7beta mnt]# ifconfig -a
lo: flags=73  mtu 65536
        inet  netmask
        inet6 ::1  prefixlen 128  scopeid 0x10
        loop  txqueuelen 0  (Local Loopback)
        RX packets 0  bytes 0 (0.0 B)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 0  bytes 0 (0.0 B)
        TX errors 0  dropped 0  overruns 0  carrier 0  collisions 0

p2p1: flags=4163  mtu 1500
        inet  netmask  broadcast
        inet6 fe80::a00:27ff:fe48:ff7f  prefixlen 64  scopeid 0x20
        ether 08:00:27:48:ff:7f  txqueuelen 1000  (Ethernet)
        RX packets 4847  bytes 541675 (528.9 KiB)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 3591  bytes 1145806 (1.0 MiB)
        TX errors 0  dropped 0  overruns 0  carrier 0  collisions 0

Users must also get to know the "systemctl" command.
[root@ol7beta ~]#
[root@ol7beta ~]# type systemctl
systemctl is /usr/bin/systemctl
Written By: Surachart Opun
Categories: DBA Blogs

Interacting client-side JavaScript with server-side PeopleCode

Duncan Davies - Wed, 2014-04-16 03:23

This week we have a guest-post from Richard Yip. Richard is a PeopleSoft HCM & Payroll Technical Consultant at Santander Bank here in the UK. He is a long-term PeopleSoft techie and a regular at the UKOUG conferences.  I’m delighted to be able to give him a conduit through which to share some of his good work.

Richard Yip: How to interact client-side Javascript with server-side PeopleCode

(using getElementById.focus(), getElementById.submit(), submitAction_win0(this.form, and window.showModalDialog)

This example shows a custom-built payroll results calendar that allows the user to display multiple sets of payroll results on a single page, without needing to navigate out of the page as the delivered functionality currently requires.

Additionally, it allows the display of related information such as payslip and 12 Months Fiscal Summary (pivoted summary results).

Note: results have been “desensitized”.

Something that looks like this:

Example 1
Click for bigger

The App Designer page, PeopleCode, SQL, JavaScript, CSS and HTML are shown below. The HTML/CSS were initially prototyped using Microsoft Office SharePoint Designer.

Screen shots

Main page after selecting an employee from a search page. Example 2


The menu bar above is created by HTML.RY_GP_RSLT_CAL_PIVOT_SQL_HDR

The following code is used:

PeopleCode :

Displaying the results calendar

When ‘Show Calendar’ is clicked, the payroll results for this calendar are now visible. See function Show_Calendar() and Hide_Calendar() in Javascript collection in HTML.RY_GP_RSLT_CUSTOM_JS.

Example 3
Click for bigger

Displaying the payroll results information

When the user clicks on any Calendar ID, highlighted in red, the PeopleCode is invoked via client-side JavaScript.

To invoke PeopleCode, I used the JavaScript DOM method – see function do_calid(obj) in GetHTMLText(HTML.RY_GP_RSLT_CUSTOM_JS).


The screen shots below show three different calendar group/calendar ID combinations clicked by the user using the results calendar above.

Example 4
Click for bigger

The Clear Data button will remove the specific payroll result. Basically, each calendar's results are contained in a <DIV> tag with an ID. The following line takes the id of the <div> tag, for example BOX1, passes it to the field DERIVED_RY_GP10.ID_LBL, and then automatically clicks the field DERIVED_RY_GP10_CLEAR_BUTTON. This is similar to the DOM click() function previously mentioned. See HTML.RY_GP_RSLT_SEG_HDR

<div id='%Bind(:12)' class="result_container">
<input class='calendar_container' id="DERIVED_RY_GP10_CLEAR_BUTTON" title="ID" style="WIDTH: 72px" onclick="this.form.DERIVED_RY_GP10_ID_LBL.value='%Bind(:12)';submitAction_win0(this.form,;" type="button" value="Clear Data" name="DERIVED_RY_GP10_CLEAR_BUTTON">

The Copy to Excel  button will download the information to Excel. This uses the ActiveXObject(“Excel.Application”) function. See HTML.RY_GP_RSLT_HDR

Display the 12 Months Fiscal Summary

Using the window.showModalDialog function displays the 12 Months Fiscal Summary. See Get_Summary() in RY_GP_RSLT_CUSTOM_JS. Also see Get_Payslip().
Other methods may include an <iframe>.

Example 5
Click for bigger

Summary/Design considerations

You need strong knowledge of HTML, JavaScript, CSS and SQL in addition to PeopleCode.

Using an HTML editor helps during the prototyping stage.

In this example, it's all about presentation, as getting the data from the database via SQL is fairly easy.

I have used the SQL XMLElement function to inject some HTML, albeit with a bit of trickery.

Page Field Name and Enable as Page Anchor give the developer control over how the field is referenced in the HTML.

Code Listing

How to interact client side Javascript with server side PeopleCode


Preview of Maria Colgan and Andrew Bond/Stewart Bryson Sessions at RM BI Forum 2014

Rittman Mead Consulting - Wed, 2014-04-16 02:11

We’ve got a great selection of presentations at the two upcoming Rittman Mead BI Forum 2014 events in Brighton and Atlanta, including sessions on Endeca, TimesTen, OBIEE (of course), ODI, GoldenGate, Essbase and Big Data (full timetable for both events here). Two of the sessions I’m particularly looking forward to though are ones by Maria Colgan, product manager for the new In-Memory Option for Oracle Database, and another by Andrew Bond and Stewart Bryson, on an update to Oracle’s reference architecture for Data Warehousing and Information Management.

The In-Memory Option for Oracle Database was of course the big news item from last year's Oracle OpenWorld, promising to bring in-memory analytics and column storage to the Oracle Database. Maria is of course well known to the Oracle BI and data warehousing community through her work on the Oracle Database Cost-Based Optimizer, so we're particularly glad to have her at the Atlanta BI Forum 2014 to talk about what's coming with this new feature. I asked Maria to jot down a few words for the blog on what she'll be covering, so over to Maria:

“At Oracle Open World last year, Oracle announced the upcoming availability of the Oracle Database In-Memory option, a solution for accelerating database-driven business decision-making to real-time. Unlike specialized In-Memory Database approaches that are restricted to particular workloads or applications, Oracle Database 12c leverages a new in-memory column store format to speed up analytic workloads.

“Given this announcement and the performance improvements promised by this new functionality, is it still necessary to create a separate access and performance layer in your data warehouse environment, or to run your Oracle data warehouse on an Exadata environment?

This session explains in detail how Oracle Database In-Memory works and will demonstrate just how much performance improvement you can expect. We will also discuss how it integrates with the existing Oracle data warehousing architecture and with an Exadata environment.”

The other session I’m particularly looking forward to is one being delivered jointly by Andrew Bond, who heads up Enterprise Architecture at Oracle and was responsible, along with Doug Cackett, for the various data warehousing, information management and big data reference architectures we’ve covered on the blog over the past few years, including the first update to include “big data” a year or so ago.


Back towards the start of this year, Stewart, myself and Jon Mead met up with Andrew and his team to work together on an update to this reference architecture, and Stewart carried on with the collaboration afterwards, bringing in some of our ideas around agile development, big data and data warehouse design into the final architecture. Stewart and Andrew will be previewing the updated reference architecture at the Brighton BI Forum event, and in the meantime, here’s a preview from Andrew:

“I’m very excited to be attending the event and unveiling Oracle’s latest iteration of the Information Management reference architecture. In this version we have focused on a pragmatic approach to “Analytics 3.0” and in particular looked at bringing an agile methodology to break the IT/business barrier. We’ve also examined exploitation of in-memory technologies and the Hadoop ecosystem, and guiding users through the plethora of new technology choices.

We’ve worked very closely with a number of key customers and partners on this version – most notably Rittman Mead and I’m delighted that Stewart and I will be able to co-present the architecture and receive immediate feedback from delegates.”

Full details of the event, running in Brighton on May 7-9th 2014 and Atlanta, May 15th-17th 2014, can be found on the Rittman Mead BI Forum 2014 homepage, and the agendas for the two days are on this blog post from earlier in the week.

Categories: BI & Warehousing

The worst database developers in the world?

DBMS2 - Wed, 2014-04-16 00:45

If the makers of MMO RPGs (Massive Multi-Player Online Role-Playing Games) aren’t quite the worst database application developers in the world, they’re at least on the short list for consideration. The makers of Guild Wars didn’t even try to have decent database functionality. A decade later, when they introduced Guild Wars 2, the database-oriented functionality (auction house, real-money store, etc.) would crash for days at a time. Lord of the Rings Online evidently had multiple issues with database functionality. Now I’m playing Elder Scrolls Online, which on the whole is a great game, but which may have the most database screw-ups of all.

ESO has been live for less than 3 weeks, and in that time:

1. There’s been a major bug in which players’ “banks” shrank, losing items and so on. Days later, the data still hasn’t been recovered. After a patch, the problem, if anything, worsened.

2. Guild functionality has at times been taken down while the rest of the game functioned.

3. Those problems aside, bank and guild bank functionality are broken, via what might be considered performance bugs. Problems I repeatedly encounter include:

  • If you deposit a few items, the bank soon goes into a wait state where you can’t use it for a minute or more.
  • Similarly, when you try to access a guild — i.e. group — bank, you often find it in an unresponsive state.
  • If you make a series of updates a second apart, the game tells you you’re doing things too quickly, and insists that you slow down a lot.
  • Items that are supposed to “stack” appear in 2 or more stacks; i.e., a very simple kind of aggregation is failing. There are also several other related recurring errors, which I conjecture have the same underlying cause.

In general, it seems that what should be a collection of database records is really just a list, parsed each time an update occurs and periodically flushed in its entirety to disk, with all the performance problems you’d expect from that kind of choice.

4. Even stupider are the in-game stores, where fictional items are sold for fictional money. They have an e-commerce interface that is literally 15+ years out of date — items are listed with VERY few filtering options, and there is no way to change the sort. But even that super-primitive interface doesn’t work; in particular, filter queries frequently return incorrect empty-set responses.

5. Much as in other games, over 10 minutes of state changes can be lost.

Except perhaps for #5, these are all functions that are surely only loosely coupled to the rest of the game, so the other difficulties of game scaling and performance should have no bearing on them. There’s no excuse, then, for doing such a terrible job of development on large portions of gameplay functionality.

Based on job listings, ESO developer Zenimax doesn’t see database functionality as a major area to fix. This makes me sad.

Categories: Other

Coursera CEO Interview: Mike Caulfield nailed it two months ago

Michael Feldstein - Tue, 2014-04-15 20:03

Two months ago Mike Caulfield lamented the inability of many people in online education, especially massive online initiatives, to honestly learn from the past. In the post Mike referred to the failed AllLearn initiative and the seminal post-mortem written up in University Business.

How does that relate? A paragraph from the 2006 post-mortem of AllLearn really stuck out for me:

Oxford, Yale, and Stanford have kept quiet about the collapse of their joint e-learning venture…[h]owever, AllLearn’s closure could offer an unprecedented opportunity to step back and discuss the strengths and weaknesses of the business model… Further research into the series of collapsed online ventures may shed some light on what makes a successful distance education program, and enable some of the surviving online providers to redefine their business models and marketing strategies accordingly

Of course they don’t delve into these things honestly, and as a result most people in these institutions are unaware of them. Like Leonard, the institutions alter the record of the past. They wake up the next day with amnesia, consult a set of dramatically altered notes, and wonder why no one has tried massive Ivy League courses yet. The PR push to cover one’s tracks ends up erasing the institutional knowledge that could build a better initiative.

Little did Mike realize that he was writing a script.

One month later Coursera hired Richard Levin as its new CEO. As president of Yale, Levin was one of the key figures in the creation of AllLearn in 2000, and after the initiative's collapse in 2006 he was one of the key figures directly responsible for the Open Yale Courses initiative.

The consensus view is that AllLearn failed to generate enough interest in its non-credit elite courses, and that the program subsequently closed for economic reasons (by Levin's own previous admission). In 2005 AllLearn attempted to address this challenge by branching beyond alumni, as related in this Yale Daily News post [emphasis added in all quotes below].

“I think we’ve learned a lot from the experiment,” Levin said. “While I believe we’ve produced some very high quality courses, we’ve learned that it’s hard to generate sufficiently large audiences for these courses from just the alumni of the three partner schools. So we’ve reached out to attract broader audiences through affiliating with universities and through finding other organizations that might have an interest in making courses available to members.”

Fast forward a year, and it is clear that the effort had failed economically despite the broadening of audiences, again from the Yale Daily News.

Yale President Richard Levin, who served as AllLearn’s chairman, said he thinks that while the participating institutions learned what is necessary to manage a successful distance learning program, they were unable to make the project financially viable.

“We are disappointed that we weren’t able to find a way to make this successful economically,” Levin said. “[But] we learned a lot, and I think it will serve us well in the future.”

Open Yale Courses also provides non-credit elite courses. The problem? You might have guessed it, as described by this 2012 report for the Committee on Online Education.

Open Yale Courses has been supported by generous grants from the Hewlett Foundation, but those grants are ending this semester; and there is no provision yet for the continuation of this program. There has been extensive planning, however, to keep the 42 existing courses on the Yale site as well as the iTunes U and YouTube platforms. All of the courses are being stored and preserved for future study. New visitors are discovering Open Yale Courses daily and global media coverage, which has been constant and enthusiastic since the start of the project, continues unabated.

The initiative is now attempting direct solicitation as a method of funding.

I don’t mean to question Levin’s good intentions nor his considerable support of the mission of making education more accessible through online technology. However, I find it disingenuous to try to alter history. This week the New York Times interviewed Levin about his new role as Coursera CEO; the reporter asked some good questions but lacked follow-up.

Q. Yale has not exactly been a mass institution.

A. No, but we were early in the on-line arena, with a venture back in 2000 called All-Learn.

Q. How much did you lose, and why didn’t that spoil this for you?

A. It was too early. Bandwidth wasn’t adequate to support the video. But we gained a lot of experience of how to create courses, and then we used it starting in 2007 to create very high quality videos, now supported by adequate bandwidth in many parts of the world, with the Open Yale courses. We’ve released over 40 of them, and they gained a wide audience.

So here we have yet another initiative offering non-credit elite courses, and one of the biggest challenges that Coursera faces is that it has yet to find a viable business model. The company is living on $85 million in venture capital investment and has not yet found revenue sources that go beyond a few million dollars per year (Signature Track). Levin called out this challenge in the same NYT interview.

Q. Doesn’t edX have an advantage in being not-for-profit, meaning they don’t have to worry about returning on investment so soon? Yesterday Andrew Ng said, “We’ve raised $85 million, so we’ve got some runway.” How much runway?

A. I think the principal investors in Coursera understand that this is a long term play. We’re fortunate to have patient investors; and as Andrew said, we’re quite adequately capitalized. I think we can become financially viable certainly within that five-year framework.

Q. You’re an economist. How do you get from here to there?

A. Right now courses are free and we’re charging for certification. We think that as the idea of using Coursera courses for professional advancement grows, the numbers seeking certificates will grow. And the price we charge probably can grow, too. A move from $50 or $60 for Signature Track to $100 is certainly imaginable. At $100 a pop, if you had two or three, or five million people. …
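Levin's answer above is a back-of-the-envelope revenue projection, and it is worth making the arithmetic explicit. The sketch below is a minimal illustration using only the scenario values from his quote (a $100 certificate price and two to five million certificate buyers); these are hypotheticals from the interview, not actual Coursera figures.

```python
# Back-of-the-envelope Signature Track revenue, using only the scenario
# values from Levin's quote. These are hypotheticals, not actuals.

def projected_revenue(price_per_certificate, certificate_buyers):
    """Annual certificate revenue under the stated assumptions."""
    return price_per_certificate * certificate_buyers

low = projected_revenue(100, 2_000_000)   # conservative end of the quote
high = projected_revenue(100, 5_000_000)  # optimistic end of the quote

print(f"${low:,} - ${high:,} per year")  # $200,000,000 - $500,000,000 per year
```

Even the conservative end of that range would be a roughly hundred-fold jump over the "few million dollars per year" the company reportedly earns today, which is why the price and volume assumptions deserve scrutiny.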

I would suggest that Coursera will not “get from here to there” by altering the record of the past. AllLearn failed to generate sufficient interest in its courses, and the proximate cause was not “insufficient bandwidth”. AllLearn in fact offered several workarounds for bandwidth constraints, including CD-ROM delivery and the ability to turn off high-bandwidth features. AllLearn’s average learner was a 47-year-old Yale alumnus – hardly a population lacking access to sufficient bandwidth. Plenty of online ventures started in 2000 or earlier have succeeded – Penn State’s World Campus, UMUC, most large for-profits, UMassOnline, University of Central Florida, etc. This was not a case of being “too early”.

Read the University Business post-mortem and the Yale article. The issue was economics: insufficient revenue to offset expenses.

Coursera and all the xMOOC providers face this same long-term challenge of finding an adequate business model. I called out this challenge as one of the four key barriers facing MOOCs in a July 2012 post. I speak as someone who would like to see MOOCs succeed – not in their current form, but in a form that evolves to better meet learner needs. This healthy evolution won’t happen, however, unless the providers honestly evaluate the lessons of the past.


The post Coursera CEO Interview: Mike Caulfield nailed it two months ago appeared first on e-Literate.

Using test prep software to prepare for 12c OCP upgrade exam

Bobby Durrett's DBA Blog - Tue, 2014-04-15 17:54

I got the newly available Kaplan test prep software for the Oracle 12c OCP upgrade exam.

I took the test in certification mode at the end of the day one day last week, when I was tired, and got 44% right – fail! I usually wait until I can get all the questions right before taking the real test, so I have a ways to go.

The practice test software has been useful in showing me things I didn’t study very well, or at all. I’m expecting to significantly improve my percentage of correct answers on my next pass.

I’m a little nervous though because it seems that the real test involves some questions that are generic database questions and I don’t think that the test prep software includes that section. If you look at the list of topics they have a section called “Key DBA Skills”. I’d hope that after 19 years as an Oracle DBA I’d have some skills, but there are plenty of things I don’t do every day, such as setting up ASM. I guess I’ll just have to bone up on the key areas of pre-12c that I don’t use all the time and hope I’m not surprised.

Anyway, I’m at 44% but hoping to make some strides in the next few weeks.

- Bobby


Categories: DBA Blogs

Disable IAMSuiteAgent

Online Apps DBA - Tue, 2014-04-15 17:25
This post gives an insight into IAMSuiteAgent and how to disable it. IAMSuiteAgent is a pre-built Java agent that comes with OAM 11g by default. A few important points about IAMSuiteAgent: it is a domain-wide agent – once Access Manager is deployed, the IAMSuiteAgent is installed on every server in the domain. Unless disabled, [...]

This is a content summary only. Visit my website for full links, other content, and more!
Categories: APPS Blogs

April 2014 Critical Patch Update Released

Oracle Security Team - Tue, 2014-04-15 14:04

Hello, this is Eric Maurice again.

Oracle today released the April 2014 Critical Patch Update.  This Critical Patch Update provides fixes for 104 vulnerabilities across a number of product lines including: Oracle Database, Oracle Fusion Middleware, Oracle Hyperion, Oracle Supply Chain Product Suite, Oracle iLearning, Oracle PeopleSoft Enterprise, Oracle Siebel CRM, Oracle Java SE, Oracle and Sun Systems Products Suite, Oracle Linux and Virtualization, and Oracle MySQL.  A number of the vulnerabilities fixed in this Critical Patch Update have high CVSS Base Scores and are highlighted in this blog entry.  Oracle recommends this Critical Patch Update be applied as soon as possible.

Out of the 104 vulnerabilities fixed in the April 2014 Critical Patch Update, 2 were for the Oracle Database.  The most severe of these database vulnerabilities received a CVSS Base Score of 8.5 for the Windows platform to denote a full compromise of the targeted system, although successful exploitation of this bug requires authentication by the malicious attacker.  On other platforms (e.g., Linux, Solaris), the CVSS Base Score is 6.0, because a successful compromise would be limited to the Database and would not extend to the underlying Operating System.  Note that Oracle reports this kind of vulnerability with the ‘Partial+’ value for Confidentiality, Integrity, and Availability impact (Partial+ is used when the exploit affects a wide range of resources, e.g., all database tables).  Oracle makes a strict application of the CVSS 2.0 standard, and as a result, the Partial+ does not result in an inflated CVSS Base Score (CVSS only provides for ‘None,’ ‘Partial,’ or ‘Complete’ to report the impact of a bug).  This custom value is intended to call customers’ attention to the potential impact of the specific vulnerability and enable them, if they wish, to manually increase the severity rating.  For more information about Oracle’s use of CVSS, see
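The platform-dependent scores described above fall directly out of the published CVSS 2.0 base-score equations. The sketch below reproduces that arithmetic; note that the metric vectors are my own assumption (Network access, Medium complexity, Single authentication) chosen to illustrate how a Complete-impact vector versus a Partial-impact vector yields 8.5 versus 6.0 – they are not copied from Oracle's advisory.

```python
# CVSS v2.0 base score, per the standard's published equations.
# The example vectors are illustrative assumptions, not values taken
# from Oracle's risk matrices.

def cvss2_base(av, ac, au, c, i, a):
    """Compute a CVSS 2.0 base score from the six base-metric weights."""
    impact = 10.41 * (1 - (1 - c) * (1 - i) * (1 - a))
    exploitability = 20 * av * ac * au
    f_impact = 0.0 if impact == 0 else 1.176
    return round((0.6 * impact + 0.4 * exploitability - 1.5) * f_impact, 1)

# Standard CVSS 2.0 weights: Network access, Medium access complexity,
# Single-instance authentication required.
AV_N, AC_M, AU_S = 1.0, 0.61, 0.56
COMPLETE, PARTIAL = 0.660, 0.275

windows_score = cvss2_base(AV_N, AC_M, AU_S, COMPLETE, COMPLETE, COMPLETE)
other_platforms = cvss2_base(AV_N, AC_M, AU_S, PARTIAL, PARTIAL, PARTIAL)
print(windows_score, other_platforms)  # 8.5 6.0
```

The 2.5-point gap between the two scores comes entirely from the impact term, which is why limiting a compromise to the database (Partial impact) rather than the whole host (Complete impact) matters so much to the rating.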

This Critical Patch Update also provides fixes for 20 Fusion Middleware vulnerabilities.  The highest CVSS Base Score for these Fusion Middleware vulnerabilities is 7.5.  This score applies to a vulnerability in Oracle WebLogic Server (CVE-2014-2470) that is remotely exploitable without authentication.  If successfully exploited, this vulnerability can result in a wide compromise of the targeted WebLogic Server (Partial+ rating for Confidentiality, Integrity, and Availability.  See the previous discussion about the meaning of the ‘Partial+’ value reported by Oracle).

Also included in this Critical Patch Update were fixes for 37 Java SE vulnerabilities.  4 of these Java SE vulnerabilities received a CVSS Base Score of 10.0.  29 of these 37 vulnerabilities affected client-only deployments, while 6 affected client and server deployments of Java SE.  Rounding out this count were one vulnerability affecting the Javadoc tool and one affecting unpack200.  As a reminder, desktop users, including home users, can leverage the Java Autoupdate or visit to ensure that they are running the most recent version of Java.  Java SE security fixes delivered through the Critical Patch Update program are cumulative.  In other words, running the most recent version of Java provides users with the protection resulting from all previously-released security fixes.  Oracle strongly recommends that Java users, particularly home users, keep up with Java releases and remove obsolete versions of Java SE, so as to protect themselves against malicious exploitation of Java vulnerabilities.

This Critical Patch Update also included fixes for 5 vulnerabilities affecting Oracle Linux and Virtualization products suite.  The most severe of these vulnerabilities received a CVSS Base Score of 9.3, and this vulnerability (CVE-2013-6462) affects certain versions of Oracle Global Secure Desktop. 

Due to the relative severity of a number of the vulnerabilities fixed in this Critical Patch Update, Oracle strongly recommends that customers apply this Critical Patch Update as soon as possible.  In addition, as previously discussed, Oracle does not test unsupported products, releases and versions for the presence of vulnerabilities addressed by each Critical Patch Update.  However, it is often the case that earlier versions of affected releases are affected by vulnerabilities fixed in recent Critical Patch Updates.  As a result, it is highly desirable that organizations running unsupported versions, for which security fixes are no longer available under Oracle Premier Support, update their systems to a currently-supported release so as to fully benefit from Oracle’s ongoing security assurance effort.

For more information:

The April 2014 Critical Patch Update Advisory is located at

More information about Oracle’s application of the CVSS scoring system is located at

An Ovum white paper “Avoiding security risks with regular patching and support services” is located at

More information about Oracle Software Security Assurance, including details about Oracle’s secure development and ongoing security assurance practices is located at

The details of the Common Vulnerability Scoring System (CVSS) are located at

Java desktop users can verify that they are running the most recent version of Java and remove older versions of Java by visiting



Contributions by Angela Golla, Infogram Deputy Editor

Oracle Infogram - Tue, 2014-04-15 13:52
Contributions by Angela Golla, Infogram Deputy Editor

Mark Hurd’s Latest Blog Explains Why Customer-Obsessed Marketing Is Your Next Competitive Edge

Oracle President Mark Hurd has posted his latest LinkedIn Influencer blog, “Customer-Obsessed Marketing Is Your Next Competitive Edge.” 
Mark Hurd, President, Oracle

In this new blog, Mark writes, “Marketing executives are leading the charge to convince their organizations of the inherent danger in today’s highly digitized buyer-seller relationship. And they’re doing that by proving that “your customers are only one click away from your competitors” is more than just a clever phrase—it’s the difference between being a market leader and going out of business.
"The good news is that as marketing executives strive to develop new customer-engagement models, to optimize multiple channels formerly in conflict and generate new revenue streams, they now have access to world-class marketing-automation tools, which have the potential to keep more prospects from making that one-click jump to a competitor…

Frequently Misused Metrics in Oracle

Steve Karam - Tue, 2014-04-15 13:43

Back in March of last year I wrote an article on the five frequently misused metrics in Oracle: These Aren’t the Metrics You’re Looking For.

To sum up, my five picks for the most misused metrics were:

  1. db file scattered read – Scattered reads aren’t always full table scans, and they’re certainly not always bad.
  2. Parse to Execute Ratio – This is not a metric that shows how often you’re hard parsing, no matter how many times you may have read otherwise.
  3. Buffer Hit Ratio – I want to love this metric, I really do. But it’s an advisory one at best, horribly misleading at worst.
  4. CPU % – You license Oracle by CPU. You should probably make sure you’re making the most of your processing power, not trying to reduce it.
  5. Cost – No, not money. Optimizer cost. Oracle’s optimizer might be cost based, but you are not. Tune for time and resources, not Oracle’s own internal numbers.
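The formulas behind metrics #2 and #3 make it easy to see why they mislead. The sketch below uses made-up counter values standing in for `v$sysstat` statistics (the numbers are illustrative assumptions, not from any real system); the point is that both ratios can look "healthy" without saying anything about actual response time.

```python
# Textbook formulas behind metrics #2 and #3, fed with made-up counter
# values standing in for v$sysstat statistics.

def execute_to_parse(parse_count, execute_count):
    """AWR-style Execute to Parse %: high means statements are re-executed
    without re-parsing. It does not distinguish hard from soft parses."""
    return 100 * (1 - parse_count / execute_count)

def buffer_hit_ratio(physical_reads, db_block_gets, consistent_gets):
    """Classic buffer cache hit ratio. Inefficient SQL that re-reads the
    same cached blocks over and over inflates it, which is why a high
    value can hide a problem rather than prove health."""
    return 100 * (1 - physical_reads / (db_block_gets + consistent_gets))

# Hypothetical counters: a workload doing a million logical reads posts
# a 99%+ hit ratio even if most of those reads are pure waste.
print(round(execute_to_parse(2_000, 10_000), 1))            # 80.0
print(round(buffer_hit_ratio(5_000, 200_000, 800_000), 1))  # 99.5
```

In other words, the 99.5% above tells you nothing about whether those 800,000 consistent gets were necessary – which is exactly the "advisory at best, misleading at worst" caveat in point #3.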

Version after version, day after day, these don’t change much.

Anyway, I wanted to let those who aren’t aware know that I created a slideshow based on that blog post for RMOUG 2014 (which, sadly, I was unable to attend at the last minute). Have a look and let me know what you think!

Metric Abuse: Frequently Misused Metrics in Oracle

Have you ever committed metric abuse? Gone on a performance tuning snipe hunt? Spent time tuning something that, in the end, didn’t even really have an impact? I’d love to hear your horror stories.

Also while you’re at it, have a look at the Sin of Band-Aids, and what temporary tuning fixes can do to a once stable environment.

And lastly, keep watching #datachat on Twitter and keep an eye out for an update from Confio on today’s #datachat on Performance Tuning with host Kyle Hailey!

The post Frequently Misused Metrics in Oracle appeared first on Oracle Alchemist.

Links to External Articles and Interviews

Michael Feldstein - Tue, 2014-04-15 11:41

Last week I was off the grid (not just lack of Internet but also lack of electricity), but thanks to publishing cycles I managed to stay artificially productive: two blog posts and one interview for an article.

Last week brought news of a new study on textbooks for college students, this time from a research arm of the National Association of College Stores. The report, “Student Watch: Attitudes and Behaviors toward Course Materials, Fall 2013”, seems to throw some cold water on the idea of digital textbooks based on the press release summary [snip]

While there is some useful information in this survey, I fear that the press release is missing some important context. Namely, how can students prefer something that is not really available?

March 28, 2014 may well go down as the turning point where Big Data lost its placement as a silver bullet and came down to earth in a more productive manner. Triggered by a March 14 article in Science Magazine that identified “big data hubris” as one of the sources of the well-known failures of Google Flu Trends,[1] there were five significant articles in one day on the disillusionment with Big Data. [snip]

Does this mean Big Data is over and that education will move past this over-hyped concept? Perhaps Mike Caulfield from the Hapgood Blog stated it best, including adding the education perspective . . .

This is the fun one for me, as I finally have my youngest daughter’s interest (you made Buzzfeed!). Buzzfeed has added a new education beat focusing on the business of education.

The public debut last week of education technology company 2U, which partners with nonprofit and public universities to offer online degree programs, may have looked like a harbinger of IPO riches to come for companies that, like 2U, promise to disrupt the traditional education industry. At least that’s what the investors and founders of these companies want to believe. [snip]

“We live in a post-Facebook era where startups have this idea that they can design a good product and then just grow, grow, grow,” said Phil Hill, an education technology consultant and analyst. “That’s not how it actually works in education.”


The post Links to External Articles and Interviews appeared first on e-Literate.

Twitter Oracle Security Open Chat Thursday 6th March

Pete Finnigan - Tue, 2014-04-15 10:50

I will be co-chairing/hosting a twitter chat on Thursday 6th March at 7pm UK time with Confio. The details are here. The chat is done over twitter so it is a little like the Oracle security round table sessions....[Read More]

Posted by Pete On 05/03/14 At 10:17 AM

Categories: Security Blogs

PFCLScan Reseller Program

Pete Finnigan - Tue, 2014-04-15 10:50

We are going to start a reseller program for PFCLScan and we have started the planning and recruitment process for this program. I have just posted a short blog on the PFCLScan website titled “PFCLScan Reseller Program”. If....[Read More]

Posted by Pete On 29/10/13 At 01:05 PM

Categories: Security Blogs

PFCLScan Version 1.3 Released

Pete Finnigan - Tue, 2014-04-15 10:50

We released version 1.3 of PFCLScan our enterprise database security scanner for Oracle a week ago. I have just posted a blog entry on the PFCLScan product site blog that describes some of the highlights of the over 220 new....[Read More]

Posted by Pete On 18/10/13 At 02:36 PM

Categories: Security Blogs

PFCLScan Updated and Powerful features

Pete Finnigan - Tue, 2014-04-15 10:50

We have just updated PFCLScan, our company’s database security scanner for Oracle databases, to version 1.2 and added some new features, new content and more. We are working to release another service update in the next couple....[Read More]

Posted by Pete On 04/09/13 At 02:45 PM

Categories: Security Blogs