
Feed aggregator

Oracle OpenWorld 2014 – Datatype context…?!

Marco Gralike - Tue, 2014-09-30 20:26
The native JSON database functionality presentations are done. If you want to experience first hand…

OOW14 Day 2 - Delphix #cloneattack

Yann Neuhaus - Tue, 2014-09-30 19:14

Do you know Delphix? The first time I heard of it was from Jonathan Lewis, and from Kyle Hailey of course. So it's not only about agile and virtualization; it's real DBA stuff. So, as I did yesterday with the Dbvisit #repattack, let's install the demo.

Here is the setup:

  • one source virtual machine with an XE database
  • one target virtual machine with XE installed but no database
  • one virtual machine with Delphix
And what can we do with that? We can clone databases instantaneously. It's:
  • a virtual appliance managing storage snapshots for instant cloning
  • exposed through direct NFS to be used by the database
  • totally automated database maintenance (create, restore, rename, etc.) through a nice GUI
So what's the point? You want to clone an environment instantaneously. Choose the point in time you want and it's done. You can clone 50 databases for your 50 developers. You can rewind your test database to run unit tests in a continuous integration development environment. You can do all that stuff that usually requires so many IT procedures with just a few clicks in the Delphix GUI.

As an example, here is my source database and the way I choose the point in time I want to clone:

[Screenshot CaptureDelphix01.PNG: choosing the point in time to clone]

And here it is running:

[Screenshot CaptureDelphix02.PNG: the clone up and running]

The #cloneattack is a good way to test things and discuss with others...

I have now @delphix on my laptop installed with @kylehhailey at #oow14. Cont. tomorrow at OTW http://t.co/QJLVhp93jg pic.twitter.com/QgoAgJPXyo

— Franck Pachot (@FranckPachot) September 30, 2014

@kylehhailey #cloneattack: finished it today now playing with clones while listening to @TanelPoder pic.twitter.com/wH3kQKBp8U

— Franck Pachot (@FranckPachot) September 30, 2014

That's some powerful multitasking - awesome @FranckPachot @TanelPoder

— Kyle Hailey (@kylehhailey) September 30, 2014

Good UX - Don't Leave Home Without It

Floyd Teter - Tue, 2014-09-30 16:56
There was a time when I asserted that User Experience would be a differentiator for Oracle in selling Fusion Applications.  Lots has changed since then, so I think it’s time to change my own thinking.  What’s changed?


  • Oracle has a cloud platform
  • Fusion Applications is now Cloud Application Services
  • We’re seeing well-designed user experiences throughout Oracle’s offerings: the Simplified UI is moving into the Applications Unlimited products, and is also evident throughout Oracle’s cloud services offerings.
  • Other enterprise application software companies now see the value of a well-designed user experience.  Look at the transition at Infor.  Check ADP’s announcement from earlier today.  Even the brand-W company that cannot be named recently released software that is a straight clone of Oracle’s Simplified UI.

OpenWorld has only reinforced my opinion.  Everyone here - Oracle product teams, Oracle partners, 3rd-party product providers - everyone is talking about and offering an enhanced UX.

So, I don’t consider good user experience design a differentiator anymore.  I now see it as a necessity.  Enterprise software applications vendors must offer a well-designed UI to even have a seat at the table.

But what about custom-developed applications?  Good user experience is still required there.  You can’t expect user adoption without it.  In fact, I see the tools that facilitate good user experience design as value-added products in and of themselves.


Good UX.  Don’t leave home without it.

Exadata Shellshock: IB Switches Vulnerable

Jason Arneil - Tue, 2014-09-30 16:16

Andy Colvin has the lowdown on the Oracle response and fixes for the bash shellshock vulnerability.

However, when I last looked it seemed Oracle had not discussed anything regarding the IB switches being vulnerable.

The IB switches have bash running on them and Oracle have verified the IB switches are indeed vulnerable.


[root@dm01dbadm01 ~]# ssh 10.200.131.22
root@10.200.131.22's password:
Last login: Tue Sep 30 22:46:41 2014 from dm01dbadm01.e-dba.com
You are now logged in to the root shell.
It is recommended to use ILOM shell instead of root shell.
All usage should be restricted to documented commands and documented
config files.
To view the list of documented commands, use "help" at linux prompt.
[root@dm01sw-ibb0 ~]# echo $SHELL
/bin/bash
[root@dm01sw-ibb0 ~]# rpm -qf /bin/bash
bash-3.2-21.el5

We have patched our compute nodes as instructed by Oracle; once a system is no longer vulnerable to the exploit, the test shows the following:

env 'x=() { :;}; echo vulnerable' 'BASH_FUNC_x()=() { :;}; echo vulnerable' bash -c "echo test"
bash: warning: x: ignoring function definition attempt
bash: error importing function definition for `BASH_FUNC_x'
test

Note the lack of “vulnerable” in the output.

Unfortunately when we come to run on the IB switches:


[root@dm01sw-ibb0 ~]# env 'x=() { :;}; echo vulnerable' 'BASH_FUNC_x()=() { :;}; echo vulnerable' bash -c "echo test"
vulnerable
bash: BASH_FUNC_x(): line 0: syntax error near unexpected token `)'
bash: BASH_FUNC_x(): line 0: `BASH_FUNC_x() () { :;}; echo vulnerable'
bash: error importing function definition for `BASH_FUNC_x'
test

It’s vulnerable. As apparently is the iLOM. There are as yet no fixes available for either of these.


Day 2 at Oracle Open World - best practices for WebLogic & Cloud Control

Yann Neuhaus - Tue, 2014-09-30 15:03

Today, in this post, I will describe some Oracle WebLogic and Cloud Control best practices I learned in the latest sessions. It's always good to see what other people, confronted with the same or different challenges, advise.

 

Managing Oracle WebLogic Server with Oracle Enterprise Manager 12c

One session was related to the best practices for managing WebLogic with Cloud Control 12c.

  • Use the administration functions:

With Cloud Control 12c you can now do WebLogic administration from its console. Starting and stopping managed servers and applications was already possible, but now you can do more, like configuring resources, deploying applications and so on.
From the Cloud Control console you can sign in to several target WLS servers, which means entering the required password each time. By providing the credentials and saving them as the preferred ones (in Preferred Credentials), you avoid entering the password every time.

  • Automate tasks across domains with predefined jobs:

Predefined jobs can be used to automatically run WLST scripts against one or more domains. As with the WLS console, you can register your actions in a .py script, update it for your new targets, create the job and set the schedule. This can obviously be a script for configuration, but also for monitoring or for creating statistics; see the sketch below.
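As a minimal illustration of such a .py script (my sketch, not from the session; the URL and credentials are placeholders), here is one that records the free heap of every running server in a domain:

# check_heap.py - hypothetical WLST script; connection details are placeholders
connect('weblogic', 'welcome1', 't3://adminhost:7001')
domainRuntime()
# loop over the runtime MBeans of all running servers in the domain
for server in domainRuntimeService.getServerRuntimes():
    jvm = server.getJVMRuntime()
    print server.getName() + ': ' + str(jvm.getHeapFreePercent()) + '% heap free'
disconnect()

Registered as a Cloud Control job, a script like this can then be scheduled against each domain.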

  • Automatic response to issues via corrective actions:

By including corrective actions in templates you can apply them to managed servers. If the corrective action fails, you can use rules to send an email in a second step to inform that there is an issue which needs to be solved.

  • Use EMCLI to manage the credentials (see the sketch below)
  • Use APEX to query the Management Repository for reporting
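As an illustration of the EMCLI route (a hypothetical sketch: the set_preferred_credential verb is documented for EM12c EMCLI, but treat the exact options and all names here as approximate placeholders):

$ emcli login -username=SYSMAN
$ emcli set_preferred_credential -set_name=HostCredsNormal -target_name=MyServer -target_type=weblogic_j2eeserver -credential_name=WLS_ADMIN_CREDS -credential_owner=SYSMAN

Scripting this beats clicking through Preferred Credentials for dozens of targets.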

 

Troubleshooting Performance Issues

Another session where best practices were explained was "Oracle WebLogic Server: Best Practices for Troubleshooting Performance Issues". A very helpful session; all chairs in the room were occupied and some people had to stand, which shows how eagerly it was awaited.

Some general tips:  

  •  -verbose:gc to find out if the performance issues are related to the garbage collection behaviour
  •  -Dweblogic.log.RedirectStdoutToServerLogEnabled=true
  •  use the Java Flight Recorder (JFR)  
  •  use Remote Diagnostic Agent (RDA)  
  •  use WLDF to create an image of your system  
  •  Thread/heap dumps to see how your application is working

One of the first actions you have to take is to read the log files, as they can show you which kinds of errors are logged: stuck threads, too many open files and so on.

The same application can behave differently depending on whether it is deployed on WebLogic running on Linux or on Windows. For instance, a socket can remain in TIME_WAIT for 4 minutes on Linux but only 1 minute on Windows.

In case you encounter OutOfMemory errors, log the garbage collector information:

-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:-PrintGCTimeStamps

More information can be found in My Oracle Support document ID 877172.1.
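A common place to set these flags, assuming a standard domain layout (my sketch, not from the session; the -Xloggc file is also my addition), is setDomainEnv.sh:

# hypothetical addition to $DOMAIN_HOME/bin/setDomainEnv.sh
JAVA_OPTIONS="${JAVA_OPTIONS} -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps -Xloggc:/tmp/gc.log"
export JAVA_OPTIONS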

Thread Dump
To analyze your application you can create a thread dump

  •  under Unix/Linux: kill -3 <pid>
  •  jstack <pid>
  •  WLST threadDump() (see the example below)
  •  jrcmd <pid> print_threads (for Oracle JRockit)
  •  jcmd <pid> Thread.print (for Java 7 HotSpot)

More information can be found in My Oracle Support document ID 1098691.1.
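For example, the WLST route could look like this (a sketch; the URL and credentials are placeholders, and threadDump() also accepts optional writeToFile, fileName and serverName arguments):

$ java weblogic.WLST
wls:/offline> connect('weblogic', 'welcome1', 't3://adminhost:7001')
wls:/mydomain/serverConfig> threadDump()
wls:/mydomain/serverConfig> disconnect()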

Once the thread dump has been created, you have to analyze it.
Several tools are available for that:

  •  Samurai
  •  Thread Dump Analyzer (TDA)
  •  ThreadLogic

Some of these best practices I already knew; one tool I now want to test is ThreadLogic, so that I am trained in case I have to use it on a real case.

Let's see what will happen in the next days.

What to expect at this year’s Gartner Symposium [VIDEO]

Chris Foot - Tue, 2014-09-30 15:01

Transcript

Unsure of how IT will impact enterprises in the near future?

Hi, welcome back to RDX! CIOs will probably encounter a number of challenges in the years ahead. The Gartner Symposium will feature presentations on strategic IT procurement, critical industry trends and how businesses can gain value from the latest technologies.

The conference will be held at the Dolphin Hotel in Orlando, Florida from October 5th to the 9th. Notable speakers will be Microsoft CEO Satya Nadella and Lyft Inc. President and Co-Founder John Zimmer.

As you can imagine, we'll be informing attendees about our database monitoring and optimization services. If you want to find us, we'll be located at Booth 206 during show floor hours.

Thanks for watching! Can't wait to see you in Florida!

The post What to expect at this year’s Gartner Symposium [VIDEO] appeared first on Remote DBA Experts.

RDX IT Process Automation

Chris Foot - Tue, 2014-09-30 14:11

RDX’s IT Process Automation Strategy

Remote DBA Experts (RDX) is the largest pure-play provider of remote data infrastructure services. We have been providing remote services for over 20 years, which also makes us one of the pioneers in this space. We currently support hundreds of customers and thousands of database implementations.

Remote data infrastructure services is an extremely competitive market arena. Our competitors range from “2 guys in a garage” to major outsourcing providers like IBM and Oracle. Improving and enhancing our support architecture isn’t merely beneficial to RDX; it is critical to our competitive survival.

One of our primary responsibilities at RDX is to research and evaluate leading-edge OS, database and application support technologies. The goal of these efforts is to ensure that RDX customers continue to receive the highest level of value from RDX’s support services. RDX’s strategy is to continue to be pioneers in the remote services space – just as we were 20 years ago. One of the key technologies that RDX is implementing to ensure our continued leadership as a remote services provider is IT Process Automation.

What is IT Process Automation?

Process automation, because of its wide range of application, takes many forms. Manufacturing companies have been using industrial robots to replace activities traditionally performed by humans for some time. Business process automation shares the same goal: to replace business functions performed by humans with software applications. Work activities that are repetitive in nature and require little intelligent analysis and decision making to complete are prime candidates for process automation.

Business software applications, by their essence, are designed to automate processes. Software programmers create intelligent decision trees to evaluate and refine stored data elements and display that processed data for human interaction or automate the decision making process entirely.

Automation products are designed to act upon stored data or capture it for processing. The data is analyzed using workflows (decision trees) and embedded rules. The automation product then performs a prescribed set of actions. The automation product can continue processing by executing additional workflows, prompt for human intervention or complete the process by performing an activity.

For the context of this article, IT Process automation is the implementation of software to programmatically automate routine (little decision making required), repetitive workflows and tasks performed by IT knowledge workers.

The Automation Tool Marketplace

A highly competitive market forces all automation vendors to accelerate the release of new products as well as enhancements to existing offerings. Automation vendors know that new features and functionalities are not a requirement for competitive advantage; they are a requirement for competitive survival. The more competitive the space, the greater the benefit to the consumer. Vendor competition will ensure that automation products become more intelligent, more cost effective and easier to implement and administer.

As the number of features provided by automation products grows, so does the importance of taking advantage of those new features. Automation product licensing and vendor maintenance contracts command a premium price in the marketplace. To gain the most return on their investment, companies must ensure that they are completely leveraging the benefits of the particular automation product being used. Understanding all of the inherent features is important, but selecting the features that bring each individual implementation the most benefit is the key to success.

The endless array of automation offerings adds complexity to product selection. IT automation product features and functionality range the spectrum from niche offerings that focus on automating a very well-defined, specific set of tasks to products that provide a complete framework and set of tools designed to generate more global efficiencies by automating a wide range of activities. More traditional software vendors, including database and monitoring tool providers, realize that automation features provide their offerings with an advantage over competitors’ products.

RDX’s Automation Strategy

Process automation products have been on RDX’s technological radar for years. Various products provided bits and pieces of the functionality we required, but we were unable to identify an offering that provided a total automation solution.

Like many shops, RDX inter-weaved various scripts, programs and third-party products to automate repetitive tasks. Automation was done in an ad hoc, opportunistic manner as the tasks were identified. RDX’s challenge was to select and implement a product that would provide a framework, architecture and set of tools that RDX could utilize to implement a company-wide automation architecture. The goal was to transform RDX’s automation activities from opportunistic and ad hoc to a strategic initiative with a well-defined mission statement, a clear set of achievable goals and detailed project plans with deliverables to achieve them.

RDX’s Process Automation Goals

RDX has two primary sources of repetitive tasks:

  • Customer event data collection, diagnosis and resolution
  • Internal support activities

Our goals for our automation strategy can be summarized into the following main points:

  • Improve the quality and speed of problem event analysis and resolution. Faster and higher quality problem resolution equals happy RDX customers.
  • Increase staff productivity by reducing the number of mundane, repetitive tasks the RDX staff is required to perform
  • Reduce operating costs through automation

Our environment is not entirely unique. Our service architecture can be compared to any IT shop that supports a large number of disparate environments. The resulting challenges we face are fairly common to any IT service provider:

  • RDX’s desire to provide immediate resolutions to all performance and availability issues (reduce Mean Time to Resolution)
  • RDX’s aim to respond to client events with more accuracy
  • Implement a software solution that allows RDX to capture and record pockets of tribal knowledge and leverage that subject matter expertise by transforming it into automated processes to foster a culture of continuous process improvement
  • Reduce the amount of time RDX spends on both customer-facing and internal repetitive tasks to allow our support professionals to focus on higher ROI support activities
  • Provide the ability to quickly prove audit and compliance standards through report logs capturing the results of each automation task
  • RDX’s rapid growth requires us to process an exponentially increasing number of event alerts and administrative activities. The continuous hiring of additional resources to manage processes and data is not a scalable or cost-effective solution

RDX’s Automation Product Selection

RDX performed a traditional vendor analysis using a standardized evaluation methodology. A methodology can be loosely defined as a body of best practices, processes and rules used to accomplish a given task. The task in this case is to evaluate and select an automation product provider.

A needs analysis was performed to generate a weighted set of functional and technical requirements. The focus of the analysis was on selecting a product that would help us achieve our goal of implementing a strategic automation solution, as opposed to just buying a product. If we were unable to identify a solution that met our requirements, we were willing to delay the vendor selection process until we found one that did.

RDX selected GEN-E Resolve as our automation tool provider. GEN-E Resolve was able to provide the “end-to-end” architecture we required to automate both customer event resolution and RDX internal processes. GEN-E Resolve’s primary focus is on the automation of complex incident resolution and is a popular product with large telecommunication providers that support thousands of remote devices. What RDX found most beneficial was that the product did not require the installation of any software on our customers’ servers. All processing is performed on RDX’s Resolve servers running at our data center.

RDX’s First Step – Automatic Event Data Collection

The primary service we provide to our customers is ensuring their database systems are available at all times and performing as expected. Database administrators, by the very essence of our job descriptions, are the protectors of the organization’s core data assets. We are tasked with ensuring key data stores are continuously available. However, ensuring that data is available on a 24 x 7 basis is a wonderfully complex task.

When a mission-critical database application becomes unavailable, it can threaten the survivability of the organization. The financial impact of downtime is not the only issue that faces companies that have critical applications that are offline. Loss of customer goodwill, bad press, idle employees and legal penalties (lawsuits, fines, etc.) must also be considered.

It is up to the database administrator to recommend and implement technical solutions that deal with these unforeseen “technology disruptions.” When they do occur, it is our responsibility as DBAs to restore the operational functionality of the failed systems as quickly as possible.

RDX’s initial goal was to automate the collection of information required to perform problem analysis. The key to faster problem resolution is to reduce the amount of time collecting diagnostic data and spend that time analyzing it.

RDX prioritized customer events using the following criteria:

  • Frequency the event occurs
  • Severity of customer impact
  • Amount of time required to manually collect diagnostic data (reduce Mean Time to Resolution)
  • Complexity of the diagnostic data collection process (increase resolution accuracy)
  • Amount of human interaction required to collect diagnostic data (cost reduction)

RDX deployed a team of in-house automation specialists to collect the operational knowledge required to create the decision trees, workflows and data collection activities traditionally performed by RDX personnel. Our implementation, although still in its infancy, has met our initial expectations.

RDX has automated the diagnostic data collection process for several events and has proven that the automation tool can perform the tasks quickly, consistently and with high quality. RDX has also successfully implemented automatic problem resolution tasks for simple events. Subsequent enhancements to our automation capabilities are to leverage RDX’s collective operational knowledge to quickly resolve more complex issues.

Although our initial goal was to improve the speed and quality of our problem resolution process, our intent is to also move forward with the automation of our internal support processes. One of the key facets of the project’s success was to keep RDX personnel informed about the automation project and the benefits the implementation would provide to both RDX customers and internal support technicians. Promoting the product was crucial, as we found that it led to the generation of a veritable groundswell of internal process automation recommendations. Our intent is to formalize the internal process automation project by appointing RDX personnel as project owners and soliciting recommendations through company surveys (as opposed to an ad hoc manner). Once the recommendations are collected, RDX will perform the same type of prioritization as we did during the initial stages of product implementation.

The Future of Automation

Although we will continue to see the greatest advances in automation in the traditional manufacturing spaces, IT process automation will continue to grow and mature until it becomes integrated into the fabric of most IT organizations. Larger shops will be the early adopters of IT automation, as they will be able to more quickly realize the benefits the solution provides than their smaller counterparts. As stated previously, a very competitive market arena will continue to accelerate the features and functionality provided by vendor products. As the offerings mature, they will become more robust, more intelligent and more cost effective. As a result, the adoption rate will continue to grow, as it would with any technology possessing these traits.

In the remote services space, this is how RDX intends to differentiate itself from its competitors. Outsourcing providers that manage large numbers of remote targets will be required to automate, or they will quickly lose market share to those competitors that do. It is RDX’s intention to be an innovator and not a “close follower” of automation technologies.

The post RDX IT Process Automation appeared first on Remote DBA Experts.

Plea For Tight Messages - OOW14

Floyd Teter - Tue, 2014-09-30 14:02
It’s so easy to lose track of time at Oracle OpenWorld.  I think I’m writing this on Tuesday, but can’t say for sure…

Lots of information being shared here:  incremental development of Simplified UI, a myriad of new cloud services announced (including a very cool Integration Cloud Service), new features for MySQL, new mobile applications for the E-Business Suite, Eloqua services for Higher Education, a visualization-oriented UI for OBIEE (and saw a very cool new visualization UI from the UX team, but I can’t talk about that yet), some interesting uses of Beacons…it’s like drinking from a firehose and darn near drowning in the attempt.  Info overload.

One of the cool things one gets to see at OOW: the rise of new third-party applications that improve and enhance Oracle products.  On Monday, I had the opportunity to sit down with the brain trust behind Xprtly!  What impressed me the most is the focus of their message - they’ve got it down to four slides (including a title).  Take a look and see if you get it.

[Xprtly! slide images]
So why do I bring this up?  Go back and read the second paragraph.  We’re all on information overload here.  The virtual noise level is incredible.  Tight, focused messages cut through the noise and get the point across.  Wish we saw more of this approach here…

Microsoft Hadoop: Taming the Big Challenge of Big Data – Part One

Pythian Group - Tue, 2014-09-30 11:12

Today’s blog post is the first in a three-part series with excerpts from our latest white paper, Microsoft Hadoop: Taming the Big Challenge of Big Data.

As companies increasingly rely on big data to steer decisions, they also find themselves looking for ways to simplify its storage, management, and analysis. The need to quickly access large amounts of data and use them competitively poses a technological challenge to organizations of all sizes.

Every minute, about two terabytes of data are being generated globally. That’s twice the amount from three years ago and half the amount predicted for three years from now.

Volume aside, the sources of data and the shape they take vary broadly. From government records, business transactions and social media, to scientific research and weather tracking, today’s data come in text, graphics, audio, video, and maps.

Download our full white paper which explores the impact of big data on today’s organizations and its challenges.

Categories: DBA Blogs

Packt Publishing - ALL eBooks and Videos are just $10 each or less until the 2nd of October

Surachart Opun - Tue, 2014-09-30 10:36
Just spreading the word about a good campaign from Packt Publishing - it's good news for people who love to learn something new. ALL eBooks and Videos are just $10 or less -- the more you choose to learn, the more you save:
  • Any 1 or 2 eBooks/Videos -- $10 each
  • Any 3-5 eBooks/Videos -- $8 each
  • Any 6 or more eBooks/Videos -- $6 each


Written By: Surachart Opun http://surachartopun.com
Categories: DBA Blogs

OCP 12C – Managing CDBs and PDBs

DBA Scripts and Articles - Tue, 2014-09-30 10:13

Connecting to a CDB or PDB is no different from connecting to a standard database. When you create a PDB, it is automatically assigned a service which can be used to reach this specific PDB. You can either use the EZ connect syntax, the Oracle Net Service syntax or the [...]
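As an aside, an EZ connect string pointing at the PDB service might look like this (my sketch; user, host, port and service name are placeholders):

$ sqlplus system@//dbhost:1521/pdb1
SQL> show con_name

CON_NAME
------------------------------
PDB1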

The post OCP 12C – Managing CDBs and PDBs appeared first on Oracle DBA Scripts and Articles (Montreal).

Categories: DBA Blogs

Exadata: What’s Coming

Jason Arneil - Tue, 2014-09-30 10:09

This is based on the presentation Juan Loaiza gave regarding What’s new with Exadata. While a large part of the presentation focussed on what was already available, there are quite a few interesting new features that are coming down the road.

First off was a brief mention of the hardware. I’m less excited about this. The X4 has plenty of the hardware that you could want: CPU, memory and flash. You’d expect some or all of them to be bumped in the next generation.

New Hardware

This was skated over fairly quickly, but I expect an Exadata X5 in a few months. The X4 was released back in December 2013; the first X4 I saw was in January 2014. I wouldn’t be surprised if Oracle release the X5 on or around the anniversary of that release.

Very little was said about the new hardware that will be in the X5, except that the development cycle has followed what Intel has released, and that CPU core counts and flash capacity have gone up. No word was given on which CPU is going to be used in the X5.

The compute nodes on an X4-2 have Intel E5-2697 v2 chips; this is a 12-core chip running at 2.7GHz. I’d expect an increase in core count. The X3 to X4 transition increased core count by 50%. If that happens again, we get to 18 cores. There is an Intel E5-2699 v3 with 18 cores, but that’s clocked at 2.3GHz.

However, I think I’d be less surprised if they went with the E5-2697 v3, which is a 14-core chip clocked at 2.6GHz. That would be a far more modest increase in the number of cores. The memory speed available with this chip does go up though – it’s DDR4. That might help with the In-Memory option. I also wonder if they’ll bump the amount of memory supported – this chip (like its predecessor) can go to 768GB.

As I said, it was not mentioned which chip was going to be used, only that Intel had released new chips and that Oracle would be qualifying their use for Exadata over the coming months.

New Exadata Software

There was a bunch of interesting sounding new features coming down the road. Some of the ones that in particular caught my eye were:

The marketing-friendly term “Exafusion”. Exafusion seems to be about speeding up OLTP; labelled as “Hardware Optimized OLTP Messaging”, it’s a reimplementation of cache fusion. Messages bypass the network stack, leading to a performance improvement.

Columnar Flash Cache – This is Exadata automatically reformatting HCC data when written to flash as a pure column store for analytic workloads. Dual formats are stored.

Database snapshots on Exadata. This seems designed with pluggable databases in mind for producing fast clones for dev/test environments. Clearly something that was a gap with ASM as used on Exadata, though ACFS does snapshots.

Currently the latest Linux release available on Exadata is 5.10. Upgrading across major releases is not supported – it would have required reimaging, not a pretty prospect. Thankfully Oracle are going to allow and enable upgrading in place to 6.5.

There was some talk about reducing I/O outliers, both in reading from hard disk and in writing to flash.

Currently with IORM you can only enable or disable access to flash for a particular database. Full IORM seems to be coming for flash.

The final new feature that caught my eye was the long-rumoured virtualisation coming to Exadata: OVM is coming. The ODA, for example, has had VM capability for some time, so it’s in some ways an obvious extension. I’m expecting that, with the increasing number of cores, lots of smaller organisations may not actually need all those cores, and might think that even if they could turn unused ones off, it’s a waste buying hardware and not being able to use it.

I’m hoping to NOT see OVM on an Exadata in the wild anytime soon.

Software on Silicon

One final point, almost tucked out of sight, was that Juan had a little bullet point about “software on silicon”. Now this has me confused. My understanding is that when Larry was talking about this, it was specifically SPARC. That I can understand, as Oracle controls what goes on the chip.

Ignoring the SPARC Supercluster, there is no SPARC on Exadata. So that leaves a closer collaboration with Intel or moving to SPARC. Collaborating closer with Intel is a possibility and Oracle had first dibs on the E7-8895 v2 for the X4-8.

I can’t imagine changing the compute nodes to SPARC; that wouldn’t make sense. But “software on silicon” is a bit like offloading…

Exadata software definitely keeps moving forward, and the difference between running Oracle on Exadata compared with non-Exadata grows ever wider with each “Exadata only” feature.


select a.b.c.d.e from t a

Laurent Schneider - Tue, 2014-09-30 07:37

I just learnt a new syntax. With 12.1.0.2, you can dive into your JSON document within SQL. It looks a bit unusual to me:


SQL> CREATE TABLE t
  (b CLOB CHECK (b IS JSON));
Table created.

SQL> insert into t(b) values 
  ('{"c":{"d":{"e":1}}}');
1 row created.

SQL> select a.b.c.d.e from t a;
C
----------------------
1
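For comparison (my addition, not from the original post), the same scalar can be extracted with the JSON_VALUE function:

SQL> select json_value(b, '$.c.d.e') from t;

JSON_VALUE(B,'$.C.D.E')
--------------------------------------------------------------------------------
1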

Largely inspired by Tom Kyte's session

Level Up Campaign- Packt Publishing

Senthil Rajendran - Tue, 2014-09-30 06:08
Take your skills to the next level! For the next 7 days ALL eBooks and Videos are just $10 or less -- the more you choose to learn, the more you save:
  • Any 1 or 2 eBooks/Videos -- $10 each
  • Any 3-5 eBooks/Videos -- $8 each
  • Any 6 or more eBooks/Videos -- $6 each
The discounts above are automatically applied in your cart when you add the correct number of titles. Offer ends October 2nd. Explore more here: Level Up!

Oracle Technology Network Monday in Review / Tuesday Preview - Oracle OpenWorld and JavaOne

OTN TechBlog - Tue, 2014-09-30 00:51

OTN's Tech Fest was AWESOME!  Thanks for joining us!  We had fun, and we hope you did, too!


The OTN team have been busy shooting video and attending sessions.  See what they've been up to so far -

Blogs -
The Java Source Blog
OTN DBA/DEV Watercooler

YouTube Channels -
OTN
Java
OTN Garage
OTN ArchBeat

Follow @JavaOneConf for conference-specific announcements

Hacking sessions and interviews on NightHackingTV - LIVE from the Java Hub, 9am-4pm PT.


Special activity in the OTN Lounge, Moscone South Upper Lobby, on Tuesday, September 30th - OTN Wearable Meetup, 4 to 6pm - See live demos of Oracle ideation and proof-of-concept wearable technology. Show us your own wearables and discuss the finer points of use cases, APIs, integrations, UX design, fashion and style considerations for wearable tech development, and lots more!

Follow-up: Chargeback

Jean-Philippe Pinte - Tue, 2014-09-30 00:48
Worth reading: the Cap Gemini article on Enterprise Manager 12c, Metering and Chargeback:
http://www.capgemini.com/blog/capgemini-oracle-blog/2014/09/oracle-cloud-billing-with-metering-and-chargeback

1st day at Oracle Open World '14: news on the upcoming Oracle WebLogic 12.2.1

Yann Neuhaus - Mon, 2014-09-29 19:33

Landing on Sunday the 28th after a 13-hour trip, my colleague Franck Pachot and I had just enough time to register, go to the hotel, and come back to the "Welcome Reception", where we could eat something. After a night where I could feel the jet lag :-) we were ready to "participate" in this amazing event, Oracle Open World 2014.

The first session I attended was the keynote, where the new challenges were laid out: "moving" 20-year-old applications; building new infrastructures with less budget, as the money goes more to the business applications to fulfill user demands and expectations; Big Data, where the analysis but also the delivery of the results has to be fast. To summarize, we are in a period where the infrastructure is changing through ever greater use of the cloud, but the approach to dealing with the new challenges also has to change to integrate this new digital world.

Another interesting session was the one from Mr. William Lyons about the Oracle WebLogic Server strategy and roadmap. He talked about the Cloud Application Foundation: mobile development productivity, the foundation for Fusion Middleware and Applications, high availability, performance, multi-tenancy, cloud management and operations, and so on. He first recapitulated the new features of WebLogic 12.1.2, like the management of Coherence, Oracle HTTP Server and the web tier using only one tool, be it the WLS console, WLST or the OFMW console. He also talked about database integration with GridLink, RAC, the multitenant database, application continuity and Database Resident Connection Pool, which improves performance.

He then moved on to the new features of 12.1.3, which was released in June 2014. This new version improves functionality in the Fusion Middleware, mobile and high availability areas. Developers can now have a free development license, and they can install the product using a zip distribution which also contains the patches. WebLogic 12.1.3 supports Java EE 7 APIs as well as Java SE 8.

The next release, planned for 2015, is WebLogic 12.2.1. With this version the multitenancy concept is covered: domain partitions can be used to isolate resources for the different tenants. Regarding Java, it will be fully compliant with Java EE 7 and Java SE 8.

On this first day lots of information was ingested, but it will have to be digested over the next weeks :-)

Let's see what will happen in the next days!

OOW14 Day 1 - Dbvisit #repattack

Yann Neuhaus - Mon, 2014-09-29 19:02

Oracle Open World is not only about conferences but also about practice and networking. Today at the OTN lounge I installed the following demos on my laptop:

  • a Dbvisit replicate #repattack
  • a Delphix cloning environment #cloneattack

I'll detail the former below and the latter tomorrow, but if you are in San Francisco and missed it, please come tomorrow to the same kind of session at the Oak Table World! You don't even need the OOW registration for that - it's independent but at the same place. Here are the details: http://www.oraclerealworld.com/oaktable-world/agenda/

 

Dbvisit replicate

This is the event:


Become a #RepAttack warrior at #OOW14 Get to the OTN lounge for briefing today from 3:00 pm http://t.co/fJRbOuMPqn

— Dbvisit Software (@dbvisit) September 29, 2014

 

Well, actually I did install everything a bit earlier, as I had the #repattack environment before and I woke up very early because of the jet lag... The installation is straightforward and I've monitored it with another tool which I like (and we are partners as well): Orachrome Lighty.

Tweet:

I woke up because of #jetlag then I've installed @dbvisit #repattack on my laptop and monitor it with @orachrome pic.twitter.com/EVm1GZBo3l

— Franck Pachot (@FranckPachot) September 29, 2014

 

The idea is to quickly set up a source and a target VirtualBox VM, with Oracle XE and Swingbench on the source, and then set up the replication on it. It is really straightforward and shows that logical replication is not too complex to set up. The OTN lounge was also the occasion to meet the Dbvisit team.

 

Delphix

Here is the setup - I will continue tomorrow for cloning:

Tweet:

I have now @delphix on my laptop installed with @kylehhailey at #oow14. Cont. tomorrow at OTW http://t.co/QJLVhp93jg pic.twitter.com/QgoAgJPXyo

— Franck Pachot (@FranckPachot) September 30, 2014

Oracle OpenWorld 2014 – New Data Formats and Integration

Marco Gralike - Mon, 2014-09-29 18:13
Especially on this Monday, it seems to be all about new database data integration technologies and…

OOW14 Update: PeopleSoft's new Delivery Model

Javier Delgado - Mon, 2014-09-29 17:30
Oracle Open World 2014 is a true storm of information. Not only because of the myriad of sessions and the struggle needed to get from one room to another with so many people around, but also due to the clouds, which seem to be everywhere. Ok, that was a really bad joke, but clouds really are omnipresent in Oracle's vocabulary nowadays.

Still, within the storm, I managed to attend some very interesting PeopleSoft sessions. I may discuss some other findings in future posts, but today I want to focus on the new Delivery Model.

This model, based on the PeopleSoft Update Manager, has been around since the release of the PeopleSoft 9.2 applications. Initially, I had mixed thoughts on the approach. Naturally, it's great to be able to download a periodically updated Virtual Machine and review the latest and greatest functionality. On the other hand, the need to download the image files (which amount to more than 30 GB) to apply a single regulatory patch increases the time needed to apply patches for which you already have all the prerequisites.

However, Oracle has started to deliver important new functionality in the latest update images (or plans to deliver some of it soon, such as the first Fluid applications in HCM Update Image 9), so the benefit of the new delivery model becomes much more visible.

Oracle has announced that today there are no plans to deliver PeopleSoft 9.3. Is that bad news? Not necessarily. Actually, if new functionality keeps flowing through newly delivered update images, it becomes a significant improvement over having to perform an application upgrade every 4 or 5 years.

The new delivery model (a.k.a. Continuous Delivery Model) allows customers to pick new functionalities individually, without having to apply changes for other function points. This greatly simplifies the update process, reducing maintenance costs over the old application upgrade approach.

Oracle has recently released a white paper on the Continuous Delivery Model which I found very enlightening. On top of that, today the announcement was made that the Cumulative Feature Tool available at peoplesoftinfo.com will include the image numbers as releases, in order to easily identify new functionality.

Time will tell what the final innovation pace will be. Today, with Fluid and self-service applications being delivered or planned frequently, the Continuous Delivery Model looks like a nice step forward.