
Feed aggregator

OSP #3a: Build a Standard Cluster Platform

Jeremy Schneider - Thu, 2014-04-17 05:15

This is the fifth article in a series called Operationally Scalable Practices. The first article gives an introduction and the second article contains a general overview. In short, this series suggests a comprehensive and cogent blueprint to best position organizations and DBAs for growth.

We’ve looked in some depth at the process of defining a standard platform with an eye toward Oracle database use cases. Before moving on, it would be worthwhile to briefly touch on clustering.

Most organizations should hold off as long as possible before bringing clusters into their infrastructure. Clusters introduce a very significant new level of complexity. They will immediately drive some very expensive training and/or hiring demands – in addition to the already-expensive software licenses and maintenance fees. There will also be new development and engineering needed – perhaps even within application code itself – to support running your apps on clusters. In some industries, clusters have been very well marketed and many small-to-medium companies have made premature deployments. (Admittedly, my advice to hold off is partly a reaction to this.)

When Clustering is Right

Nonetheless there definitely comes a point where clustering is the right move. There are four basic goals that drive cluster adoption:

  1. Parallel or distributed processing
  2. Fault tolerance
  3. Incremental growth
  4. Pooled resources for better utilization

I want to point out immediately that RAC is just one of many ways to do clustering. Clustering can be done at many tiers (platform, database, application) and if you define it loosely then even an Oracle database can be clustered in a number of ways.

Distributed Processing

Stop for a moment and re-read the list of goals above. If you wanted to design a system to meet these goals, what technology would you use? I already suggested clusters – but that might not have been what came to your mind first. How about grid computing? I once worked with some researchers in Illinois who wrote programs to simulate protein folding and DNA sequencing. They used the Illinois BioGrid – composed of servers and clusters managed independently by three different universities across the state. How about cloud computing? The Obama campaign in 2008 used EC2 to build its volunteer logistics and coordination platforms, dramatically scaling up and down very rapidly on demand. According to the book In Search of Clusters by Gregory Pfister, these four reasons are the main drivers for clustering – but if they also apply to grids and clouds, then what’s the difference? Doesn’t it all accomplish the same thing?

In fact the exact definition of “clustering” can be a little vague and there is a lot of overlap between clouds, grids, clusters – and simple groups of servers with strong & mature standards. In some cases these terms might be more interchangeable than you would expect. Nonetheless there are some general conventions. Here is what I have observed:

CLUSTER
Old term. Most strongly implies shared hardware resources of some kind, tight coupling and physical proximity of servers, and treatment of the group as a single unit for execution of tasks. While some level of single system image is presented to clients, each server may be individually administered; strong standards are desirable but not always implied.

GRID
Medium-aged term. Implies looser coupling of servers, geographic dispersion, and perhaps cross-organizational ownership and administration. There will not be grid-wide standards for node configuration; individual nodes may be independently administered. The grid may be composed of multiple clusters. Strong standards do exist at a high level for management of jobs and inter-node communication. Or, alternatively, the term “grid” may more loosely imply a group of servers where nodes/resources and jobs/services can easily be relocated as workload varies.

CLOUD
New term. Implies service-based abstraction, virtualization and automation. It is extremely standardized, with a bias toward enforcement through automation rather than policy. Servers are generally single-organization; however, service consumers are often external. Related to the term “utility computing” and the “as a service” terms (Software/SaaS, Platform/PaaS, Database/DaaS, Infrastructure/IaaS). Or, alternatively, “cloud” may (like “grid”) more loosely imply a group of servers where nodes/resources and jobs/services can easily be relocated as workload varies.

Google Trends for Computers and Electronics Category

These days, the distributed processing field is a very exciting place because the technology is advancing rapidly on all fronts. Traditional relational databases are dealing with increasingly massive data volumes, and big data technology combined with pay-as-you-go cloud platforms and mature automation toolkits have given bootstrapped startups unforeseen access to extremely large-scale data processing.

Building for Distributed Processing

Your business probably does not have big data. But the business case for some level of distributed processing will probably find you eventually. As I pointed out before, the standards and driving principles at very large organizations can benefit your commodity servers right now and eliminate many growing pains down the road.

In the second half of this article I will take a look at how this specifically applies to clustered Oracle databases. But I’m curious, are your server build standards ready for distributed processing? Could they accommodate clustering, grids or clouds? What kinds of standards do you think are most important to be ready for distributed processing?

Webcast: Database Cloning in Minutes using Oracle Enterprise Manager 12c Database as a Service Snap Clone

Pankaj Chandiramani - Thu, 2014-04-17 04:02

Since the demands from the business for IT services are non-stop, creating copies of production databases in order to develop, test and deploy new applications can be labor intensive and time consuming. Users may also need to preserve private copies of the database, so that they can go back to a point prior to when a change was made in order to diagnose potential issues. Using Snap Clone, users can create multiple snapshots of the database and “time travel” across these snapshots to access data from any point in time.


Join us for an in-depth technical webcast and learn how Snap Clone, a capability of the Oracle Cloud Management Pack for Oracle Database, can fundamentally improve the efficiency and agility of administrators and QA engineers while saving CAPEX on storage. Benefits include:



  • Agile provisioning (~2 minutes to provision a 1 TB database)

  • Over 90% storage savings

  • Reduced administrative overhead from integrated lifecycle management


Register Now!


April 24 — 10:00 a.m. PT | 1:00 p.m. ET

May 8 — 7:00 a.m. PT | 10:00 a.m. ET | 4:00 p.m. CET

May 22 — 10:00 a.m. PT | 1:00 p.m. ET





Categories: DBA Blogs

The Drive To Visualize Data: Dashboards

Usable Apps - Thu, 2014-04-17 03:03

Introduction: Cars and Context

Like many people of a certain age, my first exposure to the term dashboard was when I heard my dad using it when driving the car. He referred to it as “the dash”.

Dad’s “dash” was an analog affair that told him the car’s speed, the miles traveled, the engine oil level and temperature, if he had enough gas in the tank, and a few other little bits of basic information. It was all whirring dials, trembling needle pointers on clock-style faces, switches to toggle on and off, a couple of sliders, and little lights that blinked when there was trouble.

Drivers in those days needed to pay attention, all the time, to their dashboards.

Ford dashboard from the 1970s

Old school car dashboards: quaint and charming. And a lot of work. (Source: WikiMedia Commons)

Dashboards in cars, and how drivers use them, are different now. The days of a dashboard with switches to flick or dials to turn are gone.

Today, a family car generates hundreds of megabytes of data every second. Most of this data is discarded immediately and is not useful to the driver, but some of it is useful and may even be life-saving. Technology makes sense of the surging data so that drivers can respond easily to important information because it’s presented to them in a timely, easily consumed, and actionable way.

Car dashboards are now closer to the “glass cockpit” world that fighter jet pilots experience. Cars have tiny sensors, even cameras, and other technology inside and outside the vehicle that detect and serve up striking digital visualizations about the health of the car and driver performance. Drivers are empowered to be “situationally aware” about what’s going on (what us UXers would call “context”), as they listen to or watch for signals and cues and respond to them naturally, using voice, for example.

Some car dashboards even use heads-up displays, projecting real-time information onto the windshield. Drivers know what’s going on with their car without taking their eyes off the road.

Chevrolet Corvette Heads-up Display

Chevrolet Corvette Heads-up Display (Source: www.chevrolet.com)

Dashboard design itself is now the essence of simplicity and cutting-edge technology, and stylish with it too, arousing passions about what makes a great interface inside a car. It’s all part of creating an experience that engages drivers for competitive advantage in a tight automobile market.

Tesla Model S Dashboard

Tesla Model S Dashboard (Source: www.teslamotors.com)

The Emergence of Digital Dashboards User Experience

When it comes to software applications and websites, dashboards are around us everywhere too. We’re all long familiar with how such dashboards work and how to use them, beginning with the pioneering My Yahoo! portal that popularized the use of the “My” pronoun in web page titles, right through to today’s wearable apps dashboards that are a meisterwerk of information visualization, integrating social media and gamification along the way.

Fitbit Dashboard (Author's own)

FitBit Dashboard (Source: Author)

An enterprise application dashboard is a one-stop shop of information. It’s a page made up of portlets or regions, chunking up related information into displays of graphs, charts, and graphics of different kinds. Dashboards visualize a breadth of information that spans a whole range of activities in a functional area.

Dashboards aggregate data into meaningful visual displays and cues, using processor horsepower at the back end to do the work that users used to do with notepads, calculators or spreadsheets to find out what’s changed or in need of attention.

Dashboards enable users to prioritize work and to manage exceptions by taking light-weight actions immediately from the page, or to drill down to explore and do more in a transactional or analytics work area, if necessary.

The dashboard concept remains a core part of the enterprise applications user experience, particularly for work roles that rely on monitoring of information, providing reports on performance, or needing a range of information to make well-timed and high-level decisions.

Developing Dashboards

In work, we now also have to deal with that other torrent of data we hear about: big data. Dashboards are ideal ways to make sense of this data and to represent the implications of its analysis to a viewer, bringing insight to users rather than the other way around.

To this end, Oracle provides enterprise application developers with the Oracle ADF Data Visualization Tools (DVT) components to build dashboards using data in the cloud, and with design guidance in the form of the Oracle Fusion Applications, Oracle Endeca and Oracle Business Intelligence Enterprise Edition UI patterns and guidelines for making great-looking dashboards.

Fusion Apps Desktop UI Dashboard

Typical Oracle Fusion Applications Desktop UI Dashboard (Source: Oracle)

Beyond Desktop Dashboards…

Dashboards’ origins as a desktop UI concept obviously predated the “swipe and pinch” world of mobility, today’s cross-device, flexible way of working with shared data in the cloud. Sure, we still have a need for what dashboards were originally about. But we now need new ways for big data to be organized and visualized. We need solutions that reflect our changing work situations (our context) so that we can act on the information quickly, using a tablet or a smartphone, or whatever’s optimal. And we need new ways of describing this dashboard user experience.

Enter the era of “glance, scan, and commit”, a concept that we will explore in a future Usable Apps blog.

MongoDB is growing up

DBMS2 - Thu, 2014-04-17 02:56

I caught up with my clients at MongoDB to discuss the recent MongoDB 2.6, along with some new statements of direction. The biggest takeaway is that the MongoDB product, along with the associated MMS (MongoDB Management Service), is growing up. Aspects include:

  • An actual automation and management user interface, as opposed to the current management style, which is almost entirely via scripts (except for the monitoring UI).
    • That’s scheduled for public beta in May, and general availability later this year.
    • It will include some kind of integrated provisioning with VMware, OpenStack, et al.
    • One goal is to let you apply database changes, software upgrades, etc. without taking the cluster down.
  • A reasonable backup strategy.
    • A snapshot copy is made of the database.
    • A copy of the log is streamed somewhere.
    • Periodically — the default seems to be 6 hours — the log is applied to create a new current snapshot.
    • For point-in-time recovery, you take the last snapshot prior to the point, and roll forward to the desired point.
  • A reasonable locking strategy!
    • Document-level locking is all-but-promised for MongoDB 2.8.
    • That means what it sounds like. (I mention this because sometimes an XML database winds up being one big document, which leads to confusing conversations about what’s going on.)
  • Security. My eyes glaze over at the details, but several major buzzwords have been checked off.
  • A general code rewrite to allow for (more) rapid addition of future features.
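The snapshot-plus-log recovery flow described in the backup bullets above is generic enough to sketch in a few lines. This toy model (plain Python dictionaries and tuples, not MongoDB internals; every name here is illustrative) restores the last snapshot taken at or before the target time and then rolls the log forward:

```python
from bisect import bisect_right

def recover_to_point(snapshots, oplog, target_ts):
    """Toy point-in-time recovery: pick the last snapshot at or before
    target_ts, then re-apply logged writes up to target_ts."""
    # snapshots: sorted list of (ts, state dict)
    # oplog:     sorted list of (ts, key, value) write records
    times = [ts for ts, _ in snapshots]
    i = bisect_right(times, target_ts) - 1
    if i < 0:
        raise ValueError("no snapshot precedes the target time")
    snap_ts, state = snapshots[i]
    state = dict(state)  # copy so the stored snapshot stays untouched
    for ts, key, value in oplog:
        if snap_ts < ts <= target_ts:
            state[key] = value  # roll forward
    return state

snapshots = [(0, {}), (6, {"a": 1})]
oplog = [(2, "a", 1), (7, "b", 2), (9, "a", 3)]
print(recover_to_point(snapshots, oplog, 8))  # {'a': 1, 'b': 2}
```

Note how the periodic re-snapshotting (every 6 hours by default, per the bullets above) simply bounds how much log has to be replayed.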

Of course, when a DBMS vendor rewrites its code, that’s a multi-year process. (I think of it at Oracle as spanning 6 years and 2 main-number releases.) With that caveat, the MongoDB rewrite story is something like:

  • Updating has been reworked. Most of the benefits are coming later.
  • Query optimization and execution have been reworked. Most of the benefits are coming later, except that …
  • … you can now directly filter on multiple indexes in one query; previously you could only simulate doing that by pre-building a compound index.
  • One of those future benefits is more index types, for example R-trees or inverted lists.
  • Concurrency improvements are down the road.
  • So are rewrites of the storage layer, including the introduction of compression.

Also, you can now straightforwardly transform data in a MongoDB database and write it into new datasets, something that evidently wasn’t easy to do before.

One thing that MongoDB is not doing is offering any ODBC/JDBC or other SQL interfaces. Rather, there’s some other API — I don’t know the details — whereby business intelligence tools or other systems can extract views, and a few BI vendors evidently are doing just that. In particular, MicroStrategy and QlikView were named, as well as a couple of open source usual-suspects.

As of 2.6, MongoDB seems to have a basic integrated text search capability — which however does not rise to the search functionality level that was in Oracle 7.3.2. In particular:

  • 15 Western languages are supported with stopwords, tokenization, etc.
  • Search predicates can be mixed into MongoDB queries.
  • The search language isn’t very rich; for example, it lacks WHERE NEAR semantics.
  • You can’t tweak the lexicon yourself.

And finally, some business and pricing notes:

  • Two big aspects of the paid-versus-free version of MongoDB (the product line) are:
    • Security.
    • Management tools.
  • Well, actually, you can get the management tools for free, but only on a SaaS basis from MongoDB (the company).
    • If you want them on premises or in your part of the cloud, you need to pay.
    • If you want MongoDB (the company) to maintain your backups for you, you need to pay.
  • Customer counts include:
    • At least 1000 or so subscribers (counting by organization).
    • Over 500 (additional?) customers for remote backup.
    • 30 of the Fortune 100.

And finally, MongoDB did something many companies should, which is aggregate user success stories for which they may not be allowed to publish full details. Tidbits include:

  • Over 100 organizations run clusters with more than 100 nodes. Some clusters exceed 1,000 nodes.
  • Many clusters deliver hundreds of thousands of operations per second (combined read and write).
  • MongoDB clusters routinely store hundreds of terabytes, and some store multiple petabytes of data. Over 150 clusters exceed 1 billion documents in size. Many manage more than 100 billion documents.
Categories: Other


Twilio: Democratizing Communications to Build a Better User Experience in the Oracle Cloud

Usable Apps - Thu, 2014-04-17 02:11

Oracle has a powerful partner ecosystem in the Oracle Cloud, adding value to our applications in many areas. Enabling partners to integrate with our cloud applications is key to Oracle’s “Extending SaaS through PaaS” approach. Sharing our expertise with partners, which helps them to productively build a great user experience (UX), is a major driver of Oracle Applications User Experience (OAUX) outreach.

One of the latest additions to the Oracle PartnerNetwork is the very cool and happening Twilio. Followers of the AppsLab know the OAUX team loves exploring the UX possibilities of Twilio-based voice and SMS integrations. I took a trip to Twilio's San Francisco HQ to meet David Wacker (@dlwacker) of Twilio Channel Sales and Partnerships and find out more about the whys and hows of integrating in the cloud and simplifying user experience...

Being in the cloud offers the potential to make a major difference with a superior UX. The days of cumbersome, on-premise installations and horrible UX are gone. Now scalable, cloud-based applications, customizable and reflecting each customer’s business, are changing the UX across datacenter management, CRM, marketing automation, and ERP, all driven through how we power communications.

Twilio is a cloud-based communications platform that offers a powerful, open API for building communications applications, what Twilio refers to as "democratizing access" to communication in a traditionally complex and expensive world of telephony.

Using Twilio, developers can easily access the means to create robust communications integrations, fundamentally changing the UX landscape for applications users in the cloud. Twilio’s open API framework means developers can utilize prebuilt solutions in the Oracle Marketing Cloud, Oracle Service Cloud, and Oracle Sales Cloud. Developers can build such UX integrations productively, without the cost and effort normally associated with such projects.
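As a sketch of how thin that API surface is, here is roughly what an outbound SMS request to Twilio's REST Messages resource looks like. The account SID and phone numbers are placeholders, and the actual authenticated POST is deliberately left out:

```python
from urllib.parse import urlencode

ACCOUNT_SID = "ACXXXXXXXXXXXXXXXX"  # placeholder, not a real SID
API_BASE = "https://api.twilio.com/2010-04-01"

def build_sms_request(to, from_, body):
    """Assemble the URL and form-encoded body for Twilio's Messages
    resource. (The authenticated HTTP POST itself is omitted here.)"""
    url = f"{API_BASE}/Accounts/{ACCOUNT_SID}/Messages.json"
    payload = urlencode({"To": to, "From": from_, "Body": body})
    return url, payload

url, payload = build_sms_request("+15551230000", "+15559870000",
                                 "Your coupon is on its way")
print(url.endswith("/Messages.json"))  # True
```

Three form fields and one endpoint are enough to send a message, which is the "democratizing access" point: the telephony complexity stays on Twilio's side of the API.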

David pointed out a few ways Twilio enhances the user experience for users of Oracle applications such as the Oracle Marketing Cloud, Oracle Service Cloud, and Oracle Sales Cloud.

Twilio’s seamless integration to the Oracle Marketing Cloud (Eloqua) means that users can just drag and drop the Twilio Cloud Connector onto a marketing campaign canvas to provide for outbound SMS, MMS (multimedia messaging), and voice calls. This delivers a great multichannel user experience, such as for mobile marketing campaigns with pictures or QR coupon codes.

Twilio Cloud Connector

Dragging the Twilio Cloud Connector onto a campaign canvas easily adds Twilio SMS, MMS, and voice to marketing campaigns.

Twilio's embedding of SMS and voice capabilities right into the Oracle Service Cloud (RightNow) means a superior customer experience built in a scalable, flexible way. A service agent can use click-to-call to phone an end customer, automatically creating the event on their system and then recording the call, for example. An SMS capability can also enable customers to chat with service agents using SMS on their phones instead of web chat, if preferred, and more.

Twilio Click-to-Call

Click-to-call for customer engagement, which allows customers to call inbound more effectively.

Twilio's integration into the Oracle Sales Cloud drives efficiency by simplifying the UX. Twilio uses the Oracle Sales Cloud native CTI toolbar to track and record phone calls, allowing for seamless conference calls, all integrated to drive sales productivity. For example, a sales rep can contact opportunities using Twilio-powered click-to-call, automated dialing, or conference line bridges, creating events and logging activities easily within the Oracle Sales Cloud.

Twilio integrated with Oracle Sales Cloud

Computer Telephony Integration (CTI) toolbar for easy access to inbound and outbound dialing in Oracle Sales Cloud powered by Twilio.

David tells me that “Twilio’s integration possibilities are endless. That's the best part about working with developers in the Twilio and Oracle communities; finding new ways to solve user problems, unconstrained by technology or traditional project limitations. I’m excited to explore new and unique ways that the Oracle developer community and Twilio can change the UX landscape in the Oracle Cloud.”

Those are some great UX insights from David, and there are more to come. The OAUX team will be working with Twilio over the coming months, so stay tuned to your usual outreach and communications channels for news and events.

Twilio is also exhibiting at, and sponsoring, Oracle CloudWorld in Chicago on Thursday, April 17, 2014. Stop by the Twilio booth to learn more (or to just say, Hi!), and give the Usable Apps blog a shout-out.

Business professionals consider moving to Office 365

Chris Foot - Thu, 2014-04-17 02:00

Many executives favor Microsoft products over competing software. Since its inception, the corporation has established itself as a developer of business-standard technology, with millions of subscribers distributed throughout the world. Due to recent improvements spearheaded by new CEO Satya Nadella, many organizations previously unfamiliar with the company's products are implementing Microsoft solutions with the help of database administration services.

Releasing a more affordable product 
Pete Pachal, a contributor to technology blog Mashable, noted that Microsoft began selling Office 365 Personal earlier this week for $6.99 a month, accommodating subscribers with applications such as Word, Excel, PowerPoint and Outlook, among others. In contrast to the solution's counterpart, Office 365 Home, Personal only allows users to install the program on a single PC or Mac. However, the offer makes sense for enterprises working primarily with such machines. 

Personal's integration with Microsoft's cloud solution, OneDrive, enables employees to share, store and edit files seamlessly. As this process expedites business operations, senior-level management may consider Office 365 to be a viable option for satisfying the needs of their departments. For those looking to abandon products manufactured by Microsoft's competitors, however, the transition may be easier said than done. 

Steps for migration 
Moving a large variety of email into Office 365 may require the assistance of database administration professionals. According to InfoWorld contributor Peter Bruzzese, corporations need to consider what information should be transitioned into Outlook, where that data is stored and whether or not it will be manipulated after all digital intelligence is successfully relocated. In order to ensure a smooth transfer, Bruzzese recommended making the following considerations:

  • Perform a preparatory review of all messaging needs and orchestrate a plan that will supplement those requirements. 
  • If a company is migrating from Exchange, database support services can help it transfer all on-premises data into the cloud through Exchange Web Services, which allows users to export 400GB a day. 
  • Those relocating data from Google, Network File Systems or Notes should consider using Archive360, which can filter data through Exchange and then transfer it into Office 365.
  • Companies transitioning email data from GroupWise could funnel the information through Mimecast and connect the storage with Office 365 mailboxes. 

Obviously, a command of certain programs is required, depending on what kind of route an organization chooses. For this reason, consulting database experts may be the best option. 

Indexing Foreign Key Constraints With Bitmap Indexes (Locked Out)

Richard Foote - Thu, 2014-04-17 01:29
Franck Pachot made a very valid comment in my previous entry on Indexing Foreign Keys (FK) that the use of a Bitmap Index on the FK columns does not avoid the table locks associated with deleting rows from the parent table. Thought I might discuss why this is the case and why only a B-Tree index does […]
Categories: DBA Blogs

SQL Developer’s Interface for GIT: Interacting with a GitHub Repository Part 2

Galo Balda's Blog - Wed, 2014-04-16 17:46

In this post I’m going to show how to synchronize the remote and local repositories after an existing file in the local repository is modified. What I’ll do is modify the sp_test_git.pls file in our local repository and then push those changes to the remote repository (GitHub).

First, I proceed to open the sp_test_git.pls file using SQL Developer, add another dbms_output line to it and save it. The moment I save the file, the Pending Changes (Git) window gets updated to reflect the change and the icons in the toolbar get enabled.

modify_file

Now I can include a comment and then add the file to the staging area by clicking on the Add button located on the Pending Changes (Git) window. Notice how the status changes from “Modified Not Staged” to “Modified Staged”.

staged_file

What if I want to compare versions before doing a commit to the local repository? I just have to click on the Compare with Previous Version icon located on the Pending Changes (Git) window.

compare2

The panel on the left displays the version stored in the local repository and the panel on the right displays the version in the Staging Area.

The next step is to commit the changes to the local repository. For that I click on the Commit button located on the Pending Changes (Git) window and then I click on the OK button in the Commit window.

commit

Now the Branch Compare window displays information telling that remote and local are out of sync.

branch_compare2

So the final step is to sync up remote and local by pushing the changes to GitHub. For that I go to the main menu and click on Team -> Git -> Push to open the “Push to Git” wizard, where I enter the URL for the remote repository and the user name and password to complete the operation. Then I go to GitHub to confirm the changes have been applied.

[Image: updated_github]


Filed under: GIT, SQL Developer, Version Control Tagged: GIT, SQL Developer, Version Control

Categories: DBA Blogs

Online Chat Available for Existing Service Requests

Joshua Solomin - Wed, 2014-04-16 16:16

An online chat session can often answer a question or clarify a situation quickly.

My Oracle Support now offers a new chat feature that enables Oracle Support engineers to contact you instantly online to discuss an open Service Request—to ask a question, share detailed commands and troubleshooting information, or confirm that your issue is resolved.

[Image: Chat]

You always control your availability for an online chat. When you are involved in critical projects or meetings, set your status to “Not Available” and the engineer will contact you using your preferred method. Keeping yourself in the “Available” status lets your Support engineer know when you are online and available for a chat about your Service Request.

If you receive a chat request from a Support engineer, you can decide to accept the chat, request a different time for the chat, or decline the chat.

Find out more: watch a short video demonstration and read additional details.

KeePass 2.26 Released

Tim Hall - Wed, 2014-04-16 16:08

KeePass 2.26 has recently been released. I would suggest going with the portable version, which is an unzip and go application.

If you want to know how I use KeePass, check out my article called Adventures with Dropbox and KeePass.

Cheers

Tim…

KeePass 2.26 Released was first posted on April 16, 2014 at 11:08 pm.
©2012 "The ORACLE-BASE Blog". Use of this feed is for personal non-commercial use only. If you are not reading this article in your feed reader, then the site is guilty of copyright infringement.

MobaXterm 7.1 Released

Tim Hall - Wed, 2014-04-16 16:03

If you are using a Windows desktop, you need MobaXterm in your life! Version 7.1 has recently been released…

I know you think you can’t live without Putty, Cygwin and/or Xming, but you really can. Give MobaXterm a go and I would be extremely surprised if you ever go back to that rag-tag bunch of apps…

Cheers

Tim…

PS. Includes “Updated OpenSSL library to 1.0.1g (for “Heartbleed Bug” correction)”

MobaXterm 7.1 Released was first posted on April 16, 2014 at 11:03 pm.

ORA-00600 [3631] recovering pluggable database after flashback database in Oracle 12c

Bobby Durrett's DBA Blog - Wed, 2014-04-16 15:44

I was trying to recreate the scenario where a 12c container database is flashed back to an SCN before the point to which I had recovered a pluggable database using point-in-time recovery.

I got this ugly ORA-00600:

RMAN> recover pluggable database pdborcl;

Starting recover at 16-APR-14
using channel ORA_DISK_1

starting media recovery
media recovery failed
RMAN-00571: ===========================================================
RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
RMAN-00571: ===========================================================
RMAN-03002: failure of recover command at 04/16/2014 06:07:40
ORA-00283: recovery session canceled due to errors
RMAN-11003: failure during parse/execution of SQL statement: alter database recover if needed
 datafile 32 , 33 , 34 , 35
ORA-00283: recovery session canceled due to errors
ORA-00600: internal error code, arguments: [3631], [32], [4096], [4210689], [], [], [], [], [], [], [], []

I think the above error message stems from this bug:

Bug 14536110: ORA-600 [ktfaput: wrong pdb] / crash using PDB and FDA

There may have been some clever way to recover from this, but I ended up just deleting and recreating the CDB through DBCA, which was good experience working with DBCA in Oracle 12c. I’m trying to learn 12c, but I have a feeling I have hit a bug that keeps me from testing this flashback-database-plus-pluggable-database-point-in-time-recovery scenario. I wonder if I should patch? I believe Oracle has included a fix for this bug in a patch set, and applying a patch set could be good 12c experience too.
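For reference, the scenario being tested looks roughly like this. It is a sketch only: the SCNs are hypothetical, and the statements reflect my understanding of 12c PDB point-in-time recovery syntax rather than a verified script.

```sql
-- 1. Point-in-time recovery of the PDB to a past SCN (in RMAN):
RUN {
  SET UNTIL SCN 1500000;              -- hypothetical SCN
  RESTORE PLUGGABLE DATABASE pdborcl;
  RECOVER PLUGGABLE DATABASE pdborcl;
}
ALTER PLUGGABLE DATABASE pdborcl OPEN RESETLOGS;

-- 2. Flashback of the whole CDB to an SCN *before* that recovery point:
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
FLASHBACK DATABASE TO SCN 1400000;    -- hypothetical, earlier SCN
ALTER DATABASE OPEN RESETLOGS;

-- 3. Recovering the PDB again is the step where the ORA-600 [3631] appeared:
RECOVER PLUGGABLE DATABASE pdborcl;
```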

- Bobby

Categories: DBA Blogs

WordPress 3.9 Released

Tim Hall - Wed, 2014-04-16 15:20

Hot on the heels of WordPress 3.8.3 comes WordPress 3.9! The downloads and changelog are in the usual places.

I’ve just manually initiated the auto-update on five blogs and all went through OK.

Cheers

Tim…

WordPress 3.9 Released was first posted on April 16, 2014 at 10:20 pm.

Retailers use business intelligence to optimize supply chain

Chris Foot - Wed, 2014-04-16 10:07

Due to the prevalence of omnichannel retail, merchandisers are obligated to satisfy the inventory fulfillment requirements of brick-and-mortar stores and consumers. Instead of using human resources to scrutinize the distribution process, commodity-based companies are hiring database experts to implement business intelligence tools capable of providing actionable information regarding the supply chain.

What's redefining modern delivery systems?
E-commerce has allowed corporations to deliver their products to consumers residing in various parts of the country, creating variable demand for particular items. In order to anticipate customer desires, data analytics tools are being used to chart regional weather conditions, translate online surveys and monitor the distribution of materials. Jim Rice, a contributor to Supply Chain 24/7, stated that while transportation and storage processes cannot change the specifications of an item, they can revolutionize the way in which that particular product is delivered to a customer.

For example, a customized, direct-to-order method can transform consumer expectations. People don't want to wait to receive their purchased materials, even though an unspoken covenant was established the minute they finalized the order on a website. Therefore, database administration personnel employ programs that scrutinize which areas of the supply chain can be optimized to ensure that products are delivered as promptly as possible. The patterns these software solutions recognize are often overlooked by human eyes.

Enhancing global sourcing
Database engineering company Oracle recently announced the introduction of Oracle Global Trade Intelligence, a global commerce analytics application that provides organizations with the ability to leverage worldwide sourcing and distribution data to measure, predict and optimize the performance of their supply chains. Released in February, the program contains modifiable dashboards that enable enterprises to construct user-defined trade performance measurements that scrutinize import and export activities throughout the world.

Oracle experts and sourcing professionals are thrilled with the release, which also offers executives the chance to streamline communications between overseas departments. This process is expected to ensure that all materials are properly tracked, significantly reducing the chance of losing vital products. In addition, the program gives strategists the ability to anticipate the actions of both foreign and domestic competitors.

"Organizations are moving beyond automation of their global trade processes and are seeking ways to leverage their global trade data to make better business decisions," said Vice President of Value Chain Execution Product Strategy Derek Gittoes.

In the age of global commerce, it's imperative that companies possess programs akin to Oracle Global Trade Intelligence in order to expedite the shipment of goods and reduce the cost of those products on the consumer's end.

Announcement: OVCA X4-2

Jean-Philippe Pinte - Wed, 2014-04-16 09:08
Oracle has announced the Oracle Virtual Compute Appliance X4-2


More information:

Monitoring Oracle Golden Gate from SQL Developer

DBASolved - Wed, 2014-04-16 07:16

Last week I was at Collaborate 14 speaking in two sessions; one of the sessions I had done a couple of times before.  The other session was about the different ways of monitoring Oracle GoldenGate (If you are curious about the presentation it can be found here).  While at the conference I ran the idea of monitoring GoldenGate from SQL Developer by a few peers and there seems to be interest.  As for Oracle, this approach to monitoring GoldenGate is not on Oracle’s road map for SQL Developer.

To achieve this goal, SQL Developer’s XML extension mechanism is needed. Using XML extensions, I’ve been able to turn my SQL-based GoldenGate monitoring queries into a working extension. The extension is not perfect and still needs some work. As you can see in Image 1, I can get the status of a GoldenGate process and its associated stats.
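For a flavor of what such an XML definition looks like, here is a minimal hypothetical sketch in the style of SQL Developer's report XML. The report name, columns, and the ggadmin.gg_status table are all invented for illustration; the real queries are in the scripts linked below.

```xml
<displays>
  <display type="" style="Table" enable="true">
    <name>GoldenGate Process Status (hypothetical)</name>
    <description>Up/down status per GoldenGate process</description>
    <query>
      <sql><![CDATA[
        -- hypothetical status table; substitute your own monitoring query
        SELECT process_name, process_type, status, lag_seconds
        FROM   ggadmin.gg_status
        ORDER  BY process_name
      ]]></sql>
    </query>
  </display>
</displays>
```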

Image 1: [screenshot of the SQL Developer GoldenGate extension]

The SQL Developer extension for Oracle GoldenGate is available for whoever would like to use it and extend on it.  This extension is included with my other GoldenGate monitoring scripts located here and on my scripts page.

Note: at some point, I will hopefully get this extension uploaded to a Github repository for community digestion.

This extension is meant to help DBAs monitor their GoldenGate environments without needing to go directly to the server. For now, it just gives up/down status and operation stats. Hopefully, as it matures (as I and others work on it), it will become a robust extension for all Oracle GoldenGate monitoring.

Enjoy!

twitter: @dbasolved

blog: http://dbasolved.com


Filed under: Golden Gate, Replication
Categories: DBA Blogs

Choice Hotels Reduces Time to Market with Oracle WebCenter

WebCenter Team - Wed, 2014-04-16 07:00
Choice Hotels is a global hotel franchise with 6000+ franchisees. Choice is in the process of completely re-building their distribution platform for the next generation.  They currently process and distribute over $6B worth of reservation activity to the franchisees of Choice Hotels. To continue to grow and support the strategic business initiatives of the company they needed to address the primary vehicle for delivering business to their customers.  Their current platform was an amalgam of 25 years of development on disparate systems that offered limited inter-operability and scalability.  A strategic decision was made to adopt the Oracle technology stack to build out this new platform.



 They introduced more self-service opportunities to their business to create greater empowerment and flexibility, thus reducing the dependency on IT to introduce new features to their production systems. They are using WebCenter Sites as their Web Experience Management platform deployed for their International Websites. Standardizing on WebCenter Sites has allowed them to consolidate four different Content Management System platforms and created a Content Center of Excellence making it easier for any of their business users to take advantage of the WYSIWYG interface.

Perhaps the biggest win at the enterprise level is standardizing on the Oracle platform. They have seen tremendous benefits by being standardized on a focused set of tools and applications.  Their development, testing, administration, and database engineers now have greater flexibility and mobility throughout their enterprise.  Since they all have the same relative skill set, they can flex on certain projects in order to get them to market quicker.  No more Sys Admins having to learn the ins and outs of a particular system and its idiosyncrasies because everything is now a standard Oracle deployment.

To learn more about delivering the best digital experience today, join us for the Oracle WebCenter Digital Business Thought Leadership Series.

This week:

Delivering Moments of Engagement Across the Enterprise
Five Steps for Mobilizing Digital Experiences

Geoffrey Bock, Principal, Bock & Company
Michael Snow, Product Marketing Director, Oracle WebCenter



An introduction into Oracle VM (V3.x)

Marco Gralike - Wed, 2014-04-16 06:00
Another internal presentation of mine covered the technical architecture and production experiences regarding Oracle VM V3. It was based on a demo from a Hands-On Lab (HOL-9870, Oracle VM) given during Oracle OpenWorld 2013. Based on this HOL, structures were explained and demoed for architects and database administrators. An introduction into Oracle

Read More...

An introduction into Oracle Enterprise Manager Cloud Control 12c Release 3

Marco Gralike - Wed, 2014-04-16 05:51
I have given some internal and customer presentations lately that you might find useful, so here is a direct share of the presentation about Oracle’s Enterprise Manager Cloud Control: An introduction into Oracle Enterprise Manager Cloud Control 12c Release 3, from Marco Gralike