
Feed aggregator

Oracle Voice Debuts on the App Store

Oracle AppsLab - Mon, 2014-08-11 16:05

Editor’s note: I meant to blog about this today, but it looks like my colleagues over at VoX have beaten me to it. So, rather than try to do a better job (read: do any work at all), I’ll just repost it. Free content, w00t!

Although I no longer carry an iOS device, I’ve seen Voice demoed many times in the past. Projects like Voice and Simplified UI are what drew me to Applications User Experience, and it’s great to see them leak out into the world.

Enjoy.

Oracle Extends Investment in Cloud User Experiences with Oracle Voice for Sales Cloud
By Vinay Dwivedi and Anna Wichansky, Oracle Applications User Experience

Oracle Voice for the Oracle Sales Cloud, officially called “Fusion Voice Cloud Service for the Oracle Sales Cloud,” is available now on the Apple App Store. This first release is intended for Oracle customers using the Oracle Sales Cloud, and is specifically designed for sales reps.


The home screen of Fusion Voice Cloud Service for the Oracle Sales Cloud is designed for sales reps.

Unless people record new information they learn (e.g., by writing it down or repeating it aloud), they forget a high proportion of it in the first 20 minutes. The Oracle Applications User Experience team has learned through its research that when sales reps leave a customer meeting with insights that can move a deal forward, it’s critical to capture important details before they are forgotten. We designed Oracle Voice so that sales reps can quickly enter notes and activities on their smartphones right after meetings, no matter where they are.

Instead of relying on slow typing on a mobile device, sales reps can enter information three times faster (pdf) by speaking to the Oracle Sales Cloud through Voice. Voice takes a user through a dialog similar to a natural spoken conversation to accomplish this goal. Since key details are captured precisely and follow-ups are quicker, deals are closed faster and more efficiently.

Oracle Voice is also multi-modal, so sales reps can switch to touch-and-type interactions for situations where speech interaction is less than ideal.

Oracle sales reps tried it first, to see if we were getting it right.

We recruited a large group of sales reps in the Oracle North America organization to test an early version of Oracle Voice in 2012. All had iPhones and spoke American English; their predominant activity was field sales calls to customers. Users had minimal orientation to Oracle Voice and no training. We were able to observe their online conversion and usage patterns through automated testing and analytics at Oracle, through phone interviews, and through speech usage logs from Nuance, which is partnering with Oracle on Oracle Voice.

Users were interviewed after one week in the trial; over 80% said the product exceeded their expectations. Members of the Oracle User Experience team working on this project gained valuable insights into how and where sales reps were using Oracle Voice, which we used as requirements for features and functions.

For example, we learned that Oracle Voice needed to recognize product- and industry-specific vocabulary, such as “Exadata” and “Exalytics,” and we requested a vocabulary enhancement tool from Nuance that has significantly improved the speech recognition accuracy. We also learned that connectivity needed to persist as users traveled between public and private networks, and that users needed easy volume control and alternatives to speech in public environments.

We’ve held subsequent trials, with more features and functions enabled, to support the 10 workflows in the product today. Many sales reps in the trials have said they are eager to get the full version and start using it every day.

“I was surprised to find that it can understand names like PNC and Alcoa,” said Marco Silva, Regional Manager, Oracle Infrastructure Sales, after participating in the September 2012 trial.

“It understands me better than Siri does,” said Andrew Dunleavy, Sales Representative, Oracle Fusion Middleware, who also participated in the same trial.

This demo shows Oracle Voice in action.

What can a sales rep do with Oracle Voice?

Oracle Voice allows sales reps to efficiently retrieve and capture sales information before and after meetings. With Oracle Voice, sales reps can:

Prepare for meetings

  • View relevant notes to see what happened during previous meetings.
  • See important activities by viewing previous tasks and appointments.
  • Brush up on opportunities and check on revenue, close date and sales stage.

Wrap up meetings

  • Capture notes and activities quickly so they don’t forget any key details.
  • Create contacts easily so they can remember the important new people they meet.
  • Update opportunities so they can make progress.

These screenshots show how to create tasks and appointments using Oracle Voice.

Our research showed that sales reps entered more sales information into the CRM system when they enjoyed using Oracle Voice, which makes Oracle Voice even more useful because more information is available to access when the same sales reps are on the go. With increased usage, the entire sales organization benefits from access to more current sales data, improved visibility on sales activities, and better sales decisions. Customers benefit too — from the faster response time sales reps can provide.

Oracle’s ongoing investment in User Experience

Oracle gets the idea that cloud applications must be easy to use. The Oracle Applications User Experience team has developed an approach to user experience that focuses on simplicity, mobility, and extensibility, and these themes drive our investment strategy. The result is key products that refine particular user experiences, like we’ve delivered with Oracle Voice.

Oracle Voice is one of the most recent products to embrace our developer design philosophy for the cloud of “Glance, Scan, & Commit.” Oracle Voice allows sales reps to complete many tasks at what we call glance and scan levels, which means keeping interactions lightweight, or small and quick.

Are you an Oracle Sales Cloud customer?

Oracle Voice is available now on the Apple App Store for Oracle customers using the Oracle Sales Cloud. It’s the smarter sales automation solution that helps you sell more, know more, and grow more.

Will you be at Oracle OpenWorld 2014? So will we! Stay tuned to the VoX blog for when and where you can find us. And don’t forget to drop by and check out Oracle Voice at the Smartphone and Nuance demo stations located at the CX@Sales Central demo area on the second floor of Moscone West.

The Business Value In Training

Rittman Mead Consulting - Mon, 2014-08-11 14:59

One of the main things I get asked to do here at Rittman Mead is deliver the OBIEE front-end training course (TRN 202). This is a great course that has served both us and our clients well over the years. It has always been in high demand and has always been delivered to great feedback from those in attendance. However, as with all things in life and business, there is always room for improvement and opportunity to provide even more value to our clients. Of all the feedback I receive from delivering the course, my favorite is that we do an incredible job of delivering both the content and real business scenarios showing how we have used this tool in the consulting field. Without fail, attendees will ask me how a feature works and how I have used it with current and former clients.

This year at KScope ’14 in Seattle, we were asked to deliver a two-hour front-end training course. Our normal front-end course runs over two days and covers just about every feature you can use, all the way from Answers and Dashboards to BI Publisher. Before the invitation to KScope ’14, we had been toying with the idea of delivering a course that not only teaches attendees how to navigate OBIEE and use its features, but also emphasizes the business value behind why those features exist in the first place. We felt that too often users are given a quick overview of what the tool includes, but are left to figure out on their own how to extract the most value. It is one thing to create a graph in Answers, and another to know what the best graph to use might be. So in preparation for the KScope session, we decided to build the content around not only how to develop in OBIEE, but also why, as a business user, you would choose one layout/graph/feature over another. As you would expect, the turnout for the session was fantastic: more than 70 people pre-registered, with another 10 on the waiting list. This was proof that there is as pressing a need to pull business value out of the tool as there is to simply learn how to use it. We were so encouraged by the attendance and feedback from this event that we spent the next several weeks developing what we call the “Business Enablement Bootcamp”. It is a 3-day course that covers Answers, Dashboards, the Action Framework, BI Publisher, and the new Mobile App Designer. This is an exciting time for us, in that we not only get to show people how to use all of the great features built into the tool, but also get to incorporate years of consulting experience and hundreds of client engagements right into the content. Below I have listed a breakdown of the material and the value it will provide.

Answers

Whenever we deliver our OBIEE 5-day bootcamp, which covers everything from the infrastructure to the front end, Answers is one of the key components that we teach. Answers is the building block for analysis in OBIEE. While this portion of the tool is relatively intuitive to get started with, there are many valuable nuances and settings that can get overlooked without proper instruction. In order to get the most out of the tool, a business user needs to be able not only to create basic analyses, but also to use many of the advanced features such as hierarchical columns, master-detail linking, and selection steps. Knowing how and why to use these features is a key component of gaining valuable insight for your business users.

Dashboards

This one in particular is dear to my heart. To create an analysis and share it on a dashboard is one thing; to tell a particular story with a series of visualizations strategically placed on a dashboard is something entirely different. Like anything else in business intelligence, optimal visualization and best practices are learned skills that take time and practice. Valuable skills like making the most of your white space, choosing the correct visualizations, and formatting will be covered. When you provide your user base with the knowledge and skills to tell the best story, there will be no time wasted on clumsy iterations and guesswork as to the best way to present your data. This training will provide some simple parameters to work within, so that users can quickly gather requirements and develop dashboards with more polish and relevance than ever before.

Dashboard

Action Framework

Whenever I deliver any form of front-end training, I always feel like this piece of OBIEE is either overlooked, undervalued, or both. This is because most users are either unaware of its use, or don’t have a clear idea of its value and functionality. It’s as if it is viewed as an add-on, simply a nice extra feature. But when users are properly taught how to navigate the action framework, or given a demonstration of its value, it becomes an invaluable piece of the stack. In order to get the most out of your catalog, users need to be shown how to strategically place action links that give them the ability to drill across to other analyses and add more context for discovery. These are just a few of the capabilities within the action framework that, when users are shown how and when to use them, can add valuable insight (not to mention convenience) to an organization.

BI Publisher/Mobile App Designer

Along with the action framework, this particular piece of the tool has a tendency to get overlooked, or to simply give users cold feet about implementing it to complement Answers. I would actually have agreed with those feelings before the release of 11.1.1.7: before that release, a user needed fairly advanced knowledge of data modeling. Now, however, users can simply pick any subject area and use the report creation wizard to be off and running, creating pixel-perfect reports in no time. The new Mobile App Designer on top of the Publisher platform is another welcome addition to the tool. Being the visual person that I am, I think this is where the pixel-perfect tool really shines: objects just look a lot more polished right out of the box, without the time spent formatting that you would need in Answers. During training, attendees will be exposed to many of the new features within BI Publisher and the Mobile App Designer, as well as how to use them to complement Answers and dashboards.

Third Party Visualizations

While the ability to implement third-party visualizations like D3 and Flot into OBIEE is more of an advanced skill, the market and the need for it seem to be growing. While Oracle has done some good things in past releases with new visualizations like performance tiles and waterfall charts, we all know that business requirements can be demanding at times and may require going elsewhere to appease the masses. You can visit https://github.com/mbostock/d3/wiki/Gallery to see some of the visualizations available beyond what ships with OBIEE. During training, attendees will learn the value of when and why external visualizations might be useful, as well as getting a high-level view of how they can be implemented.

Bullet Chart

Users often make the mistake of viewing each piece of the front-end stack as a separate entity, and without proper training this is very understandable. Even though they are separate pieces of the product, they are all meant to work together and enhance the “Business Intelligence” of an organization. Without training the business on how each piece complements the others, the product will always be viewed as just another frustrating tool that they don’t have enough time to learn on their own. This tool is meant to empower your organization with everything it needs to make the most informed and timely decisions; let us use our experience to enable your business.

Categories: BI & Warehousing

Websites: What to look for in a database security contract

Chris Foot - Mon, 2014-08-11 10:28

When shopping for a world-class database administration service, paying attention to what specialists can offer in the way of protection is incredibly important. 

For websites storing thousands or even millions of customer logins, constantly monitoring server activity is essential. A recent data breach showed just how vulnerable e-commerce companies, Software-as-a-Service providers and a plethora of other online organizations are. 

A staggering number 
A Russian criminal organization known as "CyberVor" recently collected 1.2 billion unique username and password combinations and 500 million email addresses from websites using lackluster protection techniques, Infosecurity Magazine reported.

Andrey Dulkin, senior director of cyber innovation at CyberArk, noted the attack was orchestrated by a botnet, a collection of machines working to achieve the same end goal. CyberVor carefully employed multiple infiltration techniques simultaneously in order to harvest login data.

Where do DBAs come into play? 
Active database monitoring is essential to protect the information websites hold for their subscribers and patrons. Employing anti-malware is one thing, but being able to perceive actions occurring in real time is the only way organizations can hope to deter infiltration attempts at their onset.

Although TechTarget was referring to disaster recovery, the same principles of surveillance apply to protecting databases. When website owners look at the service-level agreement, the database support company should provide the following accommodations:

  • Real-time reporting of all server entries, detailing which users entered an environment, how they're interacting with it and what programs they're using to navigate it. 
  • Frequent testing that searches for any firewall vulnerabilities, unauthorized programs, suspicious SQL commands, etc. 
  • On-call administrators capable of assessing any questions or concerns a website may have.

Applying basics, then language 
Although advanced analytics and tracking cookies can be applied to actively search for and eliminate viruses – like how white blood cells attack pathogens – neglecting to cover standard security practices obviously isn't optimal. 

South Florida Business Journal acknowledged that one of the techniques CyberVor used exploits a vulnerability IT professionals have been cognizant of for the past decade: SQL injection. This particular tactic likely involved one of the criminals ordering the SQL database to unveil all of its usernames and passwords. 
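
To make the defense concrete, here is a minimal sketch (mine, not the article’s) of the difference between concatenating user input into SQL and using bind variables, written in Python with the cx_Oracle driver; the table and column names are hypothetical:

import cx_Oracle  # any DB-API driver exposes the same bind-variable pattern

def find_user(cursor, username):
    # Unsafe: concatenation lets input like "' OR '1'='1" rewrite the query
    # and dump every row:
    #   cursor.execute("SELECT username FROM users "
    #                  "WHERE username = '" + username + "'")
    #
    # Safe: a bind variable keeps the input as data, never as SQL text.
    cursor.execute(
        "SELECT username FROM users WHERE username = :name",
        name=username,
    )
    return cursor.fetchall()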

SQL Server, Microsoft's signature database solution, is quite popular among many websites, so those using this program need to contract DBA organizations with extensive knowledge of the language and best practices. 

Finally, remote DBA services must be capable of encrypting login information, as well as the data passwords are protecting. This provides an extra layer of protection in case a cybercriminal manages to unmask a username-password combination. 
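
The article says "encrypting"; for stored passwords specifically, the standard practice is salted, deliberately slow hashing rather than reversible encryption. A minimal Python sketch of that idea (the iteration count is illustrative, not a figure from the article):

import hashlib
import hmac
import os

def hash_password(password, iterations=100000):
    # Store a random salt plus a slow PBKDF2 digest instead of the password,
    # so a stolen table does not yield usable username-password combinations.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password, salt, digest, iterations=100000):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    # Constant-time comparison avoids leaking how many bytes matched.
    return hmac.compare_digest(candidate, digest)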

The post Websites: What to look for in a database security contract appeared first on Remote DBA Experts.

dotNet transaction guard

Laurent Schneider - Mon, 2014-08-11 10:16

Also with ODP.NET in 12c, you can check the commit outcome, just as in JDBC.

Let’s create a table with a deferred primary key:


create table t (x number primary key deferrable initially deferred);

Here is an interactive PowerShell demo.


PS> [Reflection.Assembly]::LoadFile("C:\oracle\product\12.1.0\dbhome_1\ODP.NET\bin\4\Oracle.DataAccess.dll")

GAC    Version        Location
---    -------        --------
True   v4.0.30319     C:\Windows\Microsoft.Net\assembly\GAC_64\Oracle.DataAccess\v4.0_4.121.1.0__89b483f429c47342\Oracle.DataAccess.dll

I first load the assembly. Some of my frequent readers may prefer Load("Oracle.DataAccess, Version=4.121.1.0, Culture=neutral, PublicKeyToken=89b483f429c47342") rather than hardcoding the Oracle home directory.

PS> $connection=New-Object Oracle.DataAccess.Client.OracleConnection("Data Source=DB01; User Id=scott; password=tiger")

create the connection

PS> $connection.open()

connect

PS> $cmd = new-object Oracle.DataAccess.Client.OracleCommand("insert into t values (1)",$connection)

prepare the statement

PS> $txn = $connection.BeginTransaction()

begin transaction

PS> $ltxid = ($connection.LogicalTransactionId -as [byte[]])

Here I have my logical transaction id. Whatever happens to my database server (crash, switchover, restore, core dump, network disconnection), I have a logical id, and I will check it later.


PS> $cmd.executenonquery()
1

One row inserted


PS> $connection2=New-Object Oracle.DataAccess.Client.OracleConnection("Data Source=DB01; User Id=scott; password=tiger")
PS> $connection2.open()

I create a second connection to monitor the first one. Monitoring your own session would be unsafe, and is not possible.


PS> $txn.Commit()

Commit, no error.


PS> $connection2.GetLogicalTransactionStatus($ltxid)
     Committed     UserCallCompleted
     ---------     -----------------
          True                  True

It is committed: I see Committed=True from $connection2. This is what I expected.

Because I have a primary key, let’s retry and see what happens.


PS> $txn = $connection.BeginTransaction()
PS> $ltxid = ($connection.LogicalTransactionId -as [byte[]])
PS> $cmd.executenonquery()
1
PS> $txn.Commit()
Exception calling "Commit" with "0" argument(s): "ORA-02091: transaction rolled back
ORA-00001: unique constraint (SCOTT.SYS_C004798) violated"
At line:1 char:1
+ $txn.Commit()
+ ~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : OracleException
PS> $connection2.GetLogicalTransactionStatus($ltxid)
     Committed     UserCallCompleted
     ---------     -----------------
         False                 False

The commit fails, and from $connection2 we see it is not committed. This is a huge step toward integrity, as Oracle tells you the outcome of the transaction.

We see Committed=False.

Offline Visualization of Azkaban Workflows

Pythian Group - Mon, 2014-08-11 07:51

As mentioned in my past adventures, I’m often working with the workflow management tool ominously called Azkaban. Its foreboding name is not really deserved; it’s relatively straightforward to use, and it offers a fairly decent workflow visualization. For that last part, though, there is a catch: to be able to visualize the workflow, you have to (quite obviously) upload the project bundle to the server. Mind you, it’s not that much of a pain, and could easily be managed by, say, a Gulp-fueled watch job. But still, it would be nice to tighten the feedback loop and be able to look at the graphs without having to go through the server at all.

Happily enough, all the information we need is available in the Azkaban job files themselves, and in a format that isn’t too hard to deal with. Typically, a job file will be called ‘foo.job’ and look like this:

type=command
command=echo "some command goes here"
dependencies=bar,baz

So what we need to do to figure out a whole workflow is to begin at its final job, and recursively walk down all its dependencies.

use 5.12.0;

use Path::Tiny;

sub create_workflow {
  my $job = path(shift);
  my $azkaban_dir = $job->parent;

  my %dependencies;

  my @files = ($job);

  while( my $file = shift @files ) {
    my $job = $file->basename =~ s/\.job//r;

    next if $dependencies{$job}; # already processed

    my @deps = map  { split /\s*,\s*/ }
               grep { s/^dependencies=\s*// }
                    $file->lines( { chomp => 1 } );

    $dependencies{$job} = \@deps;

    push @files, map { $azkaban_dir->child( $_.'.job' ) } @deps;
  }

  return %dependencies;
}

Once we have that dependency graph, it’s just a question of drawing the little boxes and the little lines. Which, funnily enough, is a much harder job than one would expect, and better left to the pros. In this case, I decided to go with Graph::Easy, which outputs text and SVG.

use Graph::Easy;

my $graph = Graph::Easy->new;

while( my( $job, $deps ) = each %dependencies ) {
    $graph->add_edge( $_ => $job ) for @$deps;
}

print $graph->as_ascii;

And there we go. Putting those two parts together in a small script gives us a handy CLI workflow visualizer.

$ azkaban_flow.pl target/azkaban/foo.job

  +------------------------+
  |                        v
+------+     +-----+     +-----+     +-----+
| zero | --> | baz | --> | bar | --> | foo |
+------+     +-----+     +-----+     +-----+
               |                       ^
               +-----------------------+

Or, for the SVG-inclined,

$ azkaban_flow.pl -f=svg target/azkaban/foo.job

which gives us

[SVG rendering of the workflow graph]
Categories: DBA Blogs

Rittman Mead and Oracle Big Data Appliance

Rittman Mead Consulting - Mon, 2014-08-11 07:00

Over the past couple of years Rittman Mead have been broadening our skills and competencies out from core OBIEE, ODI and Oracle data warehousing into the new “emerging” analytic platforms: R and database advanced analytics, Hadoop, cloud and clustered/distributed systems. As we talked about in the recent series of updated Oracle Information Management Reference Architecture blog posts and my initial look at the Oracle Big Data SQL product, our customers are increasingly looking to complement their core Oracle analytics platform with ones that can handle unstructured and big data, and as technologists we’re always interested in what else we can use to help our customers get more insight out of their (total) dataset.

An area we’ve particularly focused on over the past year has been Hadoop and R analysis, with the recent announcement of our partnership with Cloudera and the recruitment of a big data and advanced analytics team operating out of our Brighton, UK office. We’ve also started to work on a number of projects and proofs of concept with customers in the UK and Europe, working mainly with core Oracle BI, DW and ETL customers looking to make their first move into Hadoop and big data. The usual pattern is for us to engage with some business users looking to analyse a dataset hitherto too large or too unstructured to load into their Oracle data warehouse, or where they recognise the need for more advanced analytics tools such as R, MapReduce and Spark but need some help getting started. Most often we put together a PoC Hadoop cluster for them using virtualization technology on existing hardware they own, allowing them to get started quickly and with no initial licensing outlay, with our preferred Hadoop distribution being Cloudera CDH, the same Hadoop distribution that comes on the Oracle Big Data Appliance. Projects then typically move on to Hadoop running directly on physical hardware, in a couple of cases Oracle’s Big Data Appliance, usually in conjunction with Oracle Database, Oracle Exadata and Oracle Exalytics for reporting.

One such project started with the customer wanting to analyse a dataset that was too large for the space available in their Oracle database and that they couldn’t easily process or analyse using the SQL-based tools they usually used; in addition, like most large organisations, database and hardware provisioning took a long time and they needed to get the project moving quickly. We came in and quickly put together a virtualised Hadoop cluster for them, on re-purposed hardware and using the free (Standard) edition of Cloudera CDH4, and then used the trial version of Oracle Big Data Connectors along with SFTP transfers to get data into the cluster and analysed.


The PoC itself then ran for just over a month with the bulk of the analysis being done using Oracle R Advanced Analytics for Hadoop, an extension to R that allows you to use Hive tables as a data source and create MapReduce jobs from within R itself; the output from the exercise was a series of specific-answer-to-specific-question R graphs that solved an immediate problem for the client, and showed the value of further investment in the technology and our services – the screenshot below shows a typical ORAAH session, in this case analyzing the flight delays dataset that you can also find on the Exalytics server and in smaller form in OBIEE 11g’s SampleApp dataset.


That project has now moved on to a larger phase of work, with Oracle Big Data Appliance used as the Hadoop platform rather than VMs, and Cloudera Hadoop upgraded from the free, unsupported Standard version to Cloudera Enterprise. The VMs in fact worked pretty well, and had the advantage that they could be quickly spun up and housed temporarily on an existing server, but they were restricted by the RAM we could assign to each VM (2GB initially, quickly upgraded to 8GB per VM) and by the fact that they were sharing CPU and IO resources. Big Data Appliance, by contrast, has 64GB of RAM per node, something that’s increasingly important now that in-memory tools like Impala are being used, and has InfiniBand networking between the nodes as well as fast network connections out to the wider network, something that’s often overlooked when speccing up a Hadoop system.

The support setup for the BDA is pretty good as well; from a sysadmin perspective there’s a lights-out ILOM console for low-level administration, as well as plugins for Oracle Enterprise Manager 12c (screenshot below), and Oracle support the whole package, typically handling the hardware support themselves and delegating to Cloudera for more Hadoop-specific queries. I’ve raised several SRs on client support contracts since starting work on BDAs, and I’ve not had any problem with questions not being answered or buck-passing between Oracle and Cloudera.

One thing that’s been interesting is the amount of work you need to do with the Big Data Appliance, beyond the actual installation and initial configuration by Oracle, to “on-board” it into the typical enterprise environment. BDAs are left with customers in a fully-working state, but like Exalytics and Exadata, initial install and configuration is just the start: you’ve then got to integrate the platform with your corporate systems and get developers on-boarded onto it. Tasks we’ve typically provided assistance with on projects like these include:

  • Configuring Cloudera Manager and Hue to connect to the corporate LDAP directory, and working with their security team to create LDAP groups for developer and administrative access that we then used to restrict and control access to these tools
  • Configuring other tools such as RStudio Server so that developers can be more productive on the platform
  • Putting in place an HDFS directory structure to support incoming data loads and data archiving, as well as directories to hold the output datasets from the analysis work we’re doing – all within the POSIX security setup that HDFS currently uses which limits us to just granting owner, group and world permissions on directories
  • Working with the client’s infrastructure team on things like alerting, troubleshooting and setting up backup and recovery – something that’s surprisingly tricky in the Hadoop world as Cloudera’s backup tools only backup from Hadoop-to-Hadoop, and by definition your Hadoop system is going to hold a lot of data, the volume of which your current backup tools aren’t going to easily handle

Once things are set up, though, you’ve got a pretty comprehensive platform that can be expanded from the initial six nodes our customers’ systems typically start with to the full eighteen-node cluster, and that can use tools such as ODI to do data loading and movement, Spark and MapReduce to process and analyse data, and Hive, Impala and Pig to provide end-user access. The diagram below shows a typical future-state architecture we propose for clients on this initial BDA “starter config”, where we’ve moved up to CDH5.x, with Spark and YARN generally used as the processing framework and with additional products such as MongoDB used for document-type storage and analysis:


Something that’s turned out to be more of an issue on projects than I’d originally anticipated is complying with corporate security policies. By definition, most customers who buy an Oracle Big Data Appliance are going to be large customers with an existing Oracle database estate, and if they deal with the public they’re going to have pretty strict security and privacy rules you’ll need to adhere to. Something that’s surprising, therefore, to most customers new to Hadoop is how insecure, or at least easily compromised, the average Hadoop cluster is, with Hadoop FS shell security relying on trusted networks, and incoming user connections and interfaces such as ODBC not checking passwords at all.

Hadoop and the BDA only become what’s termed “secure” when you link them to a Kerberos server, but not every customer has Kerberos set up, and unless you enable this feature right at the start when you set up the BDA, it’s a fairly involved task to add retrospectively. Moreover, customers are used to fine-grained access control to their data, a single security model over their data, and a good understanding in their heads of how security works on their database, whereas Hadoop is still a collection of fairly loosely coupled components with pretty primitive access controls, and no easy way to delete or redact data when, for example, a particular country’s privacy laws in theory mandate this.

Like everything there’s a solution if you’re creative enough, with tools such as Apache Sentry providing role-based access control over Hive and Impala tables, alternative storage tools like HBase that permit read, write, update and delete operations on data rather than just HDFS’s insert and (table or partition-level) delete, and tools like Cloudera Navigator and BDA features like Oracle Audit Vault that provide administrators with some sort of oversight as to who’s accessing what data and when. As I mentioned in my blog post a couple of weeks ago, Oracle’s Big Data SQL product addresses this requirement pretty well, potentially allowing us to apply Oracle security over both relational, and Hadoop, datasets, but for now we’re working within current CDH4 capabilities and planning on introducing Apache Sentry for role-based access control to Hive and Impala in the coming weeks. We’re also looking at implementing Cloudera’s “secure gateway” cluster topology with all access restricted to just a single gateway Hadoop node, and the cluster itself firewalled-off with external access to just that gateway node and HTTP / REST API access to the various cluster services, for example as shown in the diagram below:


My main focus on Hadoop projects has been on the overall Hadoop system architecture, and on interacting with the client’s infrastructure and security teams to help them adopt the BDA and take over its maintenance. The analysis side has been equally interesting, with a number of projects using tools such as R, Oracle R Advanced Analytics for Hadoop and core Hive/MapReduce for data analysis, Flume, Java and Python for data ingestion and processing, and most recently OBIEE 11g for publishing the results out to a wider audience. Following the development model that we outlined in the second post in our updated Information Management Reference Architecture blog series, we typically split delivery of each project’s output into two distinct phases: a discovery phase, typically done using RStudio and Oracle R Advanced Analytics for Hadoop, where we explore and start understanding the dataset, presenting initial findings to the business and using their feedback and direction to inform the second phase; and a second, commercial exploitation phase where we use the discovery phase’s outputs and models to drive a more structured dimensional model, with output in the form of OBIEE analyses and dashboards.


We looked at several options for providing the datasets for OBIEE to query, with our initial idea being to connect OBIEE directly to Hive and Impala and let the users query the data in-place, directly on the Hadoop cluster, with an architecture like the one in the diagram below:


In fact this turned out not to be possible: whilst OBIEE 11.1.1.7 can access Apache Hive datasources, it currently only ships with HiveServer1 ODBC support, and no support for Cloudera Impala, which means we need to wait for a subsequent release of OBIEE 11g to be able to report against the ODBC interfaces provided by CDH4 and CDH5 on the BDA (although ironically, you can get HiveServer2 and Impala working with OBIEE 11.1.1.7 on Windows, though this platform isn’t officially supported by Oracle for Hadoop access, only Linux). Whichever way though, it soon became apparent that even if we could get Hive and Impala access working, in reality it made more sense to use Hadoop as the data ingestion and processing platform – providing access to data analysts at this point if they wanted access to the raw datasets – with the output of this then being loaded into an Oracle Exadata database, either via Sqoop or via Oracle Loader for Hadoop, ideally orchestrated by Oracle Data Integrator 12c, and users then querying these Oracle tables rather than the Hive and Impala ones on the BDA, as shown in the diagram below.


In practice, Oracle SQL is far more complete and expressive than HiveQL and Impala SQL, and it makes more sense to use Oracle as the query platform for the vast majority of users, with data analysts and data scientists still able to access the raw data on Hadoop using tools like Hive, R and (when we move to CDH5) Spark.

The final thing that’s been interesting about working on Hadoop and Big Data Appliance projects is that 80% of it, in my opinion, is just the same as working on large enterprise data warehouse projects, with 20% being “the magic”. A large portion of your time is spent on analysing and setting up feeds into the system; it’s just that in this case you use tools like Flume instead of GoldenGate (though GoldenGate can also load into HDFS and Hive, something that’s useful for transactional database data sources vs. Flume’s focus on file and server log data sources). Another big part of the work is data processing, ingestion, reformatting and combining, again skills an ETL developer would have (though there’s much more reliance, at this point, on command-line tools and Unix utilities, albeit with a place for tools like ODI once you get to the set-based filtering, joining and aggregating phase). In most cases, the output of your analysis and processing will be Hive and Impala tables so that results can be analysed using tools such as OBIEE, and you therefore need skills in areas such as dimensional modelling, business analysis and dashboard prototyping, as well as tool-specific skills such as OBIEE RPD development.

Where the “magic” happens, of course, is the data preparation and analysis that you do once the data is loaded, quite intensively and interactively in the discovery phase and then in the form of MapReduce and Spark jobs, Sqoop loads and Oozie workflows once you know what you’re after and need to process the data into something more tabular for tools like OBIEE to access. We’re building up a team competent in techniques such as large-scale data analysis, data visualisation, statistical analysis, text classification and sentiment analysis, and the use of NoSQL and JSON-type data sources, which combined with our core BI, DW and ETL teams allows us to cover a project from end to end. It’s still relatively early days, but we’re encouraged by the response from our project customers so far – and, to be honest, by the quality of the Oracle big data products and the Cloudera platform they’re based around – and we’re looking forward to helping other Oracle customers get the most out of their adoption of these new technologies.

If you’re an Oracle customer looking to make your first move into the worlds of Hadoop, big data and advanced analytics techniques, feel free to drop me an email at mark.rittman@rittmanmead.com for some initial advice and guidance – the fact that we come from an Oracle-centric background typically makes it easier for us to relate these new concepts to the ones you’re more familiar with. Similarly, if you’re about to bring on board an Oracle Big Data Appliance system and want to know how best to integrate it with your existing Oracle BI, DW, data integration and systems management estate, get in contact and I’d be happy to share experiences and our delivery approach.

Categories: BI & Warehousing

Vote for Rittman Mead at the UKOUG Partner of the Year Awards 2014!

Rittman Mead Consulting - Mon, 2014-08-11 03:00

Rittman Mead are proud to announce that we’ve been nominated by UKOUG members and Oracle customers for five categories in the upcoming UKOUG Partner of the Year Awards 2014: Business Intelligence, Training, Managed Services, Operating Systems Storage and Hardware, and Emerging Partner, reflecting the range of products and services we now offer for customers in the UK and around the world.


Although Rittman Mead are a worldwide organisation with offices in the US, India, Australia and now South Africa, our main operation is in the UK and for many years we’ve been a partner member of the UK Oracle User Group (UKOUG). Our consultants speak at UKOUG Special Interest Group events as well as the Tech and Apps conferences in December each year, we write articles for Oracle Scene, the UKOUG members’ magazine, and several of our team including Jon and myself have held various roles including SIG chair and deputy chair, board member and even editor of Oracle Scene.

Partners, along with Oracle customers and of course Oracle themselves, are a key part of the UK Oracle ecosystem and to recognise their contribution the UKOUG recently brought in their Partner of the Year Awards that are voted on by UKOUG members and Oracle customers in the region. As these awards are voted on by actual users and customers we’ve been especially pleased over the years to win several Oracle Business Intelligence Partner of the Year Gold awards, and last year we were honoured to receive awards in five categories, including Business Intelligence Partner of the Year, Training Partner of the Year and Engineered Systems Partner of the Year.

This year we’ve been nominated again in five categories, and if you like what we do we’d really appreciate your vote, which you can cast at any time up to the closing date, September 15th 2014. Voting is open to UKOUG members and Oracle customers and only takes a few minutes; the voting form is here, and you don’t need to be a UKOUG member, only an Oracle end-user or customer. These awards are a great recognition for the hard work our team puts in, so thanks in advance for any votes you can put in for us!

Categories: BI & Warehousing

Test your WebLogic 12.1.3 environment with Robot

Edwin Biemond - Sun, 2014-08-10 11:42
Robot Framework is a generic test automation framework with an easy-to-use tabular test data syntax that utilizes the keyword-driven testing approach. This means we can write our tests in readable and understandable text. If we combine this with the REST Management interface of WebLogic 12.1.3, we are able to test every detail of a WebLogic domain configuration and when we combine this
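
To give a flavor of what that REST interface offers outside of Robot, here is a minimal Python sketch that polls WebLogic’s RESTful Management Services and checks a server’s state. The host, port, credentials and JSON response shape are assumptions for illustration (and RESTful Management Services must be enabled on the domain), not details from the post:

import requests

BASE = "http://localhost:7001/management/tenant-monitoring"

def server_state(name, auth=("weblogic", "welcome1")):
    # GET the monitoring resource for one server and pull out its state.
    resp = requests.get(BASE + "/servers/" + name, auth=auth,
                        headers={"Accept": "application/json"})
    resp.raise_for_status()
    return resp.json()["body"]["item"]["state"]

# A Robot keyword like "Server Should Be Running" could wrap a check such as:
assert server_state("AdminServer") == "RUNNING"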

Asus Zenfone Smartphone Android Terbaik

Daniel Fink - Sat, 2014-08-09 18:51
Asus Zenfone Smartphone Android Terbaik - This isn’t bad even if you have never taken baking classes before, because the training process takes you from basic skills to more advanced work. Some of these schools also offer internship programs, so you can practice what you have learned under the direction of a master chocolatier.

But nothing compares to education right there in the classroom. You just have to make the time, and if you can’t make it to classes on weekdays, see if there are any offered on weekends.

The cost of tuition at chocolate-making schools varies depending on the type of program and on whether it is taken in the classroom or at home. Those who choose to learn in the classroom don’t have to worry, because all the materials they need will be provided. Those at home have to buy these from the craft store and make do with what they have.

Learning how to make chocolate with the help of trained professionals is far better than trying to perfect the craft through trial and error. After all, there is a certain science to mixing the ingredients, and a little bit of marketing involved if you are going to sell the product on the market.

Once you get the hang of things, you can try some experiments to create concoctions of your own. After all, chocolates don’t always come in boxes.

The Different Processes in Making Chocolate

There are different ways in which you can learn to make chocolate. The first thing you have to find out is where these delicious treats come from. Most of you will already know the answer: chocolates are made from the beans of the cocoa tree.

From the trees to the chocolate manufacturers, how did these processes really evolve? Through time, there have been many developments in chocolate making. Technology has benefited plenty of life’s endeavors, and this also applies to the process of chocolate making.

But such advancement only applies to the gathering part. The processing basically remains the same, done the old, conventional way. As the saying goes, don’t fix a thing if it isn’t broken. Maybe the same rule is being applied to this venture.

It feels good to eat chocolate. But do you want to know about the various methods that go into making it? Here are some.

Roasting

It takes a good amount of roasting, as well as cocoa seed fermentation, to come up with the quality of chocolate you are looking for. In the pre-roasting stage, the beans are passed under infrared radiant heaters. This process separates the nibs of the beans from the shells. The temperature for this part is 100 to 140 degrees Celsius, and it takes about twenty to forty minutes.

Roasting can also be done directly. After the beans are roasted, the shells can be easily removed. This is favored by most chocolate manufacturers because it retains the flavor of the beans. For this part, the temperature is 150 to 160 degrees Celsius.

Fermentation

This is done to decrease the levels of sugars, glucose as well as fructose, and amino acids in the beans. It brings out the flavor of the beans, which the roasting process will then enhance. But not just anyone can do this; it takes a master to hone this craft. Beans will rot if something goes wrong with this process.

Shelling

Removing the shells from the beans takes more processing than you might imagine, including grinding and then winnowing. Every step is important in order to come up with grains of the right size.

Tasting

If you think this is an easy task, that seems not to be the case. It involves skill and experience. One must have studied every taste of the different kinds and variations of chocolate to be able to judge which varieties should be brought to market.

These people can be compared to wine experts. Just one bite of a chocolate treat can tell them what processes it went through, what kind of beans were used, or where it was actually made. And there are still different kinds of chocolates out there in the market. Imagine what all of those have to go through just to reach your favorite grocery store so that you can buy them for your own consumption.

You don’t have to be an expert in making chocolate, but you can start practicing some techniques in the tasting part. If you are treated to a filled chocolate, let it linger in your mouth until it melts and you can taste all its flavors. You can then chew it about five times, enough for the flavor and the coating to blend.

Chocolate has a timeless charm that hooks many a person with a sweet tooth. Then again, some chocolates are really expensive. In reality, given a few tips and tricks, you can actually make your own chocolate, save yourself money, and increase your delight thanks to your own creation.

Federal Reserve Board backs up e-Literate in criticism of Brookings report on student debt

Michael Feldstein - Sat, 2014-08-09 13:30

I have been very critical of the Brookings Institution report on student debt, particularly in my post “To see how illogical the Brookings Institution report on student loans is, just read the executive summary”.

D’oh! It turns out that real borrowers with real tax brackets paying off real loans are having real problems. The percentage at least 90 days delinquent has more than doubled in just the past decade. In fact, based on another Federal Reserve report, the problem is much bigger for the future: “44% of borrowers are not yet in repayment, and excluding those, the effective 90+ delinquency rate rises to more than 30%”.
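
The arithmetic behind that last step is worth a line; the overall figure below is back-calculated for illustration, not quoted from the report. If roughly 17% of all borrowers are 90+ days delinquent and 44% are not yet in repayment, then among borrowers actually in repayment the effective rate is 0.17 / (1 − 0.44) ≈ 0.30, i.e. about 30%.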

More than 30% of borrowers who should be paying off their loans are at least 90 days delinquent? It seems someone didn’t tell them that their payment-to-income ratios (at least for their mythical average friends) are just fine and that they’re “no worse off”.

Well, now the Federal Reserve Board itself weighs in on the subject with a new survey, at least as described by an article in The Huffington Post. I have read the Fed report and concur with the Huffington Post’s analysis – it does argue against the Brookings findings.

Among the emerging risks spotlighted by the survey is the nation’s $1.3 trillion in unpaid student debt, suggesting that high levels of student debt are crimping the broader economy. Nearly half of Americans said they had to curb their spending last year in order to make payments on student loans, adding weight to the fear among federal financial regulators that the burden of student debt on households will depress economic growth for years to come.

Some 35 percent of survey respondents who are paying back student loans said they had to reduce their spending by “a little” over the past year to keep up with their student debt payments. Another 11 percent said they had to cut back their spending by “a lot.”

The Fed’s findings appear to challenge recent research by a pair of economists at the Brookings Institution, highlighted in The New York Times and cited by the White House, that argues that households with student debt are no worse off today than they were two decades ago.

The full Fed report can be found here. Much of the survey was focused on borrowers and their perceptions of how their student loans impact them, which is much more reliable than Brookings’ assumptions on how convoluted financial ratios should affect borrowers. In particular, consider this table:

Fed Table 11

Think about this situation – amongst borrowers who have completed their degrees, almost equal numbers think the financial benefits of a degree outweigh the costs as think the opposite (41.5% to 38.1%). I don’t see this as an argument against getting a degree, but rather as clear evidence that the student loan crisis is real and will have a big impact on the economy and future student decision-making.

Thanks to the Federal Reserve Board for helping us out.

Update: Clarified that this is Federal Reserve Board and not NY Fed.

The post Federal Reserve Board backs up e-Literate in criticism of Brookings report on student debt appeared first on e-Literate.

Required Field Validation in Oracle MAF

Shay Shmeltzer - Fri, 2014-08-08 16:38

A short entry to explain how to do field validation in Oracle MAF. As an example, let’s suppose you want a field to have a value before someone clicks a button to perform an operation.

To do that, you can set the field’s required and showRequired attributes like this:

  <amx:inputText label="label1" id="it1" required="true" showRequired="true"/>

Now if you run your page, leave the field empty, and click a button that navigates to another page, you’ll notice that there is no indication of an error. This is because you didn’t tell the AMX page to actually do a validation.

To add validation you use an amx:validationGroup tag that surrounds the fields you want to validate.

For example:

  <amx:validationGroup id="vg1">
    <amx:inputText label="label1" id="it1" required="true" showRequired="true"/>
  </amx:validationGroup>

Then you can add an amx:validationBehavior tag to the button that performs the navigation, telling it to validate the group you defined before (vg1 in our example).

  <amx:commandButton id="cb2" text="go" action="gothere">
    <amx:validationBehavior id="vb1" group="vg1"/>
  </amx:commandButton>

Now when you run the page and try to navigate you'll get your validation error.

Categories: Development

Partner Webcast - The Revolution of Oracle Java 8

Java 8, released in March 2014, is a revolutionary release of the world’s #1 development platform. It is the single largest upgrade ever to the programming model, with coordinated core code evolution...

We share our skills to maximize your revenue!
Categories: DBA Blogs

Social Commerce: Shopping Inside of Social

Linda Fishman Hoyle - Fri, 2014-08-08 13:12

A Guest Post by Mike Stiles, Senior Content Manager for Oracle Social (pictured left)

We know the value of friends recommending products to friends, but are we seeing these motivated transactions conducted immediately on the social platforms themselves? Is social commerce still a thing?

What really seems to matter most is whether or not brand participation on social channels is generating incoming traffic to wherever transactions actually happen. In fact, the very definition of sCommerce has quietly morphed over the years from sales conducted on Facebook to sales resulting from social.

On-Facebook stores are still available, of course. Brands like J.C. Penney, GNC, Levi’s and 1-800-Flowers have done it or are doing it. But the real drive, budget-wise, is to use social to generate traffic and leads as opposed to building social stores. Social budgets are also moving to rounding up leads and sales as opposed to branding. The expectations for pre-sold shoppers to come from social to the brand’s transaction location and make the purchase are high.

And yet…despite a Shopify survey that found Facebook driving almost two-thirds of social visits to Shopify stores and claiming a 129% year-over-year increase in orders from social, and despite the barely known Polyvore driving the top average order value of $66.75, less than 2% of traffic to retailers’ sites comes from social. And almost half of retailers said less than 1% of social shoppers wound up buying anything. The best social conversion rate is Facebook’s, at 1.85%.

So what’s broken? Every hoop a buyer has to jump through is a golden opportunity for that buyer to reconsider, change their mind, or put off the purchase. The shortest, most frictionless path from discovery to reassurance to sale should be every brand’s Apollo mission. And since two of those three things are happening primarily on social, sales inside of social, that original definition of sCommerce, might be worth a solid second look.

The social nets are inching forward. Pinterest, the proclaimed king of purchase intent, has rich pins so prices and inventory can be updated real-time. You can reply to tweets with Amazon product links adding #AmazonCart and throw the item into your shopping cart. You can make AMEX purchases by adding a hashtag. But these things amount to better social catalog experiences or buy link usage, not purchase-inside-social opportunities.

Pictures leaked from Fancy in January gave us a peek at Twitter Commerce. Brand tweets can be expanded to show a Buy button, from which you could purchase the item inside the Twitter app. Now we’re talking. OpenSky is trying to get there as well.

The goal is to capitalize on everything social brings in terms of shopping and exposure to products tied to users’ visible interests, capitalize on the trusted recommendations of social connections, use content as your virtual end-aisle displays, use the ongoing social relationships you have with customers and rich social data to keep bumping them toward a purchase, customize their experiences, and find the quickest way to satisfy the buying impulse when it strikes.

Finding something you want to buy in a store and then being told by the clerk you have to go two buildings down to buy it sounds silly. Digital hoops are equally silly.

Grid/CRS AddNode or runInstaller fails with NullPointerException

Jeremy Schneider - Fri, 2014-08-08 12:43

Posting this here mostly to archive it, so I can find it later if I ever see this problem again.

Today I was repeatedly getting this error while trying to add a node to a cluster:

(grid)$ $ORACLE_HOME/oui/bin/addNode.sh -silent -noCopy CRS_ADDNODE=true CRS_DHCP_ENABLED=false INVENTORY_LOCATION=/u01/oraInventory ORACLE_HOME=$ORACLE_HOME "CLUSTER_NEW_NODES={new-node}" "CLUSTER_NEW_VIRTUAL_HOSTNAMES={new-node-vip}"
Starting Oracle Universal Installer...

Checking swap space: must be greater than 500 MB.   Actual 24575 MB    Passed
Oracle Universal Installer, Version 11.2.0.3.0 Production
Copyright (C) 1999, 2011, Oracle. All rights reserved.

Exception java.lang.NullPointerException occurred..
java.lang.NullPointerException
        at oracle.sysman.oii.oiic.OiicAddNodeSession.initialize(OiicAddNodeSession.java:524)
        at oracle.sysman.oii.oiic.OiicAddNodeSession.<init>(OiicAddNodeSession.java:133)
        at oracle.sysman.oii.oiic.OiicSessionWrapper.createNewSession(OiicSessionWrapper.java:884)
        at oracle.sysman.oii.oiic.OiicSessionWrapper.<init>(OiicSessionWrapper.java:191)
        at oracle.sysman.oii.oiic.OiicInstaller.init(OiicInstaller.java:512)
        at oracle.sysman.oii.oiic.OiicInstaller.runInstaller(OiicInstaller.java:968)
        at oracle.sysman.oii.oiic.OiicInstaller.main(OiicInstaller.java:906)
SEVERE:Abnormal program termination. An internal error has occured. Please provide the following files to Oracle Support :

"Unknown"
"Unknown"
"Unknown"

There were two notes on MOS related to NullPointerExceptions from runInstaller (which addNode uses behind the scenes in 11.2.0.3, the version where I hit this problem). Note 1073878.1 describes addNode failing in 10gR2, where the root cause was that the home containing the CRS binaries was not registered in the central inventory. Note 1511859.1 describes attachHome failing, presumably on 11.2.0.1, where the root cause was file permissions that blocked reading of oraInst.loc.

Based on these two notes, I suspected that my problem had something to do with the inventory. Note that you can list runInstaller options by running “runInstaller -help”, and on 11.2.0.3 you can enable tracing by adding “-debug -logLevel finest” at the end of your addNode command line. The log file is produced in a logs directory under your inventory. In this case, however, it produced absolutely nothing helpful at all…
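For reference, the (in this case unhelpful) debug invocation is just the same addNode command as above with those two flags appended; the log then shows up under the logs directory of the inventory given in INVENTORY_LOCATION, i.e. /u01/oraInventory/logs here:

(grid)$ $ORACLE_HOME/oui/bin/addNode.sh -silent -noCopy CRS_ADDNODE=true CRS_DHCP_ENABLED=false INVENTORY_LOCATION=/u01/oraInventory ORACLE_HOME=$ORACLE_HOME "CLUSTER_NEW_NODES={new-node}" "CLUSTER_NEW_VIRTUAL_HOSTNAMES={new-node-vip}" -debug -logLevel finest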

After quite a bit of work (even running strace and ltrace on the runInstaller, which didn’t help one bit)… I finally figured it out:

(grid)$ grep oraInst $ORACLE_HOME/oui/bin/addNode.sh
INVPTRLOC=$OHOME/oraInst.loc

The addNode script was hardcoded to look only in the ORACLE_HOME for the oraInst.loc file. Because of this INVPTRLOC setting, it would not read the file from /etc or /var/opt/oracle.

On this particular server, there was no oraInst.loc file in the grid ORACLE_HOME. Usually the file is there after a normal cluster installation; in our case, its absence was an artifact of the specific cloning process we use to rapidly provision clusters. As soon as I copied the file from /etc into the grid ORACLE_HOME, the addNode process continued as normal.
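In shell terms, the workaround was a one-liner (this assumes the inventory pointer lives at /etc/oraInst.loc, as it did here; on platforms that keep it in /var/opt/oracle, copy from there instead):

# copy the inventory pointer into the grid home, keeping it readable
# by the grid software owner, then re-run addNode.sh as above
(grid)$ cp /etc/oraInst.loc $ORACLE_HOME/oraInst.loc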

Sometimes it would be nice if runInstaller could give more informative error messages or tracing info!

12c: Fun with WITH!

Pythian Group - Fri, 2014-08-08 11:30

Last night I couldn’t sleep and, well, what else are you going to do? I was thinking about Oracle stuff.

In Oracle version 12, Oracle has enhanced the WITH clause – traditionally used for sub-query factoring – to allow the declaration of functions and procedures. This can be (ab)used to create a very interesting scenario, that is not very common in Oracle: Reading data within the same SELECT statement, but from two different points in time. And the points in time are in the future, and not in the past.

Let’s say I want to take a snapshot of the current SCN, and then another one 5 or 10 seconds after that. Traditionally we’d have to store the first one somewhere. What if I could take two snapshots – at different SCNs – using a single SELECT statement? Without creating any objects?

col value for a50
set lines 200 pages 99

with  
procedure t (secs in number, scn out varchar2)
  is
    pragma autonomous_transaction;
  begin
    dbms_lock.sleep(secs);
    select 'at ' || to_char(sysdate,'HH24:MI:SS') || ' SCN: ' 
                 || dbms_flashback.get_system_change_number 
      into scn 
      from dual;
  end;
function wait_for_it (secs in number) 
 return varchar2 is
    l_ret varchar2(32767);
  begin
    t(secs, l_ret);
    return l_ret;
  end;
select 1 as time, 'at ' || to_char(sysdate,'HH24:MI:SS') || ' SCN: ' 
                || dbms_flashback.get_system_change_number as value 
  from dual
union all
select 5, wait_for_it(5) from dual
union all
select 10, wait_for_it(5) from dual
/

And the result is:

      TIME VALUE
---------- --------------------------------------------------
         1 at 09:55:49 SCN: 3366336
         5 at 09:55:54 SCN: 3366338
        10 at 09:55:59 SCN: 3366339

We can clearly see that the SCN is different each time and that the times shown match the intervals we chose, 5 seconds apart. I think there could be some very interesting uses for this. What ideas can you folks come up with?

Categories: DBA Blogs

We Have Slap Bands

Oracle AppsLab - Fri, 2014-08-08 09:34

As part of a secret project Noel (@noelportugal) and Raymond are cooking up, Noel ordered some AppsLab-branded slap bands.

appslab-slap-band-1

The bands were produced by Amazing Wristbands (@AMZG_Wristbands), and Noel has nothing but good things to say about them, in case you’re looking for your own slap bands.

Anyway, I’m sure we’ll have some left over after the double-secret project. So, if you want one, let us know.

Find the comments.

Transaction guard

Laurent Schneider - Fri, 2014-08-08 08:05

Getting the logical transaction id in 12c will greatly simplify error handling and enhance business continuity in your application.

In 11g and below, your Java code used to look something like this:


try {
  stmt.execute("insert into ...");
} catch (SQLException e) {
  errorHandling(e);
}

but when the statement failed with something like a lost connection (e.g. a database server process core dump), one probably assumed the insert had failed even when it had actually committed.

Now in 12c, you can get a logical transaction id and then later, from another session, check whether that transaction was committed. This solves quite a few integrity issues (e.g. duplicate rows).

Let’s try


import java.sql.*;
import oracle.jdbc.pool.*;
import oracle.jdbc.*;

public class TG {
  public static void main(String argv[]) throws
      SQLException {
    String url = "jdbc:oracle:thin:@(DESCRIPTION"
      +"=(ADDRESS=(PROTOCOL=TCP)(Host=srv01)("
      +"Port=1521))(CONNECT_DATA=(SERVICE_NAME="
      +"svc01)))";
    OracleDataSource ods=new OracleDataSource();
    ods.setURL(url);
    ods.setUser("SCOTT");
    ods.setPassword("tiger");
    OracleConnection conn = (OracleConnection) 
      ods.getConnection();
    // capture the logical transaction id before doing any work
    LogicalTransactionId ltxid = conn.
      getLogicalTransactionId();
    try {
      System.out.println("Start");
      conn.prepareStatement(
        "insert into t values (1)").execute();
      if (Math.random() > .5) {
        throw new Exception();
      }
      System.out.println("OK");
    } catch (Exception e) {
      System.out.println("ERROR");
      // from a second session, ask the database whether
      // the first transaction actually committed
      OracleConnection conn2 = 
        (OracleConnection) ods.getConnection();
      CallableStatement c = conn2.prepareCall(
        "declare b1 boolean; b2 boolean; begin" 
        +"DBMS_APP_CONT.GET_LTXID_OUTCOME(?,b1,"
        +"b2); ? := case when B1 then "
        +"'COMMITTED' else 'UNCOMMITTED' end; "
        +"end;");
      c.setBytes(1, ltxid.getBytes());
      c.registerOutParameter(2, 
        OracleTypes.VARCHAR);
      c.execute();
      System.out.println("Status = "+
        c.getString(2));
    }
  }
}

getLogicalTransactionId gives me a transaction id (this is internally saved in SYS.LTXID_TRANS so it survives reboots, failover and disconnections) and GET_LTXID_OUTCOME gets the outcome.

There are a few preparation steps:


GRANT EXECUTE ON DBMS_APP_CONT TO SCOTT;
-- create and start a service with COMMIT_OUTCOME=true,
-- which is what enables Transaction Guard
declare PARAMETER_ARRAY dbms_service.
  svc_parameter_array; 
begin 
  PARAMETER_ARRAY('COMMIT_OUTCOME'):='true';
  dbms_service.create_service(
    'SVC01','TNS01',PARAMETER_ARRAY); 
  dbms_service.start_service('SVC01'); 
end;
/
CREATE TABLE SCOTT.T(x number);

Due to my Math.random() call, I sometimes get exceptions, but the insert always commits:


C:\> java TG
Start
OK

C:\> java TG
Start
ERROR
Status = COMMITTED

C:\> java TG
Start
ERROR
Status = COMMITTED

No need to redo the insert.

Now I drop the table t and run the same code:


SQL> drop table scott.t;

Table dropped.

C:\>java TG
Start
ERROR
Status = UNCOMMITTED

Now it fails and I know it!

Log Buffer #383, A Carnival of the Vanities for DBAs

Pythian Group - Fri, 2014-08-08 07:34

This Log Buffer Edition picks a few informative blog posts from the Oracle, SQL Server, and MySQL database worlds.


Oracle:

g1gc logs – Ergonomics -how to print and how to understand

In Solaris 11.2, svcs gained a new option, “-L”.  The -L option allows a user to easily look at the most recent log events for a service.

ADF Thematic Map component from DVT library was updated in ADF 12c with marker zoom option and area layer styling

When cloning pluggable databases Oracle gives you also SNAPSHOT COPY clause to utilize storage system snapshot capabilities to save on storage space.

It is normal for bloggers including myself to post about the great things they have done.

SQL Server:

In six years, Microsoft has gone from almost zero corporate knowledge about how cloud computing works to making it an integral part of its strategy.

A brief overview of Columnstore index and its usage with an example.

The Road To Hell – new article from the DBA Team

Encryption brings data into a state which cannot be interpreted by anyone who does not have access to the decryption key, password, or certificates.

How to test what a SQL Server application would do in the past or in the future with date and time differences.

MySQL:

MySQL for Visual Studio 1.2.3 GA has been released

An approach to MySQL dynamic cross-reference query.

The MySQL replication and load balancing plugin for PHP, PECL/mysqlnd_ms, aims to make using a cluster of MySQL servers instead of a single server as transparent as possible.

Picking the Right Clustering for MySQL: Cloud-only Services or Flexible Tungsten Clusters? New webinar-on-demand.

Collation options for new MySQL schemas and tables created in MySQL for Excel

Categories: DBA Blogs

Oracle Database RAC Diagnostics and Tuning

Oracle Real Application Clusters (Oracle RAC) is a clustered version of Oracle Database based on a comprehensive high-availability stack that can be used as the foundation of a database cloud system...

We share our skills to maximize your revenue!
Categories: DBA Blogs

Oracle Priority Service Infogram for 07-AUG-2014

Oracle Infogram - Thu, 2014-08-07 15:05

OpenWorld
It’s closing on us fast! From Proactive Support - Portals: Learn, Connect and Explore at Oracle OpenWorld 2014
Security
Five Computer Security Myths, Debunked by Experts, from LifeHacker.
A new white paper: Mobile Security in a BYOD World.
Oracle issues a virtual strongbox for enterprise encryption keys, from PCWorld.
Neowin is the bearer of good news: CryptoLocker malware has been cracked, victims able to decrypt their files for free.
RDBMS
From Oracle DB/EM Support: Master Note for ORA-1555 Errors.
SQL
From Galo Balda's Blog: New in Oracle 12c: Querying an Associative Array in PL/SQL Programs.
Solaris
From OSTATIC: Oracle Delivers Solaris 11.2 with OpenStack, Integrated SDN Features.
HA-LDOM live migration in Oracle Solaris Cluster 4.2, from the Oracle Solaris Cluster Oasis.
Java
From The Java Source: Tech Article: Generics: How They Work and Why They Are Important.
From DZone: Using Oracle AQ in Java Won’t Get Any Easier Than This.
EPM
Infogram contributor Yin-Jen Su pointed out this great YouTube channel: Oracle EPM Products YouTube channel.
Here’s the announcement from Jan Greenburg:
I’m pleased to announce 3 new videos on our YouTube Channel (https://www.youtube.com/user/OracleEPMWebcasts/videos)!
For Oracle Planning and Budgeting Cloud Service (PBCS):
Using Predictive Planning -- demonstrates generating statistical predictions based on historical data.
Managing Database Properties -- demonstrates accessing database properties.
For on-premise EPM:
Four-part series on deploying EPM System Products:
Part 1 Overview -- demonstrates the standard deployment methodology. It contains links to parts 2, 3 and 4.
Part 2 Preparing for Deployment -- demonstrates preparing for standard deployment.
Part 3 Installing and Configuring an Initial Instance -- demonstrates installing and configuring an initial instance.
Part 4 Scaling Out and Installing EPM System Clients -- demonstrates scaling EPM System components and installing EPM System client software.
Fyi … in addition to accessing videos from our YouTube channel, you can also access our videos from these Oracle Learning Libraries (OLL):
EPM Consolidation and Planning Videos Oracle Learning Library (on-premise videos): https://apex.oracle.com/pls/apex/f?p=44785:141:25017935607895::NO:141:P141_PAGE_ID%2CP141_SECTION_ID:133%2C959
Oracle Planning and Budgeting Cloud Service Library: https://apex.oracle.com/pls/apex/f?p=44785:141:108373392382468::NO:141:P141_PAGE_ID%2CP141_SECTION_ID:91%2C658
OLL provides social networking capabilities that allow you to bookmark, share items through social media, review items, recommend items and create collections that can be private or public.
Oracle Community
Lots of goings-on at Oracle Community.
EBS
From the Oracle E-Business Suite Support Blog:
Introducing the Trading Community Architecture APIs Information Center
Value Chain Planning, Advanced Supply Chain Planning, and Inventory Optimization Safety Stock
Use the Item Open Interface to Quickly Add or Update Inventory Items
So How is Everyone Doing Submitting Enhancements in the Procurement Community?
StartUP Demantra, Configuration & Troubleshooting, Steps, Tips & Tricks
Oracle Application Management Pack for Oracle E-Business Suite (AMP) Release 12.1.0.3.0 is Available
Overview of Inventory Transaction Accounting, Part 1 of 3
New Upgrade, Patching & Maintenance Advisors for R12.2.x
Guided Resolution Now Available for Cancel or Discard Issues!
What's New in the My Oracle Support Community (MOSC)
From Oracle E-Business Suite Technology:
JRE 1.7.0_67 Certified with Oracle E-Business Suite
New Solaris SPARC OS Requirements for EBS R12
Business
10 Things Speakers Should Never Do, from collaborate.
…and Finally

An Oracle that was not us: The Oak Ridge Automatic Computer and Logical Engine (ORACLE), Oak Ridge National Laboratory, 1953, from Adafruit.