Feed aggregator

Oracle VPN

Peter O'Brien - Sat, 2009-12-26 17:31
Virtual Private Networks have made remote working and access to critical corporate resources much easier over the past few years. Much of the way we work today depends on being able to get at data remotely, yet in a secure manner. A simple way to picture a VPN: it creates a secure tunnel between two points and transfers data through it.

So what do you do when you need to get some work done, but you've got your corporate laptop safely and securely locked in a drawer in the office?

There is a website for Oracle employees that has software downloads, remote access tools, patches and so on. It is called Oracle Desktop External Support and can be accessed over the internet without the corporate VPN. Almost all downloads on the site are suitable for installation and are licensed for use on Oracle employees' personal systems. I have the site in my synchronised bookmarks and it comes in very handy.

Happy Xmas to everybody !!!!!

Bas Klaassen - Fri, 2009-12-25 05:40
Categories: APPS Blogs

Making Merge, Revert and Branching Easier with SVN

Susan Duncan - Wed, 2009-12-16 06:04
In the latest release of JDeveloper we've added new declarative UIs to help simplify those most crucial and complex of SVN tasks - merging and branching. These SVN operations can be tricky: they involve picking the correct location in your repository and the correct revision, and mistakes can be costly and time-consuming to fix.

Take a look at the wizard below. This is available whether you choose Merge Working Copy or want to merge a single file. My top tip is that you should almost always use Merge Working Copy, especially if you are working with ADF (Application Development Framework): many of the XML metadata files in your application have dependencies on Java files, and ending up with missed or out-of-sync files on any sort of merge can lead to problems.

In previous releases we simply gave you the ability to enter your URLs and revisions, almost as you would if using the command line to access SVN. But in JDeveloper we pride ourselves on providing declarative help for developers. This wizard is invoked from a checked out application context - so you are always working within your working copy.

In this first step you choose the type of merge you are undertaking, for instance Merge Selected Revision Range is used if you want to update a branch with changes from the trunk. Let's take a common scenario: User 1 is working on Branch 123_susan. She knows that primarily her work will not cause any conflict with other work on other branches or the trunk so she has not worried too much about merging the latest trunk code into her branch. However, she comes across a problem and asks User 2 to fix something in the trunk so she can progress. She is told the revision that contains the specific fix she needs so she uses Merge Selected Revision Range to get that revision into her branch. The dialog below shows the path to trunk/blues as the source and her working copy - her branch - is the destination for the merge.

The revision to be merged - 275 - was picked using the List Revisions dialog. This dialog is used whenever revisions need to be browsed or selected. Notice that selecting 275 in the top panel shows the details of that revision - the comments, the files, the action and the author.
Now the user can move to step 3 of the wizard, choose other merge options, carry out a test merge to check her work and complete the merge of that one revision into her working copy. Finally, she will commit her working copy back to its branch on the repository as normal.

Once she has completed the work on her branch she will be ready to use the Re-integrate A Branch merge to get it back into the trunk. As always, the way to do this is to carry out any merge into a local working copy and then commit. So the first step is to do a final commit of her branch. At this stage she can remove her branched working copy, as she is finished with it. She checks out the latest trunk revision, and now she is ready to merge her branch into her new (trunk) working copy. Let's look at that another way:

1. A branch is created from trunk
2. User 1 checks out working copy from branch and edits code
3. Commits back to branch periodically
4. User 2 commits revision 200 to trunk and User 1 merges that revision to the working copy
5. Finishes coding working copy and commits to branch
6. Checks out new working copy from trunk
7. Merges branch into working copy
8. Commits working copy back to trunk

Before Subversion 1.5 the merge of a single revision in step 4 would have caused problems when the final merge of the branch back to the trunk was done (step 7). But SVN now has merge tracking: it records which revisions have already been merged, so the final merge of all revisions goes ahead without error.
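A toy model shows what merge tracking buys you in steps 4 and 7 above. This is only a conceptual sketch, not SVN's actual implementation (real SVN records merged revisions in the svn:mergeinfo property); the branch and revision numbers are the ones from the scenario.

```python
# Conceptual sketch of merge tracking: each target remembers which
# source revisions have already been merged into it, so a later
# "merge everything" merge skips the duplicates.

class MergeTracker:
    def __init__(self):
        self.merged = {}  # target path -> set of merged revision numbers

    def merge(self, target, revisions):
        """Merge the given revisions into target, skipping any that
        merge tracking says were already applied. Returns the list of
        revisions actually merged."""
        done = self.merged.setdefault(target, set())
        todo = [r for r in revisions if r not in done]
        done.update(todo)
        return todo

tracker = MergeTracker()
# Step 4: cherry-pick revision 200 from trunk into the branch working copy
print(tracker.merge("branches/123_susan", [200]))            # [200]
# Later: merging a span of trunk changes -- 200 is skipped, not re-merged
print(tracker.merge("branches/123_susan", [180, 200, 210]))  # [180, 210]
```

Without the tracked `merged` set, revision 200 would be applied twice, which is exactly the pre-1.5 failure mode described above.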

The third merge option is Merge Two Different Trees. This allows you to compare two different branches and merge any differences back into your working copy. This is the most complicated and, I would say, the least used of the merges.

One final tip - what if you want to revert your working copy to a specific revision? Use the Merge Selected Revision Range option. In List Revisions select the range of revisions from the latest back to the revision you want to revert to and click OK. This will enter the range, but the range list will be ascending - e.g. 258-359. To revert to revision 258 simply reverse the range order and hey presto! a revert back rather than a merge up will be done - try it out in the test merge panel to check it - and you're done!
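Why reversing the range turns a merge into a revert can be seen with a toy model. Numeric deltas stand in for diffs here purely for illustration (real SVN applies inverse textual diffs, not numbers); the revision numbers are invented.

```python
# Toy model of forward vs reverse merges: each revision is modelled as
# a delta, and a reverse merge applies the inverse of each delta.

def merge_range(state, deltas, start, end):
    """Apply revisions start..end to state. If start > end, the range
    is reversed and each delta is applied inverted, i.e. a revert."""
    if start <= end:
        revs, sign = range(start, end + 1), 1
    else:
        revs, sign = range(start, end - 1, -1), -1
    for r in revs:
        state += sign * deltas[r]
    return state

deltas = {259: 5, 260: -2, 261: 7}   # changes made by revisions 259..261
state = 100                          # file contents at revision 258
state = merge_range(state, deltas, 259, 261)  # merge up: 110
state = merge_range(state, deltas, 261, 259)  # reversed range: back to 100
print(state)  # 100
```

The same command, with the endpoints swapped, undoes exactly the revisions it would otherwise apply - which is why the test merge panel shows a revert rather than a merge.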

Connecting to DB2 using JDeveloper

Peeyush Tugnawat - Tue, 2009-12-15 13:53


Get the following Jar files from the DB2 server and store them in a local folder.

Then follow these steps:

Browse to where you have these three jar files.

Other References:

Working with Versioned Database Models

Susan Duncan - Tue, 2009-12-15 08:47
What? 3 posts in 2 days? After months of silence? Isn't it amazing how time flies. Here in the JDeveloper Product Management group we've been busy all year with releases and conferences and many opportunities to get out and talk to people about the great features in our tool. But as the year draws nearer to a close and all the traveling is over I've finally got time to get out on my blog some of the things that I've been evangelizing about all year. As you'll have noticed, one of those is the great new functionality that we've introduced in database modeling.

Periodically I'm asked why we don't introduce a database repository to store database models. One that can be queried and can store versions of the modeled objects? We've been down that road (those of you who've been around for a while know what I'm talking about!) but think that what we have now in JDeveloper is much more flexible. What is it that you want from a repository? Amongst the most important answers to that question is the ability to store multiple versions of your database objects, to query and to compare them.

In previous posts I've demonstrated how you can use our Database Reporting to query your database models and output the results. In this post I want to show you how you can use JDeveloper's integrated versioning system capabilities not only to maintain multiple versions of your database model but how you can resolve any conflicts that arise when multiple users update the model, using a declarative interface.

In this example I am using Subversion (SVN), one of the versioning systems integrated in JDeveloper. It is an open source system, widely used in the application development world. I am not going to step through a complete versioning process in this post; there are various tutorials, how-tos, demos and white papers available on OTN, and if you want more information on SVN there is a very good online book.

Here is a part of my application. Note that it contains an offline database model (Blues) and one schema (GOTTHEBLUES) containing a number of database objects. This is a project that I'm currently working on. It is stored in an SVN repository - each of the objects has a revision number next to it. Notice that the PERSONS table is at revision 378. In SVN the repository revision number is advanced each time a check-in of files is done. So this does not mean that the PERSONS table has been checked in 378 times; it means that the last time PERSONS was checked in, the repository moved to revision 378. Likewise the last revision of PUBLICATIONS is 258, so PUBLICATIONS has not been changed since revision 258.

If I open PERSONS from the navigator - I get the declarative UI that allows me to edit that object. But that is not how the information is stored. JDeveloper stores its database models by object - in XML. Below is an extract of what you would see if you opened the underlying file PERSONS.table in a browser.

It's not impossible to read - the extract shows the detail for column LAST_NAME, a VARCHAR2 of length 25, with a comment that it was created via a DB Transform (from a class model). However, the declarative UI abstracts you from the raw XML.

But imagine if you needed to compare two versions of the PERSONS table - or you had two users who were both working on the PERSONS table and checked their changes in to the SVN repository? Comparing the raw XML files is possible, and JDeveloper does recognize XML formatting, but in the latest version of JDeveloper there is a new declarative UI for that too.

Imagine this scenario: two users have checked out the latest version of the application. SVN uses a copy-modify-merge versioning approach. This means that when an application is checked out, a copy of that application is created on the user's local machine. SVN does not keep a record of who has checked out copies, or where. It is only interested when something is committed back to the repository. So, as a user I can check out a working copy, make changes to it and, if I never commit those changes back to the repository, so be it. I could do some 'what if' type coding and then decide to discard the whole copy.

In my scenario the two users have been discussing the PERSONS table and agree that the length of the LAST_NAME column is too short at 25. Unfortunately, they both decided to modify the column length in their working copy. User 1 happens to be the first to commit her changes back to the repository - so it now has LAST_NAME with a length of 50.

Now user 2 modifies her working copy, editing LAST_NAME to length 40. As good practice dictates, she Updates her working copy with the latest revision from the repository - so she can resolve any code conflicts in her working copy prior to merging her copy back in.

In this case she finds that she has a conflict between her code and the repository that JDeveloper cannot resolve automatically. She sees this in her Application Navigator: it shows the PERSONS table with a conflict overlay, and the differing versions of the table are also listed (see below).

But help is at hand. Double-clicking on PERSONS 379 opens the three-panel Merge window. On the left is her local version of PERSONS - with length 40, on the right is the repository version with length 50. After reviewing this she has shuffled her version into the middle Result of Merge panel. As that was the only conflict in the two files the Save and Complete Merge icon is enabled in the toolbar.
Once she saves this merge and refreshes the Application Navigator the extra conflict files will disappear and the updates will be applied to her working copy. Now she is ready to commit her working copy back to the repository so that it reflects the latest code position.

Not all changes made by multiple users cause conflicts. If the changes are complementary - for instance one user adds a new column LOCATION and another user changes the type of PERSONS_TYPE - these changes will be added to the repository, as SVN and JDeveloper recognize that there is no conflict. This is how SVN works by default: copy-modify-merge in action. Many systems work this way; others use the lock-modify-unlock paradigm.
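The rule that decides which changes merge cleanly and which conflict is the classic three-way comparison against the common ancestor. The sketch below illustrates that general idea on a flat dictionary of table attributes; it is not JDeveloper's actual merge algorithm, and the attribute keys are invented for the example.

```python
# Three-way merge of attribute dicts: base = common ancestor,
# mine = local working copy, theirs = repository HEAD.

def three_way_merge(base, mine, theirs):
    merged, conflicts = {}, []
    for key in set(base) | set(mine) | set(theirs):
        b, m, t = base.get(key), mine.get(key), theirs.get(key)
        if m == t:           # both agree (or neither changed)
            merged[key] = m
        elif m == b:         # only theirs changed: take theirs
            merged[key] = t
        elif t == b:         # only mine changed: take mine
            merged[key] = m
        else:                # both changed differently: conflict
            conflicts.append(key)
    return merged, conflicts

base   = {"LAST_NAME.length": 25, "PERSONS_TYPE.type": "VARCHAR2"}
mine   = {"LAST_NAME.length": 40, "PERSONS_TYPE.type": "VARCHAR2",
          "LOCATION.length": 30}                  # added a column
theirs = {"LAST_NAME.length": 50, "PERSONS_TYPE.type": "NUMBER"}

merged, conflicts = three_way_merge(base, mine, theirs)
print(conflicts)  # ['LAST_NAME.length'] -- must be resolved by hand
print(merged)     # complementary changes merged automatically
```

Both users changed LAST_NAME's length to different values, so only that attribute surfaces as a conflict for the Merge window; the new LOCATION column and the PERSONS_TYPE change go through silently.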

If you want to review changes to a database object prior to updating your working copy, you can use one of the Compare options (Compare with Latest, with Previous, with Other Revision). In the example below, user 1 has used Compare with Latest to check her changes: a change to the size of FIRST_NAME and a new column COUNTRY, shown in the right-hand panel, against the latest version in the repository in the left-hand panel.

Finally, in this post, a short list of other tips:
  • When using Compare with Other Revision, don't forget to scroll to the right - you'll see not only revision numbers but the commit notes. Also use the filters to optimize the revisions listed

  • If you make changes to a file but want to revert back to the version you checked out from the repository - use menu Versioning - Revert

  • Use the graphical Version Tree to review branches and versions of your objects

Visualizing the DB Reporting Metamodel

Susan Duncan - Tue, 2009-12-15 05:31
In my last post I gave examples of using both the pre-built reports and building custom reports on your database models.
As a P.S. to that post I want to point you to the JDeveloper Help system that comes with diagrams to help you define and understand the Database Reporting metamodel. Below is the JDEV table of contents - open at Working with Database Reports

and one small part of the Tables Report Metamodel to give you the idea. Have fun!

Revisiting JDBC Connection Pooling and Spring Data Source implementation

Marcelo Ochoa - Mon, 2009-12-14 18:11

Today, I have revisited the Oracle Implementation of OCI Connection Pooling.
Starting from my previous post

Using native Oracle Data Source and connection caching/pooling from Spring

I have updated the example files.
The idea behind this tip is the same: do the connection pooling implementation using the JDBC driver, which is faster and safer.
If you are using an older database version, for example 10g, you can try installing the Instant Client at the middle tier; it is available for multiple platforms, including Mac OS X.
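For readers unfamiliar with what a connection pool does, here is a language-neutral sketch of the idea (this is not Oracle's OCI/JDBC pool; `create_connection` is a hypothetical stand-in for whatever driver call opens a real connection):

```python
# Minimal fixed-size connection pool: connections are opened once,
# then handed out and returned, instead of opened per request.

import queue

class ConnectionPool:
    def __init__(self, create_connection, size):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):              # open all connections up front
            self._pool.put(create_connection())

    def acquire(self, timeout=None):
        return self._pool.get(timeout=timeout)   # blocks when exhausted

    def release(self, conn):
        self._pool.put(conn)

# Hypothetical factory standing in for a real driver connect() call
counter = {"opened": 0}
def create_connection():
    counter["opened"] += 1
    return object()

pool = ConnectionPool(create_connection, size=3)
conn = pool.acquire()
pool.release(conn)
conn = pool.acquire()      # reused: no new connection was opened
print(counter["opened"])   # 3 -- only the initial connections exist
```

The point of pooling at the driver level is the same: pay the connection-setup cost once and reuse the connections, rather than reopening them for every request.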
HTH, Marcelo.

Using Database Reporting to Ensure Standards Compliance

Susan Duncan - Mon, 2009-12-14 05:30
In JDeveloper 11gR1 we introduced a SQL-like interface for reporting on your database models. Watch this demo to review the basic functionality. In the latest release we've added a set of pre-built reports. These reports not only get you started with reporting but also illustrate how you can use database reporting to ensure that the objects in your offline database model comply with your team's naming standards.

I say SQL-like interface because the database model you develop offline in JDeveloper is stored in XML. However, we took the decision that as most database developers are familiar with SQL it made perfect sense to provide a familiar way to interrogate the model. For instance, the pre-built report to list all tables with no Primary Key is:

T.SCHEMA '.' T.NAME "Table"
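The offline metamodel that query runs against isn't available outside JDeveloper, but the same no-primary-key check can be sketched against a live catalog. Here is a minimal sqlite3 version (the table names are made up for the example; sqlite3 stands in for the metamodel purely as an analogy):

```python
# List tables that lack a primary key, by inspecting the catalog.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE persons (id INTEGER PRIMARY KEY, last_name TEXT);
    CREATE TABLE publications (title TEXT, year INTEGER);  -- no PK
""")

def tables_without_pk(conn):
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    # PRAGMA table_info: column 5 (pk) is non-zero for PK columns
    return [t for t in tables
            if not any(col[5] for col in
                       conn.execute(f"PRAGMA table_info({t})"))]

print(tables_without_pk(conn))  # ['publications']
```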

As you get more familiar with the meta-model you might write your own SQL queries, but there is also a declarative UI to help you traverse the model and develop the query you want.
Below is an image of the wizard, showing how the FROM clause for the above query is selected. You can also select the objects, apply different JOINs, add a WHERE, GROUP BY and aliases using the declarative UI.

Here is an example of a more complex SQL query - to report on any tables that do not have the audit column CREATION_DATE*
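The audit-column report itself isn't reproduced above, but the idea - flag any table missing a required column - can be sketched the same way. Again sqlite3 stands in for the offline metamodel, and the table names are invented:

```python
# Flag tables that are missing a mandatory audit column.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, creation_date TEXT);
    CREATE TABLE customers (id INTEGER);   -- audit column missing
""")

def tables_missing_column(conn, column):
    missing = []
    for (table,) in conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'"):
        cols = {row[1].upper() for row in
                conn.execute(f"PRAGMA table_info({table})")}
        if column.upper() not in cols:
            missing.append(table)
    return missing

print(tables_missing_column(conn, "CREATION_DATE"))  # ['customers']
```

A standards project would hold one such check per naming or auditing rule, run against every team's model.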


Now the world is your reporting oyster - how about creating a series of reports in a Standards Project - and distributing that project to every team that is doing database development? Here is one approach:
  1. Check the Standards Project out of source control into the required application
  2. In the Standards Project open the Project Properties, in Project Source Paths select the Offline Database node
  3. Add the projects containing your database model project(s) - see image below

  4. Open each report (double-click) and select the offline database you want to run against
  5. Go to the Publish tab and enter the details of how and where you want the report to be saved. In the example below I've chosen to save the report in the project that contains the database - so my audit reports can be shown to represent my offline DB model.
  6. Use the context menu of the report format to Run the report. In my example the HTML version of the report (AuditTableCheck) is now stored in Project1 along with the database, the report format (Check_Audit_Tables) is stored in the Standards project along with the other report formats.

Let me know how you get on - it would be good to build up a list of reports written by you that could be shared with others.

*For a full example of using this in a tutorial see this Oracle By Example

This blog was migrated to http://blogs.oracle.com/soabpm

Clemens Utschig - Sun, 2009-12-13 12:04
After a while of not blogging I have started again - this time under the Oracle flag - at blogs.oracle.com/soabpm.

Hope to see you there.

RAC, ASM and Linux Forum, December 15, 13:30 - 18:00 Beit HP Raanana

Alejandro Vargas - Sun, 2009-12-13 03:04

It's time for our second RAC, ASM and Linux Forum of 2009, in Tel Aviv!

I would like to warmly invite you to our next RAC, ASM and Linux forum to be held at Beit HP in Raanana, on December 15.

You can register on the Israel Oracle User Group site.

At this forum we will have an 11g R2 technology update and two very interesting customer presentations: minimum-downtime VLDB migration to RAC on Linux, and creating and managing RPMs for Oracle silent installs on Linux.

Panel on Grid Infrastructure, ASM, Clusterware and RAC 11g R2, Technology Update

Annie Flint, Principal Member of Technical Staff, Oracle Corporation
Ofir Manor, Senior Sales Consultant, Oracle Israel
Alejandro Vargas, Principal Support Consultant, Oracle Advanced Customer Services

In the few months since the last forum in June many things have happened: 11g Release 2 is already in production, bringing a revolution in terms of performance and deep changes to the world of ASM, Oracle Clusterware and RAC.

Exadata Release 2 was released, opening the way for OLTP databases based on the new Oracle-Sun Database Machine.

In this seminar we will review the new face of ASM, Oracle Clusterware and RAC on 11g Release 2 and we will comment on some of the incredible performance enhancements of the new version.

Migration of a VLDB to RAC 11g with Minimal Downtime

Dotan Mor,
Senior DBA
Pelephone DBA Team

Dotan will tell us the whole story of migrating an 8TB data warehouse, with near-zero downtime, from Linux 3 with OCFS2 to Linux 5 with CRS, RAC, ASM 11g and InfiniBand; and how his customer still cannot believe the incredible improvements they got in performance.

He will also tell us about the several problems faced on the way to this big success and how all of them were overcome.

Packaging Application and Database Together On Linux for Super-Silent-Installs

Liron Amitzi,
VP Professional Services

Liron will explain to us how to build a Linux RPM that contains the whole set of files required to deploy a complete application quickly and easily, from the database to the last application executable.

See you there!

Best Regards,


Categories: DBA Blogs

Check IO Scripts

Alejandro Vargas - Wed, 2009-12-09 19:32

These scripts are very useful to check throughput.

The original version can be found on My Oracle Support Note 422414.1 by Luca Canali.

set lines 250 pages 50000

alter session set nls_date_format='dd-mm-yyyy hh24:mi';

col Phys_Read_Total_Bps for 999999999999
col Phys_Write_Total_Bps for 999999999999
col Redo_Bytes_per_sec for 999999999999
col Phys_Read_IOPS for 999999999999
col Phys_write_IOPS for 999999999999
col Phys_redo_IOPS for 999999999999
col OS_LOad for 999999999999
col DB_CPU_Usage_per_sec for 999999999999
col Host_CPU_util for 999999999999
col Network_bytes_per_sec for 999999999999
col Phys_IO_Tot_MBps for 999999999999
col Phys_IOPS_Tot for 999999999999

spool io_max_checkup.log

select min(begin_time), max(end_time),
sum(case metric_name when 'Physical Read Total Bytes Per Sec' then maxval end) Phys_Read_Tot_Bps,
sum(case metric_name when 'Physical Write Total Bytes Per Sec' then maxval end) Phys_Write_Tot_Bps,
sum(case metric_name when 'Redo Generated Per Sec' then maxval end) Redo_Bytes_per_sec,
sum(case metric_name when 'Physical Read Total IO Requests Per Sec' then maxval end) Phys_Read_IOPS,
sum(case metric_name when 'Physical Write Total IO Requests Per Sec' then maxval end) Phys_write_IOPS,
sum(case metric_name when 'Redo Writes Per Sec' then maxval end) Phys_redo_IOPS,
sum(case metric_name when 'Current OS Load' then maxval end) OS_LOad,
sum(case metric_name when 'CPU Usage Per Sec' then maxval end) DB_CPU_Usage_per_sec,
sum(case metric_name when 'Host CPU Utilization (%)' then maxval end) Host_CPU_util, --NOTE 100% = 1 loaded RAC node
sum(case metric_name when 'Network Traffic Volume Per Sec' then maxval end) Network_bytes_per_sec
from dba_hist_sysmetric_summary
group by snap_id
order by snap_id;

spool off

spool io_maxtot_summary.log

select min(begin_time), max(end_time),
sum(case metric_name when 'Physical Read Total Bytes Per Sec' then maxval end)/1024/1024 +
sum(case metric_name when 'Physical Write Total Bytes Per Sec' then maxval end)/1024/1024 +
sum(case metric_name when 'Redo Generated Per Sec' then maxval end)/1024/1024 Phys_IO_Tot_MBps,
sum(case metric_name when 'Physical Read Total IO Requests Per Sec' then maxval end) +
sum(case metric_name when 'Physical Write Total IO Requests Per Sec' then maxval end) +
sum(case metric_name when 'Redo Writes Per Sec' then maxval end) Phys_IOPS_Tot,
sum(case metric_name when 'Current OS Load' then maxval end) OS_LOad,
sum(case metric_name when 'CPU Usage Per Sec' then maxval end) DB_CPU_Usage_per_sec,
sum(case metric_name when 'Host CPU Utilization (%)' then maxval end) Host_CPU_util, --NOTE 100% = 1 loaded RAC node
sum(case metric_name when 'Network Traffic Volume Per Sec' then maxval end) Network_bytes_per_sec
from dba_hist_sysmetric_summary
group by snap_id
order by snap_id;

spool off

Categories: DBA Blogs

Using OCFS2 as a Generic Cluster File System?

Sergio's Blog - Wed, 2009-12-09 00:56
If you use OCFS2 for purposes other than running Oracle software, please leave a comment or drop me a line: sergio[dot]leunissen[at]oracle[dot]com. I'm curious how you use OCFS2, how big your filesystems are, how many nodes are in the cluster, what you like about OCFS2, etc.
Categories: DBA Blogs

Oracle Advanced Compression Advisor

Fairlie Rego - Sat, 2009-12-05 20:28
I am currently working on how best to use compression techniques in an OLTP environment on a 10g database which is due to be migrated to 11.2 in the next six months, and I came across this OTN note.


The following statement got me very interested

This package can be used on Oracle Databases running Oracle Database 9i Release 2 through 11g Release 1.
A compression advisor (DBMS_COMPRESSION) comes packaged with Oracle Database 11g Release 2.

So let's give it a try...

I downloaded the package and compiled it..

SQL> @dbmscomp.sql

Package created.

Synonym created.

Grant succeeded.

No errors.

SQL> @prvtcomp.plb

Package body created.

Synonym created.

Grant succeeded.

No errors.

SQL> set serveroutput on

SQL> exec dbms_comp_advisor.getratio('OWNER1','CROSS','OLTP',1);

Invalid Compression option for the current COMPAT setting

Thankfully the package body is not encrypted and a quick look at the code reveals that it does the following

SQL> create table DBMS_COMPRESSION_TEMP_UNCMP as select /*+ full('CROSS') */ * from cross sample block( 1);

Table created.

SQL> create table DBMS_COMPRESSION_TEMP_CMP compress for all operations as select * from DBMS_COMPRESSION_TEMP_UNCMP;
create table DBMS_COMPRESSION_TEMP_CMP compress for all operations as select * from DBMS_COMPRESSION_TEMP_UNCMP
ERROR at line 1:
ORA-00922: missing or invalid option

Hmmm... not sure how they expect "compress for all operations" to work in a pre-11g database...

How to run eBusiness Suite R12 using IE8

Brenden Anstey - Sat, 2009-12-05 02:40

IE8 is not yet supported for R12, but it does work if you disable a security setting related to Cross-Site Scripting (XSS). This solution works on Vista 64 using IE8 with Release 12.1.

Here is how to do it:
Firstly we need to add the EBS application server to trusted sites within IE as we definitely do want the IE8 XSS filter active for general Internet browsing.
1. In IE8 Select Tools->Internet Options->Security(tab)->Trusted Sites
2. Click Sites and type in the hostname of the server running R12, untick Require HTTPS if needed, and add the website to the list of trusted sites. Hit Close.
3. Still on the Security tab, click the Custom level button and scroll right to the bottom; the third option from the bottom at the time of writing is 'Enable XSS filter' - set it to Disable.

Make sure you only disable the XSS filter for Trusted sites, which should be a small list of intranet servers whose content you trust.

Lifetime Social Value, How Sony Screwed the Pooch

Ken Pulverman - Fri, 2009-12-04 23:08
About a year after I started dating my girlfriend, who is now my fiancée, she needed a new laptop. I helped her do the research and suggested a Sony laptop. Two months after it went out of warranty, the motherboard died. This was a well-known issue at the time and was all over their website. Elaine called Sony, but they wouldn't budge and would only offer to fix the problem for an unreasonable fee of over $600. Laptop prices had gone down in the meantime, and the fee made no economic sense.

I felt horrible about it. My family have been lifelong Sony bigots. Our first color TV was a Sony and we've had a strong preference for Sony products ever since, including desktops, laptops, DVD players, stereos, Walkmen, clock radios, etc.

Not only had Elaine bought the laptop I recommended, she had recommended Sony to several other people, some of whom also bought.

Elaine resolved never to buy another Sony product again. This is where the story gets interesting from a marketing standpoint. Every time I suggest buying a new product that happens to be a Sony, I get browbeaten to death. Elaine also tells anyone who will listen that she'll never buy a Sony product again. The result of all this Sony dogma is that I haven't bought another Sony product since Elaine's laptop failed two years ago.

It's a real shame for Sony too. We are both in the information technology industry and to say we are heavy consumers of technology would be an understatement. To give you some insight, Elaine is willingly accompanying me to CES this January for the second time since we've been going out, and both trips were personal and had nothing to do with our jobs. I believe we are the ideal consumers for an electronics company...early adopters who generally get asked by our friends, both male and female, what models of popular electronics they should buy.

We really aren't gamers (except for Elaine's new Nintendo Wii, an early Christmas present). However, tonight I told Elaine that I wanted to get a Sony PS3 because, after comparison shopping, it looks like a very credible Blu-ray player that will be easily upgradable for some time to come. The first words out of Elaine's mouth were: "I am not paying for that." I would never have thought to ask her to contribute to my pursuit of 1080p (Blu-ray resolution), but she went on breathing Sony fire and damnation for several minutes more.

Because Sony had no concept of how connected Elaine was to so many tech consumers, they screwed themselves by not fixing a known defect in Elaine's laptop. She can't help but badger me into not buying their products, which compelled me to write this blog entry about an interesting marketing issue.

This story begins before the heavy use of the type of social media that will now see this story go from my marketing blog to my Twitter account and then on to my Facebook account automatically.

From a new marketing angle, this story made me think of two principles:

1. The Perpetual Profile - Businesses don't spend enough time profiling customers after they buy. We know we want them to buy again, so if they spend a few thousand dollars with us, shouldn't we figure out who they are? Elaine is a Technology Development VP, spent $2,000 on our laptop and lives with Ken, who love(d) Sony. Perpetually profiling people who spend bigger dollars with us would be a necessary step to realize the theoretical lifetime value we all strive to achieve.

2. Poor Product Support Judgement Calls in the Social Era Will Kill Brands - Some unempowered support rep in the bowels of Sony effectively ruined my lifetime relationship with, and value to, Sony - and I am not even the one who had the direct issue.

Both these points speak to the need to get much smarter about how we deal with our customers and increasingly the web of people they influence and those who influence them. The good news is that for the products we like, we will give away this information freely. Everything Sony needed to know to alert them, perhaps electronically, that screwing up this issue would be a costly landmine for them was there for the taking.

Sony, I think you lost your opportunity to be our One and Only, but maybe there's still hope to get your act together and save a few more people like me.

The lesson for most companies is that your installed base matters. In fact, your installed base is the engine of your growth, which is now amplified both positively and negatively in the social era. Fail to identify, or simply ignore, those who have been your biggest or most loyal customers, and you will deeply suffer the consequences. We used to say that those who had a bad experience told 10 others. This may be a more modern case, but I think it is the shape of things to come. So we are going to have to revise the phrase.

People who know people who've had a bad experience tell thousands of others (via social media).

The good news, Sony, is that there is a short window to learn from your mistakes. The reverberating effect of Social Media is well documented, but its impact is really just revving up. Act fast and my second LCD TV may not be another Samsung.

Progress… YES?

V.J. Jain - Fri, 2009-12-04 14:45

After dealing with several challenges during the installation, I have successfully installed Oracle Apps on Oracle Enterprise Linux.  I’ve spent some time on the post-installation checks.

One of my specific challenges was that I already had the Oracle database installed previously. This was causing some conflicts, as the environment was not getting set up correctly.

Another issue that I faced was a problem with libdb.so.2.  This required creating a symbolic link as shown below:

[root@localhost ~]# ls /usr/lib/libdb.so.2
ls: /usr/lib/libdb.so.2: No such file or directory
[root@localhost ~]# ln -s /usr/lib/libgdbm.so.2.0.0 /usr/lib/libdb.so.2
[root@localhost ~]# ls /usr/lib/libdb.so.2

The next issue I faced had to do with creating the Login Page. I kept getting an error…

checking URL = http://localhost.localdomain:8000/OA_HTML/AppsLogin

RW-50016: Error: – {0} was not created:
File = {1}

I got around this by unlocking the Oracle accounts, running autoconfig, and bouncing the services.
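For anyone hitting the same RW-50016, that workaround might look like this on a Vision install. The script names (adautocfg.sh, adstpall.sh, adstrtal.sh) are the standard R12 ones, but the exact accounts to unlock and the passwords vary by install, so treat this as a sketch rather than the author's exact commands:

```shell
# 1. Unlock the locked application database accounts (as the database owner)
sqlplus / as sysdba <<'SQL'
ALTER USER apps ACCOUNT UNLOCK;
ALTER USER applsys ACCOUNT UNLOCK;
SQL

# 2. Re-run AutoConfig on the apps tier
$ADMIN_SCRIPTS_HOME/adautocfg.sh

# 3. Bounce (stop, then start) all application services
$ADMIN_SCRIPTS_HOME/adstpall.sh apps/<apps_password>
$ADMIN_SCRIPTS_HOME/adstrtal.sh apps/<apps_password>
```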

Finally, I was able to see the login page and log in as sysadmin.  But when I went to open one of the Applications forms, I received another error.

FRM-92101: There was a failure in the Forms Server during startup. This could happen due to invalid configuration
Please look into the web-server log file for details

Java Exception
oracle.forms.net.ConnectionException: Forms session failed during startup: no response from runtime process
at oracle.forms.net.ConnectionException.createConnectionException(Unknown Source)
at oracle.forms.net.HTTPNStream.getResponse(Unknown Source)
at oracle.forms.net.HTTPNStream.doFlush(Unknown Source)
at oracle.forms.net.HTTPNStream.flush(Unknown Source)
at java.io.DataOutputStream.flush(Unknown Source)

I looked into the Forms server log file and found the following error:

testMode: false
09/04/09 08:12:17 Oracle Containers for J2EE 10g ( initialized
09/04/09 08:14:27 ListenerServlet init()
09/04/09 08:14:28 Forms session aborted: runtime process failed during startup with
errors /u03/oracle/VIS/apps/tech_st/10.1.2/bin/frmweb: error while loading shared libraries:
/usr/X11R6/lib/libXm.so.2: ELF file data encoding not little-endian

This required installing the openmotif package from Oracle’s download site, http://oss.oracle.com/projects/compat-oracle/files/Enterprise_Linux/. The missing package was openmotif21-2.1.30-11.EL5.i386.rpm.
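No install command is shown for that package, but it would presumably be the usual rpm one-liner, run as root with the filename as downloaded from the site above:

```shell
# Install the compat openmotif package needed by the 10.1.2 Forms runtime
rpm -ivh openmotif21-2.1.30-11.EL5.i386.rpm
```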

After trying again, a similar error appeared:

09/12/04 11:55:22.317 formsweb: Forms session <1> aborted: runtime process failed during startup with errors /u01/oracle/VIS/apps/tech_st/10.1.2/bin/frmweb: error while loading shared libraries: libXp.so.6: cannot open shared object file: No such file or directory

Metalink says the following about the libXp package:

The i386 version of the libXp package is missing. The Oracle Universal Installer (OUI) requires the file libXp.so.6, and this file is installed with the libXp package in RHEL 5 and OEL 5.

Issue the following to confirm the missing rpm:

% rpm -qa --queryformat "%{NAME}-%{VERSION}-%{RELEASE} (%{ARCH})\n" | grep libXp

The libXp package is not installed in RHEL 5/OEL 5 as part of the default RPM set, so it has to be installed manually. In RHEL 4 this file was provided by the xorg-x11-deprecated-libs package, which is part of the default installation; in RHEL 5/OEL 5 it has moved to a separate rpm, libXp.

Install the package libXp by the command:

# rpm -ivh </path/to/>libXp.<version>.i386.rpm

Unfortunately, I did not have the DVD in front of me, so I found the RPM on http://rpm.pbone.net

[root@localhost tmp]# rpm -ivh libXp-1.0.0-8.1.el5.i386.rpm --nodeps
warning: libXp-1.0.0-8.1.el5.i386.rpm: Header V3 DSA signature: NOKEY, key ID a7048f8d
Preparing...                ########################################### [100%]
   1:libXp                  ########################################### [100%]
[root@localhost tmp]# rpm -qa --queryformat "%{NAME}-%{VERSION}-%{RELEASE} (%{ARCH})\n" | grep libXp
libXp-1.0.0-8.1.el5 (i386)

After installing libXp, my Apps forms finally opened.

At last, I have my Vision instance up and running.

Login problems R12

Bas Klaassen - Fri, 2009-12-04 06:39
On our R12 eBS environment we are facing a problem when logging in. It does not happen all the time, because we use more than one web node (load balancer), but when trying to access the login page the following error is shown: "Unable to generate forwarding URL. Exception: oracle.apps.fnd.cache.CacheException", or a blank page is shown instead of the login page. In the applications.log file ($
Bas Klaassen
Categories: APPS Blogs
