Feed aggregator

Auditing on FND_FLEX_VALUES: How to see Audit History

Jo Davis - Tue, 2007-01-09 21:23
To keep an audit trail of the mapping of segment values to external values (such as for extracts to other systems), and of changes to it:
1) Define a DFF on FND_FLEX_VALUES to contain the mapping values
2) Enable auditing on FND_FLEX_VALUES
3) To query the audit data, start with this (FND_FLEX_VALUES_AC1 is a view over the audit table):

select *
from fnd_flex_values_ac1
where attribute1 is not null;
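As a sketch, a slightly fuller query against the audit view might look like this. The AUDIT_* column names are assumptions based on how audit shadow tables are usually generated; run DESC FND_FLEX_VALUES_AC1 to see the real ones.

```sql
-- Hypothetical column names; verify against your generated audit view
select flex_value_id,
       attribute1,                 -- the DFF mapping value
       audit_timestamp,
       audit_user_name,
       audit_transaction_type     -- I = insert, U = update, D = delete
  from fnd_flex_values_ac1
 where attribute1 is not null
 order by audit_timestamp;
```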

How to Migrate Personalizations from TEST to PROD on Release

Jo Davis - Tue, 2007-01-09 20:35
This applies to iProcurement (11.5.10 only), iExpenses 2nd Generation (i.e. above 11.5.7 and the white screens, not the blue ones) and anything else that uses the new self-service architecture (i.e. NOT iTime or anything else with those blue screens and no bouncing balls across the top.... you know what I mean)

It's all covered in Metalink Note 370734.1, but suffice it to say:

- it can be done, you don't need to redo all the personalizations

- you can only use this on release

- you will need the appltest and applprod passwords and a passing familiarity with Unix (or a tame developer or DBA to help you)

Have a great day


A new SOA year has begun - what will the SOA Santa bring?

Clemens Utschig - Thu, 2007-01-04 16:00
Wow, the new SOA year is here - and many cool things are about to arrive. But before looking into the future, let's look back into 2006 - and what happened there.

My first year of living in San Francisco and working within a wonderful group, envisioning the future and creating our next generation products, went by - and I must admit it was far better than I thought it'd be.

First, and most important - we delivered SOA Suite - the first major, integrated SOA platform. Not just integrated, but also with two new key components, the Oracle Enterprise Service Bus (ESB) and Rules. Since OOW 2006, when we presented the first publicly available release, our customers have been using it heavily to build their next generation SOA - and according to many of them, they enjoy doing so :D

Also, we signed an OEM agreement with IDS Scheer to build the ORACLE BPA Suite on top of it. Using Business Process Architect, for the first time business processes will make it all the way down to being executable, and all the way back, feeding real world data into simulation and continuous refinement.

Also notable was the overwhelming number of SOA related sessions at Oracle Open World, and the great feedback we received. It was a blast: more than 40000 people turned the city red - Oracle red - for one week. Elton John played during the big Oracle party, and so on.

Open SOA - www.osoa.org - went live, and with it the big players in the SOA market space joined to drive next generation, easy to use standards.

So after all that, what's in store for 2007?

While we all work hard on delivering 11g - our next generation SOA infrastructure, based on Service Component Architecture (SCA) and supporting Service Data Objects (SDO) - we expect a patchset (which will offer a ton of new features, and little fixes here and there).

BPEL 2.0 will see the light of day - after more than 2 years, and with the last public review finished, some more polishing needs to happen - and then we are ready to go. With it, BPEL will present a number of long awaited cool features, such as the new variable layout ($inputVariable) and scoped partnerlinks.

Evangelism on SOA will continue to drive the spirit of SOA and help customers adopt it - I really look forward to even more presence around the globe, and many customers going live on SOA Suite.

Personally, it's moving on with my SOA Best Practices series on OTN, supporting people around the globe and on the OTN forums, doing evangelism on SOA & helping create 11g. More than enough.

10G SQL Access Advisor and SQL Tuning Advisor

Vidya Bala - Thu, 2006-12-28 15:45
I was able to take advantage of the holidays to complete a 9i to 10g cross platform migration. While I was quite happy with the migration process (especially with how Data Pump has made this effort much easier), I was a little bit disappointed in what SQL Access Advisor and SQL Tuning Advisor had to offer.

There were a bunch of queries that we had been meaning to tune for quite some time, so I thought I could create a tuning task with the 10g SQL Tuning Advisor and see if I could get some valuable recommendations.

The recommendations I got were far from anything of significance (e.g., add an index to a small lookup table).

I couldn't help but wonder if there is much success with, or help from, the Oracle 10g SQL Access/Tuning Advisor in the industry.
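For reference, the tuning-task workflow looks roughly like this (the SQL_ID and task name below are placeholders, not from the original post):

```sql
DECLARE
  l_task VARCHAR2(64);
BEGIN
  -- Create a tuning task for one problem statement currently in the shared pool
  l_task := DBMS_SQLTUNE.CREATE_TUNING_TASK(
              sql_id     => '9babjv8yq8ru3',   -- placeholder SQL_ID
              time_limit => 600,               -- seconds
              task_name  => 'tune_slow_query');
  DBMS_SQLTUNE.EXECUTE_TUNING_TASK(task_name => 'tune_slow_query');
END;
/

-- Read the advisor's findings and recommendations
SET LONG 1000000
SELECT DBMS_SQLTUNE.REPORT_TUNING_TASK('tune_slow_query') FROM dual;
```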
Categories: Development

Merry Christmas, Happy New Year, and a Poll

Marcos Campos - Sun, 2006-12-24 04:30
It has been a great year. My daughter was born, as well as this blog. I launched this blog at the beginning of the year (January first, to be more precise) and the readership has been great. Amongst the posts, Time Series and Automatic Pivoting were probably the most viewed. I am on vacation in Brazil right now enjoying a family reunion. I have a big family and it is hard to get everyone together.
Categories: BI & Warehousing

ASM and NetApp Filer

Vidya Bala - Thu, 2006-12-21 15:37
Link to ASM and NetApp

I have been spending the last few days looking into what advantages we would have using ASM on an NFS mount as opposed to having the database files directly on NFS. If you're on RAC then ASM is mandatory, but for non-RAC 10g instances on NetApp, ASM is not mandatory.

The biggest benefit I see is volume management features with ASM.

Does this mean I can change volume sizes etc. while actually online? Maybe IO balancing across different volumes is an added feature. I am walking into a totally new area: for ASM on any kind of direct attached storage I can for sure see it being beneficial; on NFS I am not so sure.

Anybody with success stories?
Categories: Development

Cross Platform Migration 9i to 10g

Vidya Bala - Thu, 2006-12-21 15:05
Migration Procedure Implemented.

Cross Platform Migration: 9i Standard Edition (SLES 8) to 10g Rel 2 (Solaris 10)

This is a quick overview of a migration procedure I have just finished implementing on a test environment– If you see anything else in the procedure that should be added or should be noted, please feel free to post comments – as always there has been a lot of mutual learning and help from my blog readers.

Step 1: Server A:
Clone the production database to the preprod environment (datafiles, redo log files, control files all on a shared file system, NFS).
Database Release: 9i
Suse Linux version: SLES 8
Database Size: 50.89 G

Step 2: Server B:
Install 10g Release 2 on a new SLES 9 server. Note: 10g Release 2 is not supported on SLES 8.
Make sure the shared file systems on Server A are mounted on Server B.
Copy the parameter file from Server A to Server B. Make appropriate path changes to the parameter file on Server B.
Database Release: 10g Release 2
Suse Linux version: SLES 9

Step3: Upgrade 9i database to 10G
On Server B upgrade database to 10G

sqlplus /nolog
SQL> connect / as sysdba
SQL> startup upgrade

SQL> CREATE TABLESPACE sysaux DATAFILE 'sysaux01.dbf' SIZE 500M
       EXTENT MANAGEMENT LOCAL SEGMENT SPACE MANAGEMENT AUTO ONLINE;

Set the system to spool results to a log file for later verification of success:

SQL> SPOOL upgrade.log

Run catupgrd.sql:

SQL> @catupgrd.sql

Run utlu102s.sql to display the results of the upgrade:

SQL> @utlu102s.sql

Turn off the spooling of script results to the log file:

SQL> SPOOL OFF

Shut down and restart the instance to reinitialize the system parameters for normal operation:

SQL> SHUTDOWN IMMEDIATE
SQL> STARTUP


Run utlrp.sql to recompile any remaining stored PL/SQL and Java code.

SQL> @utlrp.sql

Verify that all expected packages and classes are valid:

SQL> SELECT count(*) FROM dba_objects WHERE status='INVALID';
SQL> SELECT distinct object_name FROM dba_objects WHERE status='INVALID';

Exit SQL*Plus.

Total Time to Upgrade Database : 38 minutes

Step 4 – Export the 10g Database (using expdp)

Now that we have the database migrated to 10g Rel 2 on Server B (SLES 9), we can export the database using 10g Data Pump. We will export only the application related tablespaces. The tablespaces excluded are listed below.

INDX --- no application related objects in this tablespace

Before running the export, the OWM and OLAP options need to be de-installed (if not being used) to avoid export errors.

If the Oracle Workspace Manager feature is not used in this database, de-install Workspace Manager:

SQL> @$ORACLE_HOME/rdbms/admin/owmuinst.plb

Then clean up the AW procedural objects:

SQL> conn / as sysdba
SQL> delete from sys.exppkgact$ where package = 'DBMS_AW_EXP';
SQL> commit;

Afterwards, run the export.

CREATE OR REPLACE DIRECTORY pump_dir AS 'xxxxxxxxxxxxxxxx';

Export only application related tablespaces

$ORACLE_HOME/bin/expdp system/manager tablespaces=\(t1,t2 \) directory=pump_dir dumpfile=pump.dmp logfile=pump.log

The full database export of the 50+G database took about 80 minutes.

Step5 – Prepare Target environment – Server C with 10g Release 2

Install 10g Release 2 on Solaris 10 Servers (SERVERC).

Installing Oracle Database 10g Products from the Companion CD
The Oracle Database 10g Companion CD contains additional products that you can install. Whether you need to install these products depends on which Oracle Database products or features you plan to use. If you plan to use the following products or features, then you must complete the Oracle Database 10g Products installation from the Companion CD:
· JPublisher
· Oracle JVM
· Oracle interMedia
· Oracle JDBC development drivers
· Oracle SQLJ
· Oracle Database Examples
· Oracle Text supplied knowledge bases
· Oracle Ultra Search
· Oracle HTML DB
· Oracle Workflow server and middle-tier components

On Server C use DBCA to create database creation scripts. Select the config parameters you need for your database as you go through the DBCA wizard.

The scripts will create a standard database with no application related objects yet. Run the create scripts to create the database.

Tablespaces created (this is assuming none of the additional components were installed)

Make sure the Listener is up, and note the EM console URL for the database.

Note the below before proceeding with the em console
Oracle Enterprise Manager 10g Database Control is designed for managing a single database, which can be either a single instance or a cluster database. The following premium functionality contained within this release of Enterprise Manager 10g Database Control is available only with an Oracle license:
Database Diagnostics Pack
Automatic Workload Repository
ADDM (Automated Database Diagnostic Monitor)
Performance Monitoring (Database and Host)
Event Notifications: Notification Methods, Rules and Schedules
Event history/metric history (Database and Host)
Dynamic metric baselines
Memory performance monitoring
Database Tuning Pack
SQL Access Advisor
SQL Tuning Advisor
SQL Tuning Sets
Reorganize Objects
Configuration Management Pack
Database and Host Configuration
Patch Database and View Patch Cache
Patch staging
Clone Database
Clone Oracle Home
Search configuration
Compare configuration

Step6 – Prepare Target environment – Server C with Application related Objects

IMPDP will be used to import Application Related objects into this database.
Before running IMPDP, the target database will need to be prepared with the application tablespaces and application schemas. This is also a great opportunity to reorg objects if you need to. Scripts to create the application tablespaces and schemas are prepared.
This is the most important step in preparing the target environment.

Once the Target environment is prepared – import the dumpfile using the following command.
$ORACLE_HOME/bin/impdp system/manager full=y directory=pump_dir1 dumpfile=pump.dmp logfile=pump_import.log

Before opening the database for public connections:
1) Recompile invalid objects (run utlrp.sql)
2) Gather statistics for the entire database
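From SQL*Plus, those two steps might look roughly like this (the gather options shown are a suggestion, not from the original procedure):

```sql
-- Recompile remaining invalid objects
SQL> @?/rdbms/admin/utlrp.sql

-- Gather statistics for the whole database
SQL> EXEC DBMS_STATS.GATHER_DATABASE_STATS( -
       estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE, -
       cascade          => TRUE);
```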

Regression tests will be run at the end of all this to test application functionality, long datatypes, etc.
Categories: Development

Future Direction - 10g Forms/Reports Developer vs JDeveloper

Vidya Bala - Wed, 2006-12-20 15:15
We are on a new effort to move some legacy rbase programs to Oracle. We are required to evaluate whether it would be better to go with Oracle 10g Forms/Reports or with JDeveloper. The data is going to reside on Oracle database servers. The concern about going with Oracle Forms/Reports was: will Oracle support it in the future? My instant reaction was, of course Oracle will... we have bigger problems if Forms/Reports goes away, considering that the EBS Suite uses Forms/Reports technology as well.

So the final question: while migrating new apps, would it be better to use Oracle Forms/Reports or JDeveloper? My 2 cents:
1) If the application is database centric without too much business logic involved, and if your team has a PL/SQL background as opposed to a Java background, then 10g Forms/Reports may be the better bet.
2) If the team is pretty much a J2EE development team, then JDeveloper may be the route to go.

I am not too worried about Oracle's strategy to support Forms/Reports (I think they will). The above is just my 2 cents any input from my blog readers will be greatly appreciated.
Categories: Development

Run Flashback commands only from Sql*Plus 10.1.x or newer

Mihajlo Tekic - Fri, 2006-12-15 23:24
I was getting ORA-08186: invalid timestamp specified each time I tried to run a flashback version query (FVQ).

ORA-08186: invalid timestamp specified
Well ... take a look at the following example

First, I wanted to make sure that the format I was using was the correct one.

1* select to_char(systimestamp,'DD-MON-RR HH.MI.SSXFF AM') from dual
SQL> /

14-DEC-06 AM

Now, when I tried to run the FVQ, I got "ORA-30052: invalid lower limit snapshot expression". That was an expected result, since my lower limit did not fall in the (SYSDATE-UNDO_RETENTION, SYSDATE] range (the UNDO_RETENTION parameter was set to 900).
But you can agree with me that Oracle successfully processed timestamp values that I used in this query.

SQL> ed
Wrote file afiedt.buf

1 select comm
2 from scott.emp
3 versions between timestamp
4 to_timestamp('14-DEC-06 AM','DD-MON-RR HH.MI.SSXFF AM') and
5 to_timestamp('14-DEC-06 AM','DD-MON-RR HH.MI.SSXFF AM')
6 where
7* empno = 7369
SQL> /
from scott.emp
ERROR at line 2:
ORA-30052: invalid lower limit snapshot expression

So I modified the lower limit to fit in the right range, and I got ORA-08186: invalid timestamp specified. !?!?!?

SQL> ed
Wrote file afiedt.buf

1 select comm
2 from scott.emp
3 versions between timestamp
4 to_timestamp('14-DEC-06 AM','DD-MON-RR HH.MI.SSXFF AM') and
5 to_timestamp('14-DEC-06 AM','DD-MON-RR HH.MI.SSXFF AM')
6 where
7* empno = 7369
SQL> /
from scott.emp
ERROR at line 2:
ORA-08186: invalid timestamp specified

After some time spent trying to resolve this issue (I didn't dare open an SR about it :-)), I remembered I had had similar problems a while ago when trying to test some flashback features (flashback table to before drop) on Sql*Plus 9.2.x... and I was using Sql*Plus 9.2 again.

I tried the same example on Sql*Plus 10.1.x

… and everything worked well.

Are you exporting and importing compressed partitions?

Dong Jiang - Fri, 2006-12-15 11:45

Your luck just ran out.
The Oracle imp utility uses conventional inserts exclusively, and partitions will lose compression after import, as the inserts are not direct-path. The shiny 10g Data Pump has the same limitation. You will have to recompress the partitions later, like:
1. Insert(append) into an empty table from the uncompressed partition.
2. Partition exchange
3. Truncate the table.
Then repeat for every uncompressed partition.
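The three steps above can be sketched in SQL roughly like this (table and partition names are made up for illustration):

```sql
-- 1. Load the partition's rows, direct-path, into an empty compressed table
CREATE TABLE sales_tmp COMPRESS
  AS SELECT * FROM sales WHERE 1 = 0;

INSERT /*+ APPEND */ INTO sales_tmp
  SELECT * FROM sales PARTITION (p2006q4);
COMMIT;

-- 2. Swap the compressed segment in for the uncompressed partition
ALTER TABLE sales EXCHANGE PARTITION p2006q4 WITH TABLE sales_tmp;

-- 3. The work table now holds the old, uncompressed rows
TRUNCATE TABLE sales_tmp;
```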

Announcement: Oracle Data Mining Consultants Partnership Program

Marcos Campos - Fri, 2006-12-15 10:03
We're starting a program to work with qualified data mining consultants. You and your colleagues are invited to participate in a 2-day hands-on session designed for data mining consultants here in the Oracle Burlington MA office, February 7 & 8, 2007. It is also possible to attend remotely via webinar. Space is limited, so please RSVP asap.
Categories: BI & Warehousing

online reorg options if you are on Standard Edition

Vidya Bala - Wed, 2006-12-13 14:16
A couple of the online reorg options may not be available if you're on Standard Edition. Quest Central (Space Management) Live Reorg options should help you get past this problem.

Quest Central Space Management has the following reorg options:
a) Standard Reorg (offline mode; no DML activity allowed on the table being reorganized)
b) Live Reorg (online mode) - has 2 options:
1) TLOCK switch (can be used if you're on Standard Edition - a copy table is created and a trigger based approach keeps the copy table in sync; when you are ready, the TLOCK switch can be performed - during the switch you will need some downtime, but nothing significant)
2) Online Switch (this option needs Oracle Partitioning enabled - you will pretty much need Enterprise Edition for this option)
Categories: Development

From JavaPolis to Nordwijk aan Zee

Clemens Utschig - Wed, 2006-12-13 11:41
After two intense and really cool days at JavaPolis - and our slot on standards based applications, which caught a lot of interest - it's now off to an internal architects camp meeting in the Netherlands to discuss future product strategy.

For the organizers of JavaPolis - it was a great time there - and 2 thumbs up for a VERY well organized conference. For me it was something new, speaking in a cinema, and definitely in front of the biggest screen ever (15 * 8 meters).

More pictures can be found here

So I'd recommend putting JavaPolis 2007 into your calendar for next year and joining me and 1000 other Java geeks :D

Siebel Analytics Install and Siebel Analytics Administration:

Vidya Bala - Tue, 2006-12-12 17:17
In this post I will review the Siebel Analytics Administration Tool. But before we begin you will need to install the following.
Log on to http://edelivery.oracle.com/. OK, finding the Siebel Analytics download can be tricky on edelivery:

Product – pick Oracle Business Intelligence and the appropriate platform.

Pick the Business Intelligence media pack.
If you plan to evaluate on Windows, download
B30721-01 Part 1 of 2 and B30721-01 Part 2 of 2.
The above 2 parts will give you the Siebel Analytics Server, Siebel Analytics Web, Siebel Analytics Scheduler, Siebel Analytics Java Host, and Siebel Analytics Cluster.

Now, if you're looking to download third party products like Informatica, Actuate, etc., you will have to download B27745-01 Parts 1 through 4. This post will focus on the Siebel Analytics Server.

Once you have downloaded B30721-01 Part 1 of 2 and Part 2 of 2, extract the zip files, find the installer, and walk through the install. The install is pretty intuitive. If you run into any issues with the install, post comments on the blog and I can help you out.

Once you have completed the install, if you are on Windows you will see 5 services created - Siebel Analytics Server, Siebel Analytics Web, Siebel Analytics Scheduler, Siebel Analytics Java Host, and Siebel Analytics Cluster - these are the key components of the Siebel Analytics Server.
Make sure the Siebel Analytics Server and Siebel Analytics Web services are started.

A couple of Siebel Analytics shortcuts will be installed on your desktop.

The first step in using Siebel Analytics to generate reports is to define the metadata layer. The metadata layer is defined using the Siebel Analytics Administration Tool, so click on the Siebel Analytics Administration Tool.

You can see the Administration Tool has 3 layers: the Physical Layer, the Business Model and Mapping Layer, and the Presentation Layer.

Step 1: Physical Layer:

Define your data sources in this layer. Create an ODBC data source for the source database. For the purpose of this test we will be connecting to the perfstat schema on a DEV1 instance.
In the Physical Layer, right click and create your database connection.

Once you create your database folder, import your database objects.

Select the schema you want to import and click Import (choose FK constraints if you want to import the objects with FK constraints). Once you have imported the schema you should see it in the dev1 folder.

The perfstat schema has been chosen just for illustration purposes. Ideally you want your source database to be a warehouse or a mart; in the absence of one, an OLTP system can also be your source (note that if an OLTP db is your source, it will call for more work in the business mappings layer). However, in this post I will attempt to design this schema for reporting purposes.

Assuming that the crux of your reporting is reporting on SQL statement statistics:
SQL_STMTS_STATS will be our fact table in the Business Mappings Layer.
Some dimensions around it will be:
Instance Details
Execution Plan Cost Details
This is like a 2-dimensional star schema.

Now let us see how the following objects exist in our Physical Layer and model them in our Business Layer.
The 4 objects we will be looking at are listed below.
Select the above 4 objects in your Physical Layer, right click, and view the physical diagram of the selected objects.

Now create a new business model folder and drag and drop the 4 objects to the business model layer.

Once you have dropped the objects in the Business Layer, you can define the relationships between the 4 objects (select the objects, right click, and define the relationships in the Business Diagram area). This is where a traditional normalized OLTP schema in the Physical Layer will lend itself to a star or snowflake schema in the Business Layer.

Now let us also look into what attributes we actually need for the Presentation layer and what dimensions we need.

The Business Layer is where I start modeling and mapping objects with the business model in mind. For instance, if STATS$DATABASE_INSTANCE is a good candidate for a dimension, then right click on the object in the Business Model Layer and choose Create Dimension.

Once I have modeled my Business Layer the way I want it to be, I can drag and drop objects into the Presentation Layer.

So we started with the perfstat schema, and this is what we came up with in the Presentation Layer:

Instance Details
- Instance Name
- Database Name
Sql Stats Details
- Sql Statement
- Fetches
- Executions
- Loads
- Parse Calls
- Disk Reads
Sql Plan Cost
- Hash Value
- Cost

All the underlying relationships and hierarchies are masked at this Presentation Layer. All you see at this Presentation Layer are the key presentation elements that a business user really cares about.

The next post will cover how the Presentation Layer can be used to build reports using the Siebel Analytics Web Console (typically for the power users).

For questions, please feel free to post them in the comments section.
Make sure to save your work in the Siebel Analytics Administration Tool. A repository consistency check is done at the time of saving your work; checking in changes will also do a repository consistency check.

Categories: Development

expdp of 9i database from 10g Oracle_Home

Vidya Bala - Tue, 2006-12-12 11:47
For some reason I thought I would be able to use expdp to export a 9i database using the 10g expdp - obviously not supported, as mentioned below.

Compatibility Matrix for Export & Import Between Different Oracle Versions
Doc ID:

That leaves me with the option of using traditional "exp", which can be very slow on a 100G+ database; I am not sure if cross platform migration is an option with Standard Edition.
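For reference, a classic full export run from the 9i home might look something like this (the parameter values are illustrative; direct path speeds up most tables):

```sql
exp system/manager FULL=y DIRECT=y RECORDLENGTH=65535 FILE=full.dmp LOG=full_exp.log
```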
Categories: Development

Siebel Analytics

Vidya Bala - Mon, 2006-12-11 10:41
BI Suite Enterprise Edition Getting Started:

I have been spending the last few days setting up BI Reports using BI Suite Enterprise Edition as a proof of concept for Business users to evaluate the Product.

I have liked the product so far:
(i) very user friendly
(ii) once the metadata layer is defined, business users are masked from underlying tables and relationships they don't need to know about
(iii) a lot of sleek display features in the product
(iv) skill level not extremely difficult - if you have worked in the database and BI world, it should be fairly easy to learn how to use the product
(v) I will post a step by step evaluation of the product once I have it installed on my system

Details on the Product:

Key features of the product (mainly Siebel Analytics/Answers):

(i) Has a BI Server, a BI Web Console, and a BI Admin client tool.
(ii) The BI Server is not integrated with the 10g Application Server (it runs separately and is not a container in your Application Server, the way it was with Discoverer).
(iii) I believe there are claims that with the 11g App Server the OEM Console can manage the BI Server as well.
(iv) When a request is sent to the webserver, a logical query is sent to the BI Server. The BI Server then checks if the data is in the cache; if it is not in the cache, a physical SQL is sent to the database.
(v) All metadata information is stored in flat files as opposed to any repository - so it should be easy to move across environments - and the metadata flat files also support multiuser capability.
(vi) Security services are available with your BI Server (VPD security).
(vii) The BI Admin Tool has pretty much 3 layers: (a) the Physical Layer, where data sources are defined (ODBC is used to define the data sources); (b) the Business/Mappings Layer, where you build your business mappings; and (c) the Presentation Layer, where you define how your data needs to be presented.
(viii) The web console has: (a) Answers - this is what is used by your business users to create reports using data items available in your Presentation Layer; (b) Dashboard - where reports built using Answers can be published; (c) Admin - to manage user accounts, the analytics catalog, dashboard permissions, etc.; (d) Delivers - which can schedule jobs for web cache refreshes. Also, Oracle XML Publisher is part of the BI EE Suite.

I am excited about having all our Reports moved to Siebel Analytics – I will have an end to end sneak preview of the Product posted on my blog soon.
Categories: Development

Coming back from Gartner's SOA Summit and heading out to Europe

Clemens Utschig - Fri, 2006-12-08 15:32
For the last few days we have been at Gartner's SOA Summit, showing our new SOA Suite and discussing the latest trends in SOA and integration.

Two sessions, of the many I attended, left me thinking about SOA and where we go from here.

Darrell Plummer, Gartner analyst, held a how-to workshop on creating SOA applications, which turned out great. He fostered a broad discussion in the audience on what methodologies to use, what a service really is - beyond WSDL - and so forth. A nice metric: out of the 15 slides he wanted to show, he showed 5 :D (the rest of the time went into discussion).

And the Chief Architect of Eskom talked about SOA and his experiences in South Africa - about a company that adapted to agility, about key stakeholders, and about the value of SOA. Overall - SOA is about humans - two thumbs up.

After being home for two days, I'll head to Europe tonight (yes, on LH 459, Kranich) - for JavaPolis, and to visit customers. Getting the chance to speak at JavaPolis 2006 is a great opportunity to evangelize standards based SOA - and it's also great that we got a slot in the university part.

Hugo Brand (from our EMEA PM team), Demed Lher (OEMS/ESB PM) and I will talk about pragmatic SOA, and how you can build composite SOA applications today.

So if you are around, come by to get a 3 hour, intense, and interactive how to session - or just to say hello :D

AskTom website exception

Dong Jiang - Fri, 2006-12-08 05:17

This is the exception I got from AskTom website this morning, 6AM EST.
ORA-01688: unable to extend table ASK_TOM.WWC_ASK_ACTIVITY_LOG$ partition PART_12 by 64 in tablespace ASK_TOM_LOG

What can we tell from this?
1. Schema name is ASK_TOM
2. There’s a tablespace ASK_TOM_LOG
3. There’s a partitioned table WWC_ASK_ACTIVITY_LOG$. Partition PART_12 is in ASK_TOM_LOG tablespace.
4. Eh, time to call DBA?
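The usual fix, once the DBA picks it up, is to give the tablespace room to grow - something along these lines (file names and sizes here are invented):

```sql
-- Add a datafile to the full tablespace
ALTER TABLESPACE ask_tom_log
  ADD DATAFILE '/u01/oradata/asktom/ask_tom_log02.dbf' SIZE 1G;

-- ...or let an existing datafile autoextend
ALTER DATABASE DATAFILE '/u01/oradata/asktom/ask_tom_log01.dbf'
  AUTOEXTEND ON NEXT 100M MAXSIZE 4G;
```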

GNUstep on my Solaris desktop

Siva Doe - Thu, 2006-12-07 19:40

After a couple of days work, I was able to run GNUstep on my Solaris (Nevada) box. Check out the screenshot. Boy, need I say, I like it a lot. Very snappy compared to the GNOME desktop (no flames please).

I also built some nice applications like GNUMail.app. I had set NSMacintoshInterfaceStyle as the menu interface style. In the attached screenshot, you can see the Macintosh-like menu.

I am also running Windowmaker as my window manager. In my startup script, I set the environment variable GTK2_RC_FILES=$HOME/.themes/GTK2-Step/gtk-2.0/gtkrc, which makes my other GTK apps, like Firefox and Gaim, look very much like GNUstep applications.

I sure miss those days loong (maybe one more 'o'?) ago, when we at Sun had the option to run OpenStep as an alternative desktop to CDE. Very slow it was, though.

Some wishes, though: more GNUstep apps, of course. Like a web browser (Safari??) and IM & calendar clients. There is an IRC client, though, which I don't use much anyway.

GNUMail.app/Addressbook lacks the LDAP auto completion which I am so used to.

I don't know how long I can continue using GNUstep, as for some reason or other I end up using the default JDS desktop only.



Oracle SOA Suite Patterns and Best Practices, Part 1 and 2

Clemens Utschig - Thu, 2006-12-07 16:50
While I've had this idea for a while now, it took considerable time to make it reality - first to allocate the necessary time, and maybe even more to get over the start of writing :D

The first 2 chapters are live and can be found here.

Chapter 1: Virtualizing your service endpoints in ESB and using them from BPEL
Chapter 2: Using the Service Registry to enable a dynamic, reusable SOA

- more to come soon

Jeff, Mr. B2B, is on his way to finishing his contribution on BPEL and B2B, and how those two components can make trading simple and solid.

Also coming soon: migration, deployment concepts, and governance, all from the source.

We are also keen for your feedback - or if you'd like to contribute, just drop me a line with what you think helps the community most.


Subscribe to Oracle FAQ aggregator