Feed aggregator

Changing Your PS Database Platform: The Design Phase

Brent Martin - Sun, 2013-01-27 22:52

In my previous article I described how you might approach planning and requirements gathering for a PeopleSoft database replatforming project.  You would exit that phase with solid RICE object inventories and an idea of any new PeopleTools functionality you will deploy.


I’d like to take you through some thoughts and considerations for the remaining phases (Design, Build, Test, Deploy).  This article will focus on the Design phase; I’ll publish articles on the other phases in the near future.  For this discussion, I’m going to assume we’re doing a PeopleTools upgrade along with the replatforming effort, and that we’re going to keep existing functionality/features/customizations the same as the current state.  This is usually a good idea, because the technology changes will be challenging enough.


The Design Phase


You might think a basic replatforming project doesn’t require a lot of design.  Once you have the object inventories from the planning/requirements gathering phase, you have enough information to start modifying the SQL to make it work on the new database platform.  One thing I would suggest, though, is to bundle the raw object lists into logical chunks of work.  For example, if you have a component, page and process that work together to execute a business process, bundle them together so a developer can unit test all of them at the same time.  If you want to deploy new PeopleTools features, you’ll want to spend some time deciding which ones will be useful and how you will configure, test, and deploy them.


But there’s a bit more work you need to do in this phase.  First, you’ll want to identify any external system that might need to be updated as well.  Any system that uses database links or has dependencies on your current PeopleTools environment (think Component Interface libraries) will need to be investigated to determine the impact and appropriate action to take.
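On an Oracle source, the data dictionary gives you a quick first pass at the database-link piece of that inventory.  A minimal sketch, assuming you have access to the DBA views:

-- Outbound links defined in this database; inbound links from other
-- systems have to be hunted down on those systems
SELECT owner, db_link, username, host
FROM   dba_db_links
ORDER  BY owner, db_link;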


Another decision you’ll need to make involves reports and queries.  You probably have a LOT of public queries, and you may have a lot of reports.  nVisions in particular seem to multiply if you don’t have strong governance processes to limit users to specific standard reports.


So how do you deal with this situation?  It’s not always cost-effective to upgrade and test every one.  Here are a few suggestions for managing this problem:


1)   Ask your users to provide their business-critical reports and queries.  This will be the list you “certify” to work correctly and perform well on the new platform.  You’ll spend whatever time is necessary during development and the testing phases to make sure these queries and reports are defect-free.


2)   Identify all of the reports and queries that have been run in the last couple of years via the process scheduler and query audit tables (see the sketch after this list).  These will be your priority 2 set.  Scan this set using automated techniques to identify problems, correct any that fall out, and unit test everything that is modified.  Be sure a good percentage of these are tested in later test phases, and give users an opportunity to test them during user acceptance testing.


3)   Other reports and queries won’t be touched.  Breaks here will be handled by your post go-live break-fix process.
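To make suggestion 2 concrete, here is a minimal sketch against the process scheduler request table on an Oracle source (column names can vary a little by PeopleTools release, so treat this as a starting point rather than a finished audit):

-- Everything the process scheduler has executed in the last two years
SELECT DISTINCT prcstype, prcsname
FROM   psprcsrqst
WHERE  rundttm >= ADD_MONTHS(SYSDATE, -24);

The query side of the priority 2 list would come from the query audit tables in the same way.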


The Design Phase is also when you should prepare your testing plan and your communication plan.


While this phase is progressing, your DBA team should execute an initial replatform from the database on the old platform to the database on the new platform.  For this exercise, we’ll just use Data Mover to extract every table on the source database platform to a flat file, and to import it into the new database platform.  Once on the new DB platform you’ll need to manually adjust tables like PSDBOWNER, PSOPTIONS, etc.  Execute the PeopleTools upgrade (if necessary) and you’re done.  Don’t expect this to go quickly the first time around – allow yourself 2-4 weeks in the schedule.  And capture all of the steps in detail, because they will be the start of your cutover plan.  The environment this exercise produces will become your new development environment, so that you can start your build phase.
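As one hedged example of those manual adjustments on an Oracle target (the database name and owner ID below are illustrative, not prescriptive):

-- PSDBOWNER maps the new database name to the PeopleSoft access ID
UPDATE ps.psdbowner
SET    dbname  = 'HRPRD',    -- illustrative: your new database name
       ownerid = 'SYSADM';   -- illustrative: your owner/access ID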


Also during this phase you should make decisions about the technologies and tools you’ll use to make your job easier.  One is Oracle GoldenGate.  GoldenGate is a Change Data Capture tool that supports multiple database platforms.  It gives you some amazing capabilities around extracting table data from one platform in parallel, shipping the extracted files to a target file server, and importing them in parallel while the extract is still running. Read More...

Active Directory user management with the IdentityStore framework of FMW

Edwin Biemond - Sun, 2013-01-27 12:09
With the IdentityStore framework of Fusion Middleware you can change or create any user or role in almost every LDAP server. With this framework the authenticated user can change their own password from ADF, or in your own ADF admin page you can create a user with its roles, without knowing LDAP or which LDAP server is used. The IdentityStore framework will automatically detect the configured…

Is DELETE necessary?

Kubilay Çilkara - Sat, 2013-01-26 04:35
A delete is a very destructive operation for a database: an operation that requires a lot of thinking and preparation, and that most of the time is irrevocable. You can get in a lot of trouble if you delete the wrong set of rows and lose all that very important data!

Maybe the DELETE command should be banned from SQL. It was probably invented when the disks were tiny and it might not be suitable for our times, especially nowadays when we talk about things like BIG DATA.

Why do we have to delete anyway? We should invalidate and age out data and never destroy it. I am one of those who would say ‘keep it’ and never lose it. Even a typo, an error while entering somebody’s name in a text field, tells us a story. The number of attempts to get it right, the number of characters typed which were wrong, the time the wrong entry took place and much more, is data. Temporal databases come to mind. Why delete it?
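The "invalidate and age out" idea is cheap to sketch in SQL. A minimal example using a hypothetical CUSTOMERS table (all names illustrative):

ALTER TABLE customers ADD (deleted_at DATE);

-- "Delete" by invalidating the row; the history survives
UPDATE customers
SET    deleted_at = SYSDATE
WHERE  customer_id = 42;

-- Day-to-day code reads the live rows through a view
CREATE OR REPLACE VIEW active_customers AS
   SELECT * FROM customers WHERE deleted_at IS NULL;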

But, one can argue that not deleting can be dangerous, too! One can claim that by refusing to delete we might end up with a lot of old, duplicated, bad-quality, uncleaned, irrelevant and untrustworthy data.

Maybe the worst thing that can happen to data is for us to refuse to delete it enough. Perhaps it is this lack of deletes, and this fear of deletes, that leaves us with all the 'bad data' found in some database systems today.

Whatever you choose to do, delete or not delete, one thing's for sure: you should know what you are deleting or not deleting. Choosing confidently what to delete comes from understanding and knowing your data model, and the implications a delete will have on it, and never from the mere fact that the data is not required for your project!




Categories: DBA Blogs

Oracle Database Appliance 2.5 And Virtualization (Updated)

Fuad Arshad - Fri, 2013-01-25 12:21
So the cat's out of the bag: Oracle has finally announced that ODA 2.5 will be available on Jan 31st. ODA 2.5 is a pretty significant update to the ODA platform. It will have:
1. Virtualization as an option, with OVM 3.1.1
2. Official external ASR management

The virtualization is huge, since it allows the capacity of the ODA to be used for things other than the database. The concept is a DOM0 and an ODA_BASE, a privileged domain that ensures the Database Appliance performs optimally. Initially the DOMU, or user domain, will not have access to ASM disks or ACFS mounts, but that is something that will show up in a future release. The user domain will not be patched as part of the single patching technique, which means that if you go with virtualization the user domain needs to be patched independently. Also, initially an extra 250GB slice is the only space available for the DOM0 and DOMU areas. This can bring in a little more complexity, but from a DR perspective it helps in building idle capacity with a smaller footprint.

The important question is whether ODA deployments should start with a virtual deployment even when there is no requirement for it initially. Oracle will ship the physical deployment, and customers can bare-metal it to be virtualized. Oracle claims to provide virtualization with ease of use, but only time will tell. I am pretty excited about this, since I can build virtualized boxes and do DR at a cheaper cost but with the same horsepower that I need. I do intend to test this and see the ease of use, to determine whether that is my going-in stance for all new ODA deployments.

Update: The Database Appliance has 500GB of local disk space, which means DOM0 and DOMU will have access to 250GB, and ODA_BASE will have 250GB available to it.

How long did Oracle materialized view refresh run?

Ittichai Chammavanijakul - Mon, 2013-01-21 09:38

The LAST_REFRESH_DATE column of DBA_MVIEWS and the LAST_REFRESH column of DBA_MVIEW_REFRESH_TIMES indicate the refresh start time. But what if we’d like to find out how long the refresh of a materialized view really takes? We can query DBA_MVIEW_ANALYSIS.

For Complete Refresh, the refresh duration will be in the FULLREFRESHTIM column of the DBA_MVIEW_ANALYSIS. For Fast Refresh duration, it will be in the INCREFRESHTIM column.

Both values are in seconds.

SELECT mview_name, last_refresh_date, fullrefreshtim, increfreshtim
FROM dba_mview_analysis
WHERE owner='JOHN';

MVIEW_NAME               LAST_REFRESH_DATE      FULLREFRESHTIM INCREFRESHTIM
------------------------ ---------------------- -------------- -------------
MV_CHANGE_HISTORY        07-JAN-13 04.36.58 PM               0            36
MV_ITEM_HISTORY          07-JAN-13 04.36.58 PM               0             9

This shows that the most recent refreshes of MV_CHANGE_HISTORY and MV_ITEM_HISTORY were fast refreshes, taking 36 and 9 seconds respectively.

Here it is all in one query that calculates and displays the end time:

SELECT 
   mview_name,
   last_refresh_date "START_TIME",
   CASE
      WHEN fullrefreshtim <> 0 THEN
         LAST_REFRESH_DATE + fullrefreshtim/60/60/24
      WHEN increfreshtim <> 0 THEN
         LAST_REFRESH_DATE + increfreshtim/60/60/24
      ELSE
         LAST_REFRESH_DATE
   END "END_TIME",
   fullrefreshtim,
   increfreshtim
FROM all_mview_analysis
WHERE owner='JOHN';

MVIEW_NAME              START_TIME             END_TIME               FULLREFRESHTIM INCREFRESHTIM
----------------------- ---------------------- ---------------------- -------------- -------------
MV_CHANGE_HISTORY       07-JAN-13 04.36.58 PM  07-JAN-13 04.37.34 PM               0            36
MV_ITEM_HISTORY         07-JAN-13 04.36.58 PM  07-JAN-13 04.37.07 PM               0             9

Reference: How To Calculate MVIEW Refresh Duration? What Does DBA_MVIEWS.LAST_REFRESH_DATE and DBA_MVIEW_REFRESH_TIMES.LAST_REFRESH Indicate? [ID 1513554.1]

Categories: DBA Blogs

Geek quotient

Andrew Clarke - Sun, 2013-01-20 16:57
I only scored 9/10 on the 'How big a David Bowie fan are you?' quiz. And I scored 20/20 on the 'can you tell Arial from Helvetica?' quiz. But I only scored 32.1032% on the Geek Test. So I still have some way to go.

Changing Your PeopleSoft Database Platform

Brent Martin - Sun, 2013-01-20 01:43

There are several reasons why you might decide to migrate PeopleSoft to a new database platform.  It could be that you need to move to a more scalable platform.  Or you may be looking to reduce your annual license costs and want to switch to a more cost-effective platform.  Whatever the reason, you can certainly take advantage of PeopleSoft's multiple-database support and change your database platform.  This article will give you some ideas about how to plan the effort.


One of the first things to consider is whether or not you want to upgrade to the latest version of PeopleTools.  This may be required, especially if you want to deploy the latest version to the database platform you’re migrating to.   If this can be done in advance of the replatforming, that makes the actual replatforming easier, but it does require its own testing and deployment cycle, and depending on the age of your current database platform it might be impossible.  In that case you will have to do the Tools upgrade along with the database replatforming.


If you’re upgrading PeopleTools as part of the project, you need to decide whether you want to introduce the new PeopleTools look and feel or roll out new PeopleTools enhancements to your users.  Review the release notes to get a good list of new features and enhancements, and carefully choose what will be deployed as part of your scope.


You’ll probably need to purchase new hardware to run your new database.  This may also be a good time to do a larger hardware refresh on your web/app/batch tiers.  If you’re adding new hardware, be sure you size the hardware to run the new database according to your hardware vendor’s recommendations.  They all have PeopleSoft-specific sizing questionnaires and will be happy to assign an engineer to assist you with your needs.  Also give yourself adequate time in your project plan to procure the hardware and have it installed and configured.


 Minimize non-technical changes. This isn’t the best time to change business processes or implement new functionality.  It’s not even the best time to de-customize.  If you have to do it, plan for the additional testing and change management (training/communication) that will be required.


One of the first things you should do to plan out this effort is to get a good list of the objects that will need to be updated as part of the replatforming effort.  We can assume that the delivered code will work on whatever supported database platform you want to migrate to, so we can focus much of our level of effort analysis on custom and customized code.


Processes, Reports and Queries


Get a list of all processes and reports from the process definition tables.  Use the process scheduler tables to flag which processes have been executed in the last two years or so.  This will serve as your active process list and will define the in-scope processes for this effort.  Any processes, reports and queries discussed should be filtered against this active process list to prevent spending time on unused or unneeded processes.
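A hedged sketch of that flagging step on an Oracle source (PS_PRCSDEFN holds the definitions, PSPRCSRQST the run history; exact column names vary a little by release):

SELECT d.prcstype, d.prcsname,
       CASE WHEN r.prcsname IS NULL THEN 'N' ELSE 'Y' END AS active_flag
FROM   ps_prcsdefn d
LEFT   JOIN (SELECT DISTINCT prcstype, prcsname
             FROM   psprcsrqst
             WHERE  rundttm >= ADD_MONTHS(SYSDATE, -24)) r
       ON  r.prcstype = d.prcstype
       AND r.prcsname = d.prcsname;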


To capture which nVision and Crystal reports are being run you’ll need to join PSPRCSRQST to PSPRCSPARAM and pick out the report name from the command line stored in the ORIGPARMLIST field.
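A sketch of that join (in recent PeopleTools releases the parameter table is PSPRCSPARMS; the process type names in the filter are illustrative, and the report name still has to be parsed out of the command-line string):

SELECT r.prcsinstance, r.prcsname, p.origparmlist
FROM   psprcsrqst  r
JOIN   psprcsparms p ON p.prcsinstance = r.prcsinstance
WHERE  r.prcstype IN ('nVision-Report', 'Crystal')  -- illustrative type names
AND    r.rundttm  >= ADD_MONTHS(SYSDATE, -24);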


PeopleCode Analysis


Extract the custom PeopleCode from PSPCMPROG to a flat file using DecodePeopleCode.sqr.  Run another program to extract all SQLEXEC statements, and execute each one against the target database platform.  Be sure to flag any App Engines as used in your process list if you find active Application Engine PeopleCode.


Views Analysis


Analyzing views is a bit more straightforward.  Use App Designer or Data Mover to build ALL views on the target database platform, and identify which ones error out during the build.
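To scope the build list up front, a hedged query against the record definition table helps (RECTYPE = 1 denotes a SQL view; the 'Z%' prefix is only a common custom-object convention, so substitute your own):

SELECT recname
FROM   psrecdefn
WHERE  rectype = 1          -- SQL views
AND    recname LIKE 'Z%'    -- your custom naming convention
ORDER  BY recname;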


SQL Object Analysis


Extract the custom SQL from PSSQLTEXTDEFN into a flat file and search it for specific keywords that aren’t going to work on your target platform.  For example, if you’re migrating from SQLServer to Oracle you might look for keywords like getdate, nolock, datediff, dateadd, day, month, year, str, left, right, +, .., *=, datepart, isnull, convert, select top, len, inner, outer, .dbo., xact_abort, implicit, round.
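A hedged sketch of that scan, run while the data still sits on the SQL Server source (the keyword list here is abbreviated; on platforms where SQLTEXT is a LONG you would extract to a flat file first, exactly as described above):

SELECT sqlid, sqltext
FROM   pssqltextdefn
WHERE  UPPER(sqltext) LIKE '%GETDATE%'
   OR  UPPER(sqltext) LIKE '%NOLOCK%'
   OR  UPPER(sqltext) LIKE '%DATEDIFF%'
   OR  UPPER(sqltext) LIKE '%ISNULL%';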


This approach still requires a human touch.  Some of the SQL flagged for modification may be fine, because the keywords may be valid PeopleSoft meta-SQL functions, which are prefixed with “%”, e.g. %DateAdd.


Also keep in mind that different database platforms have different conventions for outer joins, subquery joins, and so on.  This type of syntax difference is very… Read More...

Oracle Forms LAF new Instant Messaging feature?

Francois Degrelle - Fri, 2013-01-18 02:56
Hi, every Look and Feel happy user! I would like to know whether you would be interested in an embedded Instant Messaging system in a new LAF version. If you have something to say, like Yes or No, feel free to answer the little poll I've just created on the...

All Software Development is Schema Management

Kenneth Downs - Wed, 2013-01-16 20:36
Before you automatically disagree, stop a bit and think about it.  Can you think of any code you have ever written that did not handle data of some sort?  Of course not.  This is not about relational data either, it is about any data.  There is no software that does not process data.  Even the textbook example of some function that squares a number is processing the parameter and returning a value.  The first line of that function, in most languages, names the input parameter and its type.  That is a schema.

This remains true as you climb out of the toy textbook examples into simple one-off programs into small packages and ultimately to apps that build to megabytes of executables running on a rack (or a floor) of servers.  Just as each method call has parameters, so does each class have its attributes, every table its columns, every XML file its XSD (or so we hope) and that code works if and only if everybody on the team understood what was supposed to happen to the data.

Are We Talking UI or Platform or Middleware?

Generally today we are talking about the server side of things.  This is about any project that is going to take a user request and consult a data store: the file system, a search engine, a database, a NoSQL database, or an XML database.  If you go to the disk, that is what we are talking about.

New Versions Are (Almost Always) About Schema Changes

So imagine you've got some working code.  Maybe a single script file, or perhaps a library or package, or maybe some huge application with hundreds of files.  It is time to change it.  In my experience new code means some kind of change to the schema.

I cannot prove that, nor do I intend to try.  It is a mostly-true not an always-true.

Schema Here Does Not Mean Relational

This is not about relational schemas, though they are included.  If you are using Mongo or some other NoSQL Database which does not manage the schema for you, it just means you are managing the schema yourself somewhere in code.   But since you cannot write code that has no knowledge of the structure of the data it handles, that schema will be there somewhere, and if the code changes the schema generally changes.

Does It Need To Be Said?

Sometimes you will find yourself in a situation where people do not know this.  You will notice something is wrong; the first symptom is that the conversation does not appear to be proceeding to a conclusion.  Even worse, people do not seem to know what conclusion they are even seeking.  They do not know that they are trying to work out the schema, so they wander about the requirements trying to make sense of them.

Order and progress can be restored when somebody ties the efforts down to the discovery and detailing of the schema.  The question is usually, "What data points are we handling and what are we doing to them?"
Categories: Development

Code Today's Requirements Today

Kenneth Downs - Tue, 2013-01-15 20:06
In Part 1 of this series, Do You Know What Day It Is? (written a mere 6 months ago, I really ought to update this blog more often), we looked at Ken's First Law of Architecture:

Today's Constant is Tomorrow's Variable


If you do not know this, there are two mistakes you can make.

Mistake 1: You Ain't Gonna Need It
Mistake 1 is creating a variable, option, switch, parameter, or other control for something which, as far as we know today, is a constant.  You can avoid this mistake if you remember that You Ain't Gonna Need It.

It is this simple: you can invent an infinite number of variables that the user may wish to control in the future, but the chance of guessing which ones they will actually want to control is near zero.  It is a total complete waste of time.

Let's take an example.  Imagine you are asked to do a task.  Any task.  It does not even have to be programming.  The task will be loaded with points where programmers invent decisions that nobody asked them to make.  For example, consider this blog post.

1) Should I have used a different background color up there in that green box?
2) Should I have used a sub-section heading by this point, maybe I should go back and do that?
3) Should this list be numbered or maybe just bullets instead?

Notice that this list is kind of arbitrary.  I did not think of it very hard, I just made up those questions.  Another person might have made up different questions.  The problem is that the list never ends.

This can never be eradicated, it is fundamental to the human condition.  Well meaning individuals will raise irrelevancies in an effort to be helpful and the rules of polite society work against weeding these out.  Committees will put two heads on a horse and you will end up with a bunch of YAGNI options.  But we can still fight against them.

Mistake 2: Constants in Code - Nobody Does That!
It is possible to go the other way, which is to embed constants into your code so that when you need to change them you find it is expensive, difficult and error prone.  Most of us learn not to do this so early that it seems unthinkable to us.  Why even write a blog post about it in 2013?

Well the very good reason to write about it is that we all still embed "constants in code" in ways that we never think about.

For example, if you are using a relational database, then the structure of your table is a "constant in code", it is something that is woven through every line.  Your code "knows" which tables are in there and what columns they have.  When you change the structure you are changing something very much "constant" in your system.   This is why it is so important to use a lot of library code that itself reads out of data dictionaries, so that you have removed this particular "constant in code."
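A small sketch of what "reading out of the data dictionary" looks like on Oracle, so that a table's structure stays data instead of becoming a constant in code (the table name is illustrative):

-- Discover a table's columns at run time instead of hard-coding them
SELECT column_name, data_type, nullable
FROM   user_tab_columns
WHERE  table_name = 'ORDERS'
ORDER  BY column_id;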

The great problem here is that we cannot see the future, and you never think that a basic feature of your system is an embedded constant in code.  In a simpler period in my life a system lived on a server.  One server, that was it, who ever heard of a server farm?  That was a constant: number of servers=1, and all of my code assumed it would always be true.  I had to weed out that constant from my code.

Those who understand this basic inability to see the future are most at risk of slipping into over-generalization and generating YAGNI features.  It is an attempt to fight the very universe itself, and you cannot win.

Keep it Simple, Keep it Clean

In the end you can only write clean code, architect clean systems, follow the old knowledge and be the wise one who learns to expect the unexpected.
Categories: Development

Oracle NoSQL Database Storage Node Capacity Parameter

Charles Lamb - Tue, 2013-01-15 11:16

I noticed in this article about Cassandra 1.2 that they have added the concept of vnodes, which allow you to have multiple nodes on a piece of hardware. This is pretty much the same as Oracle NoSQL Database's capability to place multiple Rep Nodes per Storage Node using the Capacity parameter. In general, the recommended starting point in configuring multiple Replication Nodes per Storage Node is one Rep Node per spindle or IO Channel.

The article also talks about Atomic Batching, which has been available in Oracle NoSQL Database since R1 through the various oracle.kv.KVStore.execute() methods. This capability allows an application to batch multiple operations against multiple records with the same major key in one atomic operation (transaction). Our users have all said that this is an important capability.



OTN Yathra

Hans Forbrich - Sun, 2013-01-13 18:45
What does Yathra mean?

Yathra is a Sanskrit word which means Journey.  In the northern states of India Yathra is spelled as Yatra.

Why is this relevant?

Oracle Technology Network, and the Oracle ACE and ACE Director programs are sponsoring a 6 city OTN tour, or Yathra, in India this February. 
  • February 16 in Delhi
  • February 18 in Mumbai
  • February 20 in Pune
  • February 22 in Bangalore
  • February 25 in Hyderabad
  • February 27 in Chennai
And I'm going!  (Assuming my visa application is approved.)  The speaker list is:
  • Vinay Agrawal (India)
  • Hans Forbrich (Canada)
  • PS Janakiram (India)
  • Lucas Jellema (Netherlands)
  • Satyendra Kumar (India)
  • Raj Matamall (USA)
  • Harshad Oak (India)
  • Edward Roske (USA)
  • Vijay Seghal (India)
  • Aman Sharma (India)
  • Vivek Sharma (India)
  • Ganapthy Subramanian (India)
  • Murali Vallath (India)
For more details, check out…
Categories: DBA Blogs

Oracle Forms to ADF Mobile: The Ultimate Modernization Challenge

Francois Degrelle - Fri, 2013-01-11 02:59
Using a good old Forms application on a brand new mobile? Are you kidding? Mia Urman and Oraplayer will be presenting the full process in detail in a FREE webinar hosted by ODTUG called Take Your Oracle Forms on the Road Using ADF Mobile “In our constant...

Upgrading the JasperReports libraries to 5.0.1

Dietmar Aust - Wed, 2013-01-09 17:04
Would you like to upgrade your existing JasperReportsIntegration with the latest 5.0.1 libraries of JasperReports?

Here you go ...

Step 1: Download the files for 5.0.1
You can download the files here:


Step 2: Shut down the Apache Tomcat J2EE server

Step 3: Remove the existing JasperReports libraries from your existing installation
Typically, after you have installed your previous version of the JasperReportsIntegration toolkit on your Apache Tomcat J2EE server, the files will be located in the directory $CATALINA_HOME/webapps/JasperReportsIntegration/WEB-INF/lib, for example version 4.7.0 of JasperReports, where $CATALINA_HOME represents the path to your installation of Tomcat.

Then you would have to remove these libraries first. In this directory you should find two files for removal: _jasper-reports-delete-libs-4.7.0.sh and _jasper-reports-delete-libs-4.7.0.cmd, for *nix or Windows respectively. On *nix systems you would have to make the script executable first, e.g.: chmod u+x _jasper-reports-delete-libs-4.7.0.sh. Then you can call it and it will remove all files for version 4.7.0, but it will NOT remove the JasperReportsIntegration file itself or any other libraries which YOU might have placed there deliberately.

You can always find the required removal scripts here: http://www.opal-consulting.de/downloads/free_tools/JasperReportsLibraries/ . 
 Step 4: Install the new 5.0.1 libraries
Now you can just copy the new libraries from JasperReportsLibraries-5.0.1.zip into $CATALINA_HOME/webapps/JasperReportsIntegration/WEB-INF/lib.
 Step 5: Start the Apache Tomcat J2EE server again
Now your system should be upgraded to the most current JasperReports 5.0.1!

Just drop me a note when you need updated libraries for 5.0.2, 5.0.3, ... 6.0.0, etc. I have scripts in place to create a new package of the libraries. 

Here you can find the notes from my last upgrade (4.5.0 => 4.8.0); I hope they make sense:

# Download the libraries from:
#   http://www.opal-consulting.de/downloads/free_tools/JasperReportsLibraries/4.8.0/JasperReportsLibraries-4.8.0.zip
# to /home/jasper/JasperReportsLibraries

cd /home/jasper
mkdir JasperReportsLibraries

# Unzip them
cd JasperReportsLibraries
unzip JasperReportsLibraries-4.8.0.zip -d JasperReportsLibraries-4.8.0

# Stop the Tomcat server
/home/jasper/tomcat/bin/shutdown.sh

# Remove the libraries of the current JasperReports release
cd /home/jasper/tomcat/webapps/JasperReportsIntegration/WEB-INF/lib
chmod +x _jasper-reports-delete-libs-4.5.0.sh
dos2unix _jasper-reports-delete-libs-4.5.0.sh
./_jasper-reports-delete-libs-4.5.0.sh

# Copy the libraries of the new release to the WEB-INF/lib directory
cp /home/jasper/JasperReportsLibraries/JasperReportsLibraries-4.8.0/* /home/jasper/tomcat/webapps/JasperReportsIntegration/WEB-INF/lib

# Restart Tomcat
/home/jasper/tomcat/bin/startup.sh


Cheers,
~Dietmar.

Getting started with Couchbase and node.js on Windows

Tugdual Grall - Fri, 2013-01-04 09:56
In a previous post I explained how to use Couchbase and Node.js on OS X. Since it is quite different on Windows, here is another article about it.

Install Couchbase Server 2.0

If you have not installed Couchbase Server already, do the following:
  • Download Couchbase Server from here
  • Run the installer
  • Configure the database at http://localhost:8091 (if you have an issue, take a look at this…

PaaS lets you pick the right tool for the job, without having to worry about the additional operational complexity

William Vambenepe - Sun, 2012-12-30 02:06

In a recent blog post, Dan McKinley explains “Why MongoDB Never Worked Out at Etsy“. In short, the usefulness of using MongoDB in addition to their existing MySQL didn’t justify the additional operational complexity of managing another infrastructure service.

This highlights the least appreciated benefit of PaaS: PaaS lets you pick the right tool for the job, without having to worry about the additional operational complexity.

I tried to explain this a year ago in this InfoQ article. But the title was cringe-worthy and the article was too long.

So this blog will be short. I even made the main point bold; and put it in the title.

 

Categories: Other

TROUG 2012 in review

H.Tonguç Yılmaz - Fri, 2012-12-28 15:02
To help shed some light on 2013, I am curious about your feedback based on the experience you had with TROUG in 2012. As a reminder of what kinds of activities took place in 2012, you can review the "Geçmiş Etkinlikler" (Past Events) section on the home page of our site: http://troug.org/

UKOUG 2012

Rob van Wijk - Sat, 2012-12-22 06:39
My third time visiting the annual UKOUG conference in Birmingham started all wrong. At Schiphol Airport, the usual luggage check routine took place: laptop out of the suitcase, wallet/keys/belt apart, toothpaste apart. And afterwards putting everything back in. But I forgot to close the wheeled suitcase and when putting it on the ground, my MacBook Pro fell out. A quick inspection revealed that…

Poem: "Wrong Address" - a tribute to victims of Newtown massacre

Debu Panda - Fri, 2012-12-21 13:49



When I dropped you off at the school last Friday
Your eyes were so bright
Face so charming;
You told me with your sweet voice
The weekend plans of your choice
The ballet class on Saturday
Bowling on friend’s birthday;
Helping your mom in Christmas shopping
Putting more decorations and lightings
Every itty-bitty thing that ran through your mind;
Without which I am deaf, dumb and blind.
You were so excited
To paint your walls blue and roof with red
And a new white bed
To match with the furry cat.
All your dreams and wishes
Shattered like glass
When the devil sprayed bullets in your class  
Death arrived at the wrong address
Making you an angel and driving us hapless.

I found a piece of your shattered dream
In your Barbie bag
 The incomplete Christmas drawing
Vivid and enigmatic;
Holding that your mom
Inconsolable and desolate
Sitting by the Christmas tree
Decorated by thee
 With colorful ornaments
Candy canes and reindeers
The shiny golden stars
How proud they are
That you have become a tiny star in the sky
To give light to the stranger who lost his way.

The mailman returned your letter to Santa Claus
Marking “Undeliverable address”;
You wrote in big letters
With your tiny hands 
Asked for a cute little brother
And a new dollhouse for you
A purse for your mommy
A new job for me
Happiness for your grandma
And world peace for your grandpa
Probably your letter arrived at wrong address
Saint of Death delivered your sacrifice as our present.
We are hapless
Powerless living cadavers
We forgive the devil for the sin
We cannot stop his toy named “Gun”!

I still miss
Your goodnight kiss
I want to cuddle you again
But I don’t know the address for heaven.

(Note: The recent massacre at Sandy Hook Elementary has probably broken everybody’s heart. I wrote this poem to express my feelings as a father, as a tribute to the victims of that mindless killing. I do not know any of the victims personally.)



