
Possible Y2K trap in using DECODE?

From: Jurij Modic <jmodic_at_src.si>
Date: Fri, 08 Jan 1999 23:28:45 GMT
Message-ID: <3696949e.1025342@news.arnes.si>


Today I came across a very interesting situation which I think could be a potential Y2K problem.

Here is a scenario:
In one of our projects we receive source data from a mainframe, load it with SQL*Loader into temporary tables, and from there transform it according to the required business rules into the destination tables of the data warehouse. Pretty standard, I guess. There are a couple of date fields in the source files and they are recorded as 8-character strings in the format 'DDMMYYYY'. Given this, one could conclude there is no actual Y2K danger, as the year part of the dates is passed with the century, not only the last two digits of the year. But there is a catch. One of the business rules says:

"Dates are passed as character strings in the format 'DDMMYYYY' and should be stored as DATE datatype in the Oracle database. Hovewer, if the date was unknown, it was stored as '00000000' or '01010001' in a legacy system and in these cases dates should be stored as NULL values in the Oracle database".

To fulfill this rule, we used the DECODE function in the insert statements of our stored procedures, something like:

INSERT INTO destination_table (date_column)
  SELECT DECODE(char_date, '00000000', NULL, '01010001', NULL,
                TO_DATE(char_date,'DDMMYYYY'))
  FROM temp_table;
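
For concreteness, this is roughly the shape of the staging data the statement above runs against. This is only my sketch: the CHAR(8) length and the sample values are assumptions based on the rule above, not the project's actual DDL.

-- Hypothetical staging table and sample rows (names match the statement above)
CREATE TABLE temp_table (char_date CHAR(8));

INSERT INTO temp_table VALUES ('31122000');   -- 31 December 2000, as 'DDMMYYYY'
INSERT INTO temp_table VALUES ('00000000');   -- "unknown date" marker from the legacy system
INSERT INTO temp_table VALUES ('01010001');   -- the other "unknown date" marker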

At first glance everything seems to be as it should be: no potential Y2K bug is possible, since there is an explicit character-to-date conversion with the TO_DATE function, using a format mask with a 4-digit year. But guess what: there is a very high possibility that dates from the 21st century will be stored as dates in the 20th century!

Here is a quick demonstration:

SQL> SELECT sysdate FROM dual;

SYSDATE
--------
08.01.99

SQL> CREATE TABLE destination_table (date_column DATE);

Table created.

SQL> INSERT INTO destination_table (date_column)
  2  SELECT DECODE('31.12.2000', '00000000', NULL,
  3                TO_DATE('31.12.2000','DD.MM.YYYY'))
  4  FROM DUAL;

1 row created.

SQL> SELECT TO_CHAR(date_column,'DD.MM.YYYY') FROM destination_table;

TO_CHAR(DATE_COLUMN,'DD.MM.YYYY')
---------------------------------
31.12.1900

The date_column should contain a date in the year 2000, not 1900!!!! There is absolutely nothing wrong with the TO_DATE function or the format mask. Actually, even if we directly use a DATE datatype as the result expression of the DECODE function, dates past the year 2000 will be stored as dates in the 20th century. Example:

SQL> TRUNCATE TABLE destination_table;

Table truncated.

SQL> SELECT TO_CHAR(sysdate+1000,'DD.MM.YYYY') FROM dual;

TO_CHAR(SYSDATE+1000,'DD.MM.YYYY')
----------------------------------
04.10.2001

SQL> INSERT INTO destination_table (date_column)
  2  SELECT DECODE('01.11.1999', '00000000', NULL, sysdate+1000)
  3  FROM DUAL;

1 row created.

SQL> SELECT TO_CHAR(date_column,'DD.MM.YYYY') FROM destination_table;

TO_CHAR(DATE_COLUMN,'DD.MM.YYYY')
---------------------------------
04.10.1901

Instead of in the year 2001, the date was stored in the year 1901!!!

This is all because we used the DECODE function. And it can't even be considered an Oracle bug, because this particular behaviour of DECODE is described in the docs; I just believe most people haven't recognized this behaviour as a potential Y2K bomb. So what is it about this unusual behaviour of DECODE? Here is a quote from the SQL manual:

"Oracle7 automatically converts the return value to the same datatype as the first result. If the first result has the datatype CHAR or if the first result is null, then Oracle7 converts the return value to the datatype VARCHAR2."

In our case, the first result in the DECODE was a NULL, so DECODE implicitly converted the regular DATE value to a VARCHAR2 string using the default date format mask (with only 2 digits for the year part), thereby losing the information about the century. When inserting the record into the table, Oracle again implicitly converted the string into a date, but the century information was already lost in the previous conversion, so it used the current century. There is no programmatic cure for this behaviour: no additional TO_DATE or TO_CHAR functions inside or outside of the DECODE can help here, and no 'RR' or 'RRRR' format masks can help. The only way to handle this problem is to use a 4-digit year in the default date format of the session (either by setting NLS_DATE_FORMAT in the client-side environment or with ALTER SESSION SET NLS_DATE_FORMAT at the beginning of the session), as sketched below.
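
Here is a sketch of that session-level workaround, re-using the first demonstration. The output shown is what I would expect once the session date format carries a 4-digit year, not a captured run, and the format mask itself is of course site-specific.

SQL> ALTER SESSION SET NLS_DATE_FORMAT = 'DD.MM.YYYY';

Session altered.

SQL> TRUNCATE TABLE destination_table;

Table truncated.

SQL> INSERT INTO destination_table (date_column)
  2  SELECT DECODE('31.12.2000', '00000000', NULL,
  3                TO_DATE('31.12.2000','DD.MM.YYYY'))
  4  FROM DUAL;

1 row created.

SQL> SELECT TO_CHAR(date_column,'DD.MM.YYYY') FROM destination_table;

TO_CHAR(DATE_COLUMN,'DD.MM.YYYY')
---------------------------------
31.12.2000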

Up until now I was under the impression that if I handle all character-to-date conversions with an explicit TO_DATE function and 'YYYY' in the format mask, there is no possibility of Y2K trouble. Now I'm convinced that this approach just isn't 100% safe. So I guess it's time to check how we handled dates when using the DECODE function in all of our SQL and PL/SQL programs....

Any comments?

Jurij Modic <jmodic_at_src.si>
Certified Oracle7 DBA (OCP)



The above opinions are mine and do not represent any official standpoint of my employer.

Received on Fri Jan 08 1999 - 17:28:45 CST
