Oracle FAQ | Your Portal to the Oracle Knowledge Grid |
Re: Very Large Table extending after mass deletes
If I were you, I would change it to a VARCHAR2 if it doesn't need to be a
LONG. LONG columns carry several restrictions. One shows up on import:
Oracle commits after every row that contains a LONG, regardless of your
COMMIT=Y and BUFFER settings. I am currently importing my large table into
a test database, and last I checked it had been running for 36 hours. Other
restrictions: you can't reference a LONG in the SELECT list of a
CREATE TABLE ... AS SELECT statement, and you can't use a LONG in the
SELECT list of a subquery in an INSERT statement. Check out Chapter 2,
section Datatypes, on the Generic Documentation CD.
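For what it's worth, here is the sort of conversion I mean. This is only a
sketch: the table and column names (big_table, long_col, long_col_v) are made
up, and it assumes every value fits in VARCHAR2(4000). Since CTAS can't copy a
LONG directly (you get ORA-00997), you can add a new column and copy row by
row through PL/SQL, which is allowed to fetch a LONG into a VARCHAR2 variable:

```sql
-- Hypothetical names throughout; assumes all values fit in VARCHAR2(4000).
ALTER TABLE big_table ADD (long_col_v VARCHAR2(4000));

BEGIN
  -- Copy each row's LONG value into the new VARCHAR2 column.
  -- PL/SQL fetches the LONG into a local variable, so the UPDATE
  -- sees an ordinary bind value rather than a LONG expression.
  FOR r IN (SELECT ROWID rid, long_col FROM big_table) LOOP
    UPDATE big_table SET long_col_v = r.long_col WHERE ROWID = r.rid;
  END LOOP;
  COMMIT;
END;
/
```

After verifying the copy, you'd drop or rename the old LONG column and
rename the new one into its place.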
Jane
Greg Stark <greg-spare-1_at_mit.edu> wrote in message
news:874sbrasay.fsf_at_HSE-Montreal-ppp33976.qc.sympatico.ca...
> "Jonathan Lewis" <jonathan_at_jlcomp.demon.co.uk> writes:
>
> > Tables with LONGs in them tend to be major pains.
>
> We have a table with a LONG column in it (the developer was used to MSSQL,
> which had a 256-character limit on varchars), but we don't really use it for
> anything big; 100-200 bytes is typical. Should we be worried and rebuild
> this table, or are your comments and others that I've seen relevant only
> for people who are actually storing lots of data in LONGs?
>
> --
> greg
Received on Thu Feb 03 2000 - 08:34:57 CST