Oracle FAQ | Your Portal to the Oracle Knowledge Grid |
Pedro,
I think your problem is still one of insufficient contiguous space to create an INITIAL or NEXT extent for the problem table. This can be caused by COMPRESS=Y on the export, as Sybrand mentioned in his reply.
I too suffer from this problem when importing customer dumps, and I've traced it down to the following, any of which could be affecting you:
What I do is run a dummy import specifying INDEXFILE=crud.sql, which creates a file full of CREATE TABLE and CREATE INDEX commands. I then edit this file to remove the STORAGE parameters. I have a C++ program to do this, which runs on my HP servers, and I can let you have the source code if you want. It isn't perfect (there seem to be many ways Oracle writes the STORAGE parameters into the file), but I think I have most of them covered :o)
(I've had 'STORAGE ( INITIAL...', or 'STORAGE (INITIAL...' or 'STORAGE(INITIAL ...' and so on - they seem to be pretty random with where they put spaces and things.)
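The stripping step above is easy to sketch in a few lines of Python rather than C++; this is my own illustrative version, not Norman's actual program. It assumes the STORAGE clause contains no nested parentheses, which holds for the usual INITIAL/NEXT/MINEXTENTS/MAXEXTENTS/PCTINCREASE keywords:

```python
import re

# Match 'STORAGE' followed by optional whitespace and a parenthesised
# clause, covering all the spacing variants mentioned above:
# 'STORAGE ( INITIAL...', 'STORAGE (INITIAL...', 'STORAGE(INITIAL...'.
# Assumes no nested parentheses inside the clause.
STORAGE_RE = re.compile(r'STORAGE\s*\([^()]*\)', re.IGNORECASE)

def strip_storage(ddl: str) -> str:
    """Remove every STORAGE clause so objects pick up the
    tablespace's default storage instead."""
    return STORAGE_RE.sub('', ddl)
```

You would read crud.sql, pass its contents through `strip_storage`, and write the result out as the new creation script.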
Once I've created a new script, I drop all the current tables completely, coalesce the tablespace, and run my script to create the objects with default storage. Then I run the import with IGNORE=Y and usually (!) it works. However, there may be a problem if the export file contains tables or indexes with columns whose names are Oracle reserved words. These are usually pretty easy to fix.
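The drop-and-coalesce preparation step can also be scripted. The hypothetical helper below scans an INDEXFILE-generated script for table names and emits the DROP statements plus a tablespace COALESCE; it assumes simple, unqualified table names (no schema prefixes):

```python
import re

def build_prep_script(create_sql: str, tablespace: str) -> str:
    """Emit DROP statements for every table found in an
    INDEXFILE-generated script, then coalesce the tablespace,
    so the IGNORE=Y import starts from a clean, defragmented state.
    Assumes unqualified table names (simple identifiers)."""
    tables = re.findall(r'CREATE TABLE\s+"?(\w+)"?', create_sql, re.IGNORECASE)
    lines = ['DROP TABLE %s CASCADE CONSTRAINTS;' % t for t in tables]
    lines.append('ALTER TABLESPACE %s COALESCE;' % tablespace)
    return '\n'.join(lines)
```

Run the generated script in SQL*Plus, then the cleaned creation script, then the import with IGNORE=Y.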
Email me if you wish to have a copy of the C++ source code.
Regards,
Norman.
Tel: 0113 289 6265 Fax: 0113 289 3146 URL: http://www.Lynx-FS.com
-------------------------------------
-----Original Message-----
From: mercadop_at_expertx.com (Pedro Mercado) [mailto:mercadop_at_expertx.com]
Posted At: Thursday, August 22, 2002 9:52 PM
Posted To: server
Conversation: Help! ORA-01658 on import, and with plenty of space
Subject: Re: Help! ORA-01658 on import, and with plenty of space
Thank you Mr. Bakker for your reply. I tried what you suggested, and it did run more efficiently with compress=n on the export. However, when I hit the largest of the tables (3GB, ~30mil records, 19 fields) it gave the error again. Why did the strategy not work with this table, and what can I do to correct this problem in the future?
Thanks,
Pedro Mercado
Received on Fri Aug 23 2002 - 05:17:40 CDT