Re: filter used tables out of complex script

From: Jan Gelbrich <j_gelbrich_at_westfalen-blatt.de>
Date: Fri, 5 Mar 2004 08:42:49 +0100
Message-ID: <c29b7o$1qpft7$1_at_ID-93924.news.uni-berlin.de>


"Jeroen Houben" <huuub_at_skynet.be> schrieb im Newsbeitrag news:1ba301aa.0403040731.45bfee03_at_posting.google.com...
> situation:
> On unix we have a lot of reports (.sql scripts) that create temporary
> tables etc.
> In the end it spools a report.
> (bad system but historic.. etc...)
> question:
> Is there a script/tool(perl...) that is able to filter the used tables
> out of all the statements in the sql script? (Out of the statements
> like create table as select ... etc.)
> It's pretty complex I guess. But it can be done.
>
> In short: it's to automatically extract the used tables out of sql
> files...
> (sorry for English :s)

Sounds like reverse engineering.
Many approaches are possible.
There are third-party tools like the Application Miner Suite from CAST, and many other tools or scripts out there, each for different needs (whether you only want documentation or something more).

You could use on-board functionality, too, and it would not be too difficult:
the information about tables and dependencies can be found in the data dictionary views
USER_TABLES, DBA_DEPENDENCIES and so on,
and your scripts could be read with the UTL_FILE built-in package, for example. The rest is PL/SQL coding and some output settings in SQL*Plus to suit your needs - no Java required ;)
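To make the idea concrete, here is a minimal PL/SQL sketch along those lines. It is not a SQL parser, just a crude word match of each script line against the tables you own. The directory object REPORT_DIR, the procedure name list_used_tables and the file name in the call are placeholders I made up; adjust them for your environment, and expect false positives (table names inside comments, or names that are substrings of longer identifiers) and repeated hits.

CREATE OR REPLACE PROCEDURE list_used_tables (p_file IN VARCHAR2) IS
  l_handle  UTL_FILE.FILE_TYPE;
  l_line    VARCHAR2(32767);
BEGIN
  -- REPORT_DIR is an assumed directory object pointing at the script folder
  l_handle := UTL_FILE.FOPEN('REPORT_DIR', p_file, 'R');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(l_handle, l_line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;  -- end of file
    END;
    -- compare every table owned by the current user against the line;
    -- a table referenced on several lines will be reported several times
    FOR t IN (SELECT table_name FROM user_tables) LOOP
      IF INSTR(UPPER(l_line), t.table_name) > 0 THEN
        DBMS_OUTPUT.PUT_LINE(p_file || ': ' || t.table_name);
      END IF;
    END LOOP;
  END LOOP;
  UTL_FILE.FCLOSE(l_handle);
END list_used_tables;
/

You would call it from SQL*Plus with SET SERVEROUTPUT ON and then
EXEC list_used_tables('daily_report.sql'); and spool or post-process the
output as needed. For anything beyond a quick inventory you would want to
join in DBA_DEPENDENCIES and de-duplicate the results.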

hth, Jan