shell script to trace Duplicate Files Script [message #226722] Mon, 26 March 2007 05:28
puneetsachar
Messages: 97
Registered: March 2005
Location: New Delhi, India
Member

Hi everyone

Please help me create a script that runs daily, captures duplicate file names and other invalid file names (i.e. names not matching the expected file name pattern), and sends an email when such situations occur. The whole thing should be automated.


Please help..

I'm not good with shell scripting..
Re: shell script to trace Duplicate Files Script [message #226725 is a reply to message #226722] Mon, 26 March 2007 05:33
Mahesh Rajendran
Messages: 10672
Registered: March 2002
Location: oracleDocoVille
Senior Member
Account Moderator
No two files can have the same name in the same directory.
Are you looking to check the contents?
What particular pattern are you looking for?
Please rephrase your question.
Re: shell script to trace Duplicate Files Script [message #226726 is a reply to message #226722] Mon, 26 March 2007 05:53
puneetsachar

Hi Mahesh

Thanks... for replying..

There could be an instance where we get datafiles like:

/u10/oracle/erpprod/abcd01.dbf
/u20/oracle/erprpod/abcd01.dbf
****
****
****


SQL, something like:

-- duplicate file names (same basename under different mount points)
SELECT count(*), substr(name, 26) filename
  FROM v$datafile
 GROUP BY substr(name, 26)
HAVING count(*) > 1;

and

-- files for filenames that do not have .dbf on the end
select name from v$datafile
where name not like '/u%/oracle/erp%data/%.dbf';

-- files for filenames with only the name .dbf
select name from v$datafile
where name like '/u%/oracle/erp%data/.dbf';

-- files for filenames with embedded spaces or trailing spaces
select name from v$datafile
where name like '% %';
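
The queries above all boil down to basename checks, so they could also run as one shell pass over a spooled list of names from v$datafile. A minimal sketch, not a tested production script: the function name check_datafiles is my own, and the erpprod paths in the demo are the ones quoted above.

```shell
#!/bin/sh
# check_datafiles: read datafile paths (one per line) on stdin and report
# duplicate basenames, names without a .dbf suffix, bare ".dbf" names,
# and paths containing spaces -- the same four checks as the queries above.
check_datafiles() {
    awk -F/ '
    {
        base = $NF                              # filename after the last "/"
        if (base !~ /\.dbf$/)    print "BAD SUFFIX: " $0
        else if (base == ".dbf") print "EMPTY NAME: " $0
        if ($0 ~ / /)            print "HAS SPACE:  " $0
        paths[base] = paths[base] " " $0
        count[base]++
    }
    END {
        for (b in count)
            if (count[b] > 1) print "DUPLICATE:  " b " ->" paths[b]
    }'
}

# example run against the two paths quoted earlier in this post;
# reports abcd01.dbf as duplicated
printf '/u10/oracle/erpprod/abcd01.dbf\n/u20/oracle/erpprod/abcd01.dbf\n' |
    check_datafiles
```

The input could come from a `sqlplus -s` spool of `select name from v$datafile`, which avoids hard-coding the substr offset 26 from the query above.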


Please help

Thanks
Puneet

Re: shell script to trace Duplicate Files Script [message #226734 is a reply to message #226726] Mon, 26 March 2007 06:25
S.Rajaram
Messages: 1027
Registered: October 2006
Location: United Kingdom
Senior Member
V$DATAFILE holds information about the datafiles that belong to the Oracle database; a file appears there only once you create it to add storage to the database. What exactly do you want, and why do you need a shell script for this?
Re: shell script to trace Duplicate Files Script [message #226737 is a reply to message #226722] Mon, 26 March 2007 06:34
puneetsachar

I understand what you said, Mahesh.

There are cases where a file name gets duplicated across file systems, and management wants to trace the problem now and then.

Therefore they want an automated process to trace this:
a crontab job that runs and reports any duplicated file name, so that during the next downtime we can replace the datafile with the correct name or pattern.
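
For the crontab side, a minimal sketch: a daily entry such as `0 6 * * * /home/oracle/scripts/dupfile_check.sh` (path hypothetical), where the wrapper mails the report only when it is non-empty. The helper below takes the mail command as a parameter so the logic can be exercised without a configured mailer; every name in it is an assumption, not a tested production script.

```shell
#!/bin/sh
# mail_if_nonempty FILE CMD [ARGS...] -- feed FILE to CMD only when FILE
# has content; returns 1 when there was nothing to report.
# In a cron wrapper this would be called as, e.g.:
#   mail_if_nonempty /tmp/dupfile.rpt mailx -s "datafile problems" dba@example.com
# (report path and address are placeholders)
mail_if_nonempty() {
    rpt=$1; shift
    if [ -s "$rpt" ]; then      # -s: file exists and is non-empty
        "$@" < "$rpt"
    else
        return 1                # empty report: send nothing
    fi
}
```

Gating on `[ -s ]` means the DBA only gets mail on the days a bad file name actually turns up, which is what "emails with those situations when they occur" asks for.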

Please help !!!
Re: shell script to trace Duplicate Files Script [message #226758 is a reply to message #226722] Mon, 26 March 2007 08:20
puneetsachar

please help
Re: shell script to trace Duplicate Files Script [message #226809 is a reply to message #226758] Mon, 26 March 2007 11:15
Mahesh Rajendran
An automated job for this purpose is really not required.
How many times a day are you going to add a datafile?
How many times will you create it wrong again and again (pattern/directory, whatever)?
Your search "pattern" is not logically correct.
For instance, the SYSTEM tablespace by default gets created as
system01.dbf (at least in 10g) or something like that.
A simple script like this could do the job.
It could be a one-liner; for readability I broke it into two lines.
You can make it more functional using regular expressions,
but you first need to come up with a pattern to search for.
Using grep -v you can exclude whatever you don't want.
If your filesystems always start with /u*, include that in the find.
-bash-3.00$ cat somescript
find / . -name "sys*.d*" >/tmp/temp1 2>&1
cat /tmp/temp1 | grep -v "find: cannot read dir" | grep -v "/usr" | grep -v "/etc"
-bash-3.00$
-bash-3.00$
-bash-3.00$ ./somescript
/sand/sand/system01.dbf
/sand/sand/sysaux01.dbf
/source/source/system01.dbf
/source/source/sysaux01.dbf
/target/target/system01.dbf
/target/target/sysaux01.dbf
/target/target/sysaux01.data_this_is_wrong
/whse/widw4l9/system01.dbf
/whse/widw4l9/sysaux01.dbf
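
The grep -v exclusions above can also be pushed into find itself with `! -name`, which stays portable (no GNU extensions needed). A sketch only: the helper name scan_bad_names and the demo directory layout are made up for illustration and are not part of the script above.

```shell
#!/bin/sh
# scan_bad_names DIR... -- print every regular file under the given roots
# whose name does not end in .dbf (the "wrong pattern" case above).
scan_bad_names() {
    find "$@" -type f ! -name '*.dbf' -print 2>/dev/null
}

# demo against a throwaway tree modelled on the session output above
demo=$(mktemp -d)
mkdir -p "$demo/target/target"
touch "$demo/target/target/sysaux01.dbf" \
      "$demo/target/target/sysaux01.data_this_is_wrong"
scan_bad_names "$demo"      # only the .data_this_is_wrong file is printed
rm -rf "$demo"
```

Anchoring the roots (e.g. `scan_bad_names /u*/oracle`) keeps the scan off /usr and /etc without needing the grep -v filters.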

Re: shell script to trace Duplicate Files Script [message #227118 is a reply to message #226809] Tue, 27 March 2007 06:27
puneetsachar

Mahesh

I agree with you, but it's a management call.
Anyway, I did get some clues.

I'm working on it; once finished, I will share it with you.

Thanks for support