Re: working with too many files

From: Philip Lijnzaad <lijnzaad_at_ebi.ac.uk>
Date: 16 Nov 2000 08:51:17 +0000
Message-ID: <u73dgs7116.fsf_at_o2-3.ebi.ac.uk>


[ note: this is off topic; you'd better ask comp.unix.shell or comp.lang.perl if this doesn't help ]

Osvaldo> I have a lot of files in a UNIX subdirectory and I need to perform some
Osvaldo> calculations on the data for each pair of files.

Have you tried a little shell script along the following lines:

#!/bin/sh -x

cd somewhere
outputdir=somewhere-else

for a in *.data
do
  for b in *.data
  do
    # handle each unordered pair only once: skip when $b sorts after $a
    # (the > must be escaped as \>; a bare > inside [ ] is a redirection)
    if [ "$b" \> "$a" ]
    then
      continue
    fi
    run-some-program -options "$a" "$b" > "$outputdir/results.$a.$b"
  done
done

This will only work if there are not too many files (say, fewer than 1000 or so). If you have more than that, you'll have to use something like xargs or find to step through all the files; see the sketch below.
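For what it's worth, here is a rough sketch of one way to avoid expanding the glob at all, using find piped into while-read loops rather than xargs proper (pairwise combinations don't map cleanly onto xargs). 'somewhere', 'somewhere-else' and 'run-some-program' are placeholders as above, and filenames containing spaces or newlines would need extra care:

#!/bin/sh

cd somewhere
outputdir=somewhere-else

# list the files once into a temporary file, then read it twice
# (note: find also descends into subdirectories, unlike the glob;
# with GNU find you can add -maxdepth 1 to prevent that)
find . -name '*.data' -print | sed 's|^\./||' > /tmp/files.$$

while read a
do
  while read b
  do
    # handle each unordered pair only once
    if [ "$b" \> "$a" ]
    then
      continue
    fi
    # < /dev/null keeps run-some-program from eating the file list on stdin
    run-some-program -options "$a" "$b" < /dev/null > "$outputdir/results.$a.$b"
  done < /tmp/files.$$
done < /tmp/files.$$

rm -f /tmp/files.$$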

If you're worried about NFS performance, go to the machine that has the disks locally, if that is a machine you can log in to. Do 'mount' to see which machine that is.
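For example (the exact format of the output varies per system, and the names here are made up), the relevant line of 'mount' output for an NFS-mounted directory might look roughly like this; the part before the colon is the machine that actually has the disk:

fileserver:/export/data on /data type nfs (rw,hard,intr)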

Cheers,

                                                                      Philip

-- 
Time passed, which, basically, is its job. -- Terry Pratchett (in: Equal Rites)
-----------------------------------------------------------------------------
Philip Lijnzaad, lijnzaad_at_ebi.ac.uk \ European Bioinformatics Institute, rm A2-24
+44 (0)1223 49 4639 / Wellcome Trust Genome Campus, Hinxton
+44 (0)1223 49 4468 (fax) \ Cambridgeshire CB10 1SD, GREAT BRITAIN
PGP fingerprint: E1 03 BF 80 94 61 B6 FC 50 3D 1F 64 40 75 FB 53
Received on Thu Nov 16 2000 - 09:51:17 CET
