Eliminating duplicate log files

Vernon Schryver vjs@calcite.rhyolite.com
Thu Mar 28 00:23:29 UTC 2002


> From: Gary Mills <mills@cc.UManitoba.CA>

> Has any thought been given to modifying DCC to avoid creating duplicate
> log files?  In a one-hour interval, my DCC server created 471 log files.
> Of these, 369 had the same body checksum, 160 had the same fuz1 checksum,
> and 99 had the same fuz2 checksum.  Eliminating the duplicates would
> make it easier to review the log files and identify legitimate bulk mail.

But wouldn't that work against an equally important purpose of the
log files, which is to show users which of their mail has been
blocked and why?

What I do for my own analogous but comparatively trivial problem
is something like:

   mkdir foo
   mv `grep -l "Fuz2: 5ba36e3e d6ab5792 7280b7be 5ae1a936" *` foo

I have more than 1000 files in a directory called mbcmovieenglish.


It shouldn't be too hard to write a script that would automate that.
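For instance, here is a minimal sketch that groups log files into
per-checksum directories.  It assumes the logs sit in the current
directory and each contains a line of the form
"Fuz2: xxxxxxxx xxxxxxxx xxxxxxxx xxxxxxxx"; the fuz2-* directory
naming is my own invention:

   #!/bin/sh
   # Group log files into per-checksum directories by Fuz2 value.
   for f in *; do
       [ -f "$f" ] || continue
       # Take the first Fuz2 line and turn its checksum into a
       # directory-safe name.
       sum=`sed -n 's/^.*Fuz2: *//p' "$f" | sed 1q | tr ' ' '-'`
       [ -n "$sum" ] || continue
       mkdir -p "fuz2-$sum"
       mv "$f" "fuz2-$sum/"
   done

After a run, each fuz2-* directory holds all of the messages with
that checksum, so reading one file per directory is enough to judge
the whole group.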


Vernon Schryver    vjs@rhyolite.com


