Changes for version 1.18

  • Replaced DB_File with SDBM_File in RobotsTxt filter
  • Modified Repeated filter to use an SDBM_File for persistent storage
  • Tightened up test cases for RobotsTxt and Repeated
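The switch to SDBM_File (a core Perl module) means the filter's state is kept in an on-disk DBM file rather than requiring the external Berkeley DB library that DB_File needs. A minimal sketch of that approach, not taken from the distribution itself (the file name and key format here are illustrative only):

```perl
use strict;
use warnings;
use Fcntl qw(O_RDWR O_CREAT);
use SDBM_File;

# Tie a hash to an on-disk SDBM file so counts persist between runs.
# 'repeated-db' is a hypothetical file name, not the one the module uses.
tie my %seen, 'SDBM_File', 'repeated-db', O_RDWR | O_CREAT, 0666
    or die "Couldn't tie SDBM file: $!";

# Count hits per key (e.g. an "IP + identifier" pair) to spot repeats.
$seen{'192.0.2.1 oai:example:123'}++;

untie %seen;
```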

Modules

Process Web log files for institutional repositories
Base class for filters
Catch fulltext events and check for repeated requests
Filter Web log hits using a database of robots' IP addresses
Simple session class
Parse combined logs like those generated from Apache
Filter to remove existing hits
Discover the 'institution' that a user comes from
Map URLs to repository behaviour
Map DSpace logs to requests
Parse Apache logs from GNU EPrints
Parse Apache logs from an arXiv mirror
Parse Web server logs that are formatted as one hit per line (e.g. Apache)
Parse hits from an OAI-PMH interface

Provides

in lib/Logfile/EPrints/Filter.pm
in lib/Logfile/EPrints/Filter/Session.pm
in lib/Logfile/EPrints/Filter/Robots.pm
in lib/Logfile/EPrints/Filter/Session.pm
in lib/Logfile/EPrints/Hit.pm
in lib/Logfile/EPrints/Hit.pm
in lib/Logfile/EPrints/Parser/OAI.pm
in lib/Logfile/EPrints/Hit.pm
in lib/Logfile/EPrints/Period.pm
in lib/Logfile/EPrints/Repeated.pm
in lib/Logfile/EPrints/RobotsFilter.pm
in lib/Logfile/EPrints/RobotsTxtFilter.pm