
DBILogger(3pm) User Contributed Perl Documentation DBILogger(3pm)


NAME

Apache::DBILogger - Tracks what's being transferred in a DBI database


SYNOPSIS

  # Place this in your Apache's httpd.conf file
  PerlLogHandler Apache::DBILogger

  PerlSetVar DBILogger_data_source    DBI:mysql:httpdlog
  PerlSetVar DBILogger_username       httpduser
  PerlSetVar DBILogger_password       secret
  PerlSetVar DBILogger_table          requests

Create a database with a table named requests like this:

  CREATE TABLE requests (
    server      varchar(127) DEFAULT ''                    NOT NULL,
    bytes       mediumint(9) DEFAULT '0'                   NOT NULL,
    user        varchar(15)  DEFAULT ''                    NOT NULL,
    filename    varchar(200) DEFAULT ''                    NOT NULL,
    remotehost  varchar(150) DEFAULT ''                    NOT NULL,
    remoteip    varchar(15)  DEFAULT ''                    NOT NULL,
    status      smallint(6)  DEFAULT '0'                   NOT NULL,
    timeserved  datetime     DEFAULT '0000-00-00 00:00:00' NOT NULL,
    contenttype varchar(50)  DEFAULT ''                    NOT NULL,
    urlpath     varchar(200) DEFAULT ''                    NOT NULL,
    referer     varchar(250) DEFAULT ''                    NOT NULL,
    useragent   varchar(250) DEFAULT ''                    NOT NULL,
    usertrack   varchar(100) DEFAULT ''                    NOT NULL,
    KEY server_idx (server),
    KEY timeserved_idx (timeserved)
  );

Please note that for some databases (notably PostgreSQL) you will need to double-quote the user column name (that is, specify it as "user" varchar(15)) so that the database does not mistake it for a keyword.
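For example, the start of a PostgreSQL-flavoured table definition might look like this. This is a sketch only: the quoting of "user" is the point being illustrated, and PostgreSQL has no mediumint type, so any numeric types you carry over from the MySQL definition above would need substitutes such as integer.

```sql
-- PostgreSQL sketch: "user" must be quoted because it is a reserved word.
CREATE TABLE requests (
  server varchar(127) DEFAULT '' NOT NULL,
  bytes  integer      DEFAULT 0  NOT NULL,
  "user" varchar(15)  DEFAULT '' NOT NULL
  -- remaining columns as in the MySQL definition above
);
```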

It's recommended that you include

  use Apache::DBI;
  use DBI;
  use Apache::DBILogger;

in your script. Please read the Apache::DBI documentation for further information.
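If you configure everything from httpd.conf rather than a startup script, one common way to preload these modules is a <Perl> section. This is a sketch; a startup.pl pulled in via PerlRequire works equally well.

```perl
# In httpd.conf -- preload the modules once, at server startup
<Perl>
use Apache::DBI;   # load before DBI so database connections are cached
use DBI;
use Apache::DBILogger;
</Perl>
```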


DESCRIPTION

This module tracks what's being transferred by the Apache web server in an SQL database (anything with a DBI/DBD driver). This allows you to gather statistics on almost everything without having to parse the log files (much like the Apache::Traffic module, but in a "real" database, and with a lot more information logged).

Apache::DBILogger will track the cookie from 'mod_usertrack' if it's there.

After installation, follow the instructions in the synopsis and restart the server.

The statistics are then available in the database. See the section VIEWING STATISTICS for more details.


PREREQUISITES

You need to have compiled mod_perl with the LogHandler hook in order to use this module. Additionally, the following modules are required:

        o DBI
        o Date::Format


INSTALLATION

To install this module, move into the directory where this file is located and type the following:

        perl Makefile.PL
        make test
        make install

This will install the module into the Perl library directory.

Once installed, you will need to modify your web server's configuration file so it knows to use Apache::DBILogger during the logging phase.


VIEWING STATISTICS

Please see the bin/ directory in the distribution for a statistics script.

Some fun examples of what you can do include:

hit count and total bytes transferred from one virtual server
    select count(*),sum(bytes) from requests
    where server=""
hit count and total bytes from all servers, ordered by number of hits
    select server,count(*) as hits,sum(bytes) from requests
    group by server order by hits desc
count of hits from Macintosh users
    select count(*) from requests where useragent like "%Mac%"
hits and total bytes in the last 30 days
    select count(*),sum(bytes) from requests
    where server="" and TO_DAYS(NOW()) - TO_DAYS(timeserved) <= 30
This is pretty suboptimal, since wrapping timeserved in a function keeps the timeserved_idx index from being used. It would be faster to calculate the cutoff date in Perl and write it into the SQL query, using e.g. Date::Format.
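A sketch of that approach, using the Date::Format module listed under the prerequisites (time2str is its standard formatting function; the table and column names are the ones from the schema above):

```perl
use strict;
use warnings;
use Date::Format;  # provides time2str()

# Compute the cutoff date 30 days ago in Perl, once, so the query
# compares timeserved against a constant and can use timeserved_idx.
my $cutoff = time2str('%Y-%m-%d %H:%M:%S', time() - 30 * 24 * 60 * 60);

my $sql = "select count(*),sum(bytes) from requests " .
          "where timeserved >= '$cutoff'";
```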
hits and total bytes on Mondays
    select count(*),sum(bytes) from requests where
    server="" and dayofweek(timeserved) = 2

It's often pretty interesting to view the referer info too.
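For instance, a sketch of a top-ten referer query against the requests table (standard MySQL syntax; rows with an empty referer are skipped since they are just direct hits):

```sql
-- Top ten referers, most common first; skip direct hits
select referer, count(*) as hits
from requests
where referer <> ''
group by referer
order by hits desc
limit 10;
```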

See your SQL server's documentation for more examples. I'm a happy MySQL user myself, so the examples above use MySQL syntax.


OPTIMIZING

MySQL read-locks a table while a select runs against it. On a big table (like a large httpd log) this can take a while, during which your httpds can't insert new log entries, which will make them hang until the select is done.

One way to work around this is to create a second table (e.g. requests_insert) and have the httpds insert into that table instead.

Then run a script from crontab once in a while which does something like this:

  LOCK TABLES requests WRITE, requests_insert WRITE;
  INSERT INTO requests SELECT * FROM requests_insert;
  DELETE FROM requests_insert;
  UNLOCK TABLES;

You can use the script from the bin/ directory.

Please note that this won't work if you have any unique id field! You'll get duplicates and your new rows won't be inserted, just deleted. Be careful.
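In Perl DBI terms, such a cron script might look like the sketch below. The data source, username, and password are placeholders matching the synopsis; the table names are the ones used above.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Placeholder connection parameters -- substitute your own.
my $dbh = DBI->connect('DBI:mysql:httpdlog', 'httpduser', 'secret',
                       { RaiseError => 1 });

# Lock both tables so no rows arrive between the copy and the delete.
$dbh->do('LOCK TABLES requests WRITE, requests_insert WRITE');
$dbh->do('INSERT INTO requests SELECT * FROM requests_insert');
$dbh->do('DELETE FROM requests_insert');
$dbh->do('UNLOCK TABLES');

$dbh->disconnect;
```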


BUGS

I've experienced problems with 'Packets too large' when using Apache::DBI, MySQL, and DBD::mysql 2.00 (from the Msql-mysql 1.18x packages). The DBD::mysql module from Msql-mysql 1.19_17 seems to work fine with Apache::DBI.

You might get problems with Apache 1.2.x. (Not supporting post_connection?)


The official version of this module, as Ask Bjoern Hansen last modified it, lacks support for the API changes introduced with Apache 2.x and the corresponding mod_perl 2.x. Of course, this is quite understandable, as the module was last updated in 1998 ;-) Still, the module does its job quite well, and users still require its functionality.

For any help requests regarding this module on Apache 2 systems, contact Gunnar Wolf <> directly. If your system is based on Debian GNU/Linux, you can use the regular Debian bugtracking facilities, as the multi-API patch was introduced specifically for Debian.


SUPPORT

This module is supported via the mod_perl mailing list.

I would like to know which databases this module has been tested on, so please mail me if you try it.

The latest version can be found on your local CPAN mirror.


COPYRIGHT

Copyright (C) 1998, Ask Bjoern Hansen <>. All rights reserved. This module is free software; you may redistribute it and/or modify it under the same terms as Perl itself.


SEE ALSO

perl(1), mod_perl(3)
2018-06-21 perl v5.26.2