Update documentation and ChangeLog

This commit is contained in:
Gregory Soutade 2017-05-25 21:09:02 +02:00
parent 68a67adecc
commit 83227cfad2
2 changed files with 12 additions and 1 deletion


@ -1,3 +1,13 @@
v0.5 (25/05/2017)
** User **
Add --dry-run (-D) argument
** Dev **
Use cPickle instead of pickle
Don't save all robot requests (only the first pass is kept), which saves a large amount of memory/disk space
Add one more rule to robot detection: more than ten 404 pages viewed
** Bugs **
v0.4 (29/01/2017)
** User **
Remove crawlers from feed parsers

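The cPickle switch listed in the v0.5 entries is usually implemented with a guarded import, so the same code runs on interpreters without the C module. A minimal sketch (the data values are illustrative, not iwla's actual database layout):

```python
# Prefer the C pickle implementation on Python 2; fall back elsewhere.
try:
    import cPickle as pickle  # Python 2 only: C implementation, much faster
except ImportError:
    import pickle             # Python 3: the C accelerator is used automatically

# Round-trip a small structure the way a saved database would be handled.
data = {'month': '05/2017', 'hits': 42}
blob = pickle.dumps(data, protocol=pickle.HIGHEST_PROTOCOL)
restored = pickle.loads(blob)
```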

@ -11,7 +11,7 @@ Nevertheless, iwla is only focused on HTTP logs. It uses data (robots definition
Usage
-----
./iwla [-c|--clean-output] [-i|--stdin] [-f FILE|--file FILE] [-d LOGLEVEL|--log-level LOGLEVEL] [-r|--reset year/month] [-z|--dont-compress] [-p]
./iwla [-c|--clean-output] [-i|--stdin] [-f FILE|--file FILE] [-d LOGLEVEL|--log-level LOGLEVEL] [-r|--reset year/month] [-z|--dont-compress] [-p] [-D|--dry-run]
-c : Clean output (database and HTML) before starting
-i : Read data from stdin instead of conf.analyzed_filename
@ -20,6 +20,7 @@ Usage
-r : Reset analysis to a specific date (month/year)
-z : Don't compress databases (bigger but faster, not compatible with compressed databases)
-p : Only generate display
-D : Dry run (don't write/update files to disk)
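
A dry-run flag of this kind typically just guards every write path behind one boolean. A hedged sketch of that pattern (the option names match the usage line above; `save_database` and everything else is hypothetical, not iwla's actual code):

```python
# Hypothetical sketch: gate disk writes behind a --dry-run flag.
import argparse
import pickle

parser = argparse.ArgumentParser()
parser.add_argument('-D', '--dry-run', action='store_true',
                    help="don't write/update files to disk")
args = parser.parse_args(['-D'])  # simulate running './iwla -D'

def save_database(path, data, dry_run=False):
    """Write data to path, unless dry_run is set; return whether it wrote."""
    if dry_run:
        print('dry run: would write %s' % path)
        return False
    with open(path, 'wb') as f:
        pickle.dump(data, f)
    return True

written = save_database('meta.db', {'hits': 42}, dry_run=args.dry_run)
```

With `-D` on the command line, `args.dry_run` is true and no file is created; without it, the database is written as usual.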
Basic usage
-----------