v0.6 (20/11/2022)
** User **
	Replace the track_users plugin by a filter_users plugin that can interpret conditional filters from the configuration
	Don't save all visitor requests into the database (saves disk space and computation). Can be changed in default_conf.py with the keep_requests value
	Replace the -c argument by a configuration file. Clean output is now -C
	Add favicon
	Be more strict with robots: require at least 1 hit per viewed page
	Add a plugin to anonymize IP addresses (for public statistics)
	Allow merging feeds whose names match a regular expression with the merge_feeds_parsers_list conf value
	Add all_visits_enlight display plugin
	Show last access information for feed parsers
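The keep_requests and merge_feeds_parsers_list values above could be set like this in default_conf.py (a sketch only: the option names come from the entries above, but the values are purely illustrative, not copied from the real default configuration):

```python
# Illustrative configuration sketch (values are assumptions,
# not taken from the real default_conf.py).

# Don't keep every visitor request in the database (saves disk
# space and computation during analysis).
keep_requests = False

# Merge feed parsers whose names match these regular expressions.
merge_feeds_parsers_list = [r'.*[Ff]eedly.*', r'.*[Ii]noreader.*']
```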

** Dev **
	Update data from AWStats
	Add GeoIP location information
	Normalize URLs before counting them in statistics
	Update feed detector: check for the 'feed', 'rss' or 'atom' string in the user agent
	Move code to Python 3
	Check for multimedia files using lower case
	Don't run robot analysis rules on feed parsers
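The updated feed detector described above can be sketched as a small heuristic (the function name and signature are assumptions for illustration; the real iwla code is structured differently):

```python
def is_feed_parser(user_agent):
    """Heuristic sketch of the feed detector described above: a client
    is treated as a feed parser if its user agent contains the string
    'feed', 'rss' or 'atom' (compared in lower case)."""
    ua = user_agent.lower()
    return any(token in ua for token in ('feed', 'rss', 'atom'))
```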

** Bugs **
	Fix a bug in the feed filter (must not return, just break after handling feed_name_analysed)
	Display viewed and not-viewed data for feeds (a parser can be flagged as a robot too late)

v0.5 (15/04/2020)
** User **
	Add --dry-run (-D) argument
	Add more rules for robot detection:
		More than ten 404 pages viewed
		No page and no hit
		Pages without hits
	New format for (not_)viewed pages/hits and bandwidth, which are now recorded by day (in a dictionary where only element 0 is initialized). Element 0 is the total. WARNING: not backward compatible with previous databases.
	Sync data with AWStats (develop branch: 7.7+)
	Make backup before compressing (for low-memory servers)
	Add top_pages_diff plugin
	Add IP exclusion feature
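The three robot-detection rules listed above can be sketched as follows (the dict layout and function name are hypothetical; the real iwla visitor structures differ):

```python
def looks_like_robot(visitor):
    """Sketch of the three robot-detection rules listed above.
    `visitor` is a hypothetical dict of per-visitor counters."""
    if visitor['error_404_pages'] > 10:   # more than ten 404 pages viewed
        return True
    if visitor['pages'] == 0 and visitor['hits'] == 0:  # no page and no hit
        return True
    if visitor['pages'] > 0 and visitor['hits'] == 0:   # pages without hits
        return True
    return False
```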
** Dev **
	Use cPickle instead of pickle
	Don't save all robot requests (only the first pass is kept), which saves a large amount of memory/disk space
	Handle URLs with an empty referer
	Don't try to find search engines for robots
** Bugs **
	Fix KeyError: the geo attribute may not exist
	Call post hook plugins even in display-only mode

v0.4 (29/01/2017)
** User **
	Remove crawlers from feed parsers
	Add display-only switch (-p)
	Add robot bandwidth display plugin
** Dev **
** Bugs **
	For robots, use not_viewed_pages (feeds plugin)
	gz files were not generated due to a bad time comparison
	Database compression could lead to altered files

v0.3 (12/04/2016)
** User **
	Add referers_diff display plugin
	Add year statistics in month details
	Add analysis duration
	Add browser detection
	Add operating system detection
	Add track_users plugin
	Add feeds plugin
	Add _append feature to conf.py
	Add hours_stats plugin
	Add display/top_downloads_diff plugin
	Can specify multiple files to analyze
	Add reset feature
	Add gz file support
	Add -z option (don't compress databases)
	Add support for custom search engine files
	Do reverse DNS lookups on feed parsers
	Add IPToGeo plugin
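The _append feature mentioned above could be used in conf.py along these lines (a hypothetical sketch: the option name and value below are illustrative, only the _append suffix behavior comes from the entry above):

```python
# Hypothetical conf.py sketch of the _append feature: suffixing a
# list option with '_append' extends the default value instead of
# replacing it (the option name and value are illustrative).
multimedia_files_append = ['webp']
```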

** Dev **
	Add istats_diff interface
	Sort documentation output
	Add debug traces in robots plugin
	Update AWStats data
	Remove double slashes at the end of URLs
	Remove final slashes from referers
	Add alt attribute to all img tags

** Bugs **
	Forgot <body> tag
	Bad UTC time computation
	Hits/pages in the same second were not analyzed
	Last day of month was skipped