Limit and filter logs

Drop duplicate logs.
Allow logs to be grouped, enforcing a maximum number of logs per group.
Add the LOG_FILTER setting so the configuration file can ask to ignore some
logs (of level up to warning).
This commit is contained in:
Rogdham 2014-04-01 20:44:09 +02:00
commit d9b0091357
8 changed files with 146 additions and 33 deletions

@@ -143,3 +143,41 @@ and Python 3 at the same time:
changed it where I felt necessary.
- Changed xrange() back to range(), so it is valid in both Python versions.
Logging tips
============
Try to use logging with appropriate levels.
For logging messages that are not repeated, use the usual Python way:
# at top of file
import logging
logger = logging.getLogger(__name__)
# when needed
logger.warning('A warning that could occur only once')
However, if you want to log messages that may occur several times, give the
logging method a tuple instead of a string, with two elements:
1. The message to log for this very execution
2. A generic message that will appear if the previous one would occur too many
times.
For example, if you want to log missing resources, use the following code:
for resource in resources:
    if resource.is_missing:
        logger.warning((
            'The resource {r} is missing'.format(r=resource.name),
            'Other resources were missing'))
The logs will be displayed as follows:
WARNING: The resource prettiest_cat.jpg is missing
WARNING: The resource best_cat_ever.jpg is missing
WARNING: The resource cutest_cat.jpg is missing
WARNING: The resource lolcat.jpg is missing
WARNING: Other resources were missing
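The grouping behaviour described above can be sketched as a standard
``logging.Filter``. This is an illustrative implementation, not the one from
this commit: the class name ``GroupLimitFilter`` and the ``max_per_group``
parameter are assumptions made for the example.

```python
import logging


class GroupLimitFilter(logging.Filter):
    """Sketch: enforce a maximum number of logs per group.

    When ``record.msg`` is a ``(specific, generic)`` tuple, the specific
    message is shown up to ``max_per_group`` times; after that, the generic
    message is shown once and further records of the group are dropped.
    """

    def __init__(self, max_per_group=3):
        super().__init__()
        self.max_per_group = max_per_group
        self.counts = {}  # generic message -> number of records seen

    def filter(self, record):
        if not isinstance(record.msg, tuple):
            return True  # regular log message, let it through unchanged
        specific, generic = record.msg
        count = self.counts.get(generic, 0) + 1
        self.counts[generic] = count
        if count <= self.max_per_group:
            record.msg = specific  # still under the limit: show details
            return True
        if count == self.max_per_group + 1:
            record.msg = generic  # emitted once for the whole group
            return True
        return False  # drop any further records of this group
```

Attach the filter to a logger with ``logger.addFilter(GroupLimitFilter())``;
filters added at the logger level run before any handler sees the record, so
the tuple is rewritten into a plain message by the time it is formatted.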