Limit and filter logs

Drop duplicate logs.
Allow logs to be grouped, enforcing a maximum number of logs per group.
Add the LOG_FILTER setting so the configuration file can ignore some
logs (of level up to warning).
This commit is contained in:
Rogdham 2014-04-01 20:44:09 +02:00
commit d9b0091357
8 changed files with 146 additions and 33 deletions
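The LOG_FILTER setting described in the commit message can be thought of as a list of (level, message) pairs to silence, applied only to records of level up to warning. A minimal sketch of such a filter, assuming a hypothetical `IgnoreFilter` class name and constructor (not the commit's actual implementation):

```python
import logging


class IgnoreFilter(logging.Filter):
    """Drop records matching configured (level, message) pairs.

    A sketch of what a LOG_FILTER-style setting could drive; the
    class name and constructor are assumptions, not Pelican's code.
    """

    def __init__(self, ignore):
        super().__init__()
        # e.g. LOG_FILTER = [(logging.WARNING, 'Unable to find file.png')]
        self.ignore = set(ignore)

    def filter(self, record):
        # Only suppress messages of level up to warning, as the
        # commit message describes; errors always pass through.
        if record.levelno > logging.WARNING:
            return True
        return (record.levelno, record.getMessage()) not in self.ignore
```

Attaching such a filter to the logger (`logger.addFilter(IgnoreFilter(settings['LOG_FILTER']))`) would silence the configured messages without touching the call sites.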


@@ -239,8 +239,10 @@ class Content(object):
                                 self._context['filenames'][path].url))
             origin = origin.replace('\\', '/')  # for Windows paths.
         else:
-            logger.warning("Unable to find {fn}, skipping url"
-                           " replacement".format(fn=path))
+            logger.warning(("Unable to find {fn}, skipping url"
+                            " replacement".format(fn=value),
+                            "Other ressources were not found"
+                            " and their urls not replaced"))
     elif what == 'category':
         origin = Category(path, self.settings).url
     elif what == 'tag':
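The tuple passed to logger.warning in the new code pairs a specific message with a generic group message. A filter installed on the logger can then deduplicate exact repeats and, once a group reaches some cap, emit the generic message once and drop the rest. A minimal sketch of that idea, assuming a hypothetical `LimitFilter` class and threshold (not the commit's actual implementation):

```python
import logging


class LimitFilter(logging.Filter):
    """Drop duplicate records and cap records per group.

    A sketch only: the class name and LIMIT_PER_GROUP threshold
    are assumptions, not Pelican's actual implementation.
    """

    LIMIT_PER_GROUP = 5  # hypothetical cap

    def __init__(self):
        super().__init__()
        self._seen = set()        # exact messages already emitted
        self._group_count = {}    # records counted per group

    def filter(self, record):
        # A record logged as logger.warning((msg, group)) arrives
        # with a tuple msg; unpack it into message + group key.
        if isinstance(record.msg, tuple):
            record.msg, group = record.msg
        else:
            group = None
        message = record.getMessage()
        if message in self._seen:
            return False          # drop exact duplicates
        self._seen.add(message)
        if group is not None:
            count = self._group_count.get(group, 0) + 1
            self._group_count[group] = count
            if count == self.LIMIT_PER_GROUP:
                # at the cap: emit the generic group message instead
                record.msg = group
                record.args = ()
                return True
            if count > self.LIMIT_PER_GROUP:
                return False      # over the cap: drop silently
        return True
```

With this in place, the call site from the diff would log each "Unable to find …" warning up to the cap, then a single "Other ressources were not found and their urls not replaced" in place of the remainder.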