Refactor logging handling

The old system used manual string formatting for log messages.
This caused issues with common operations like exception logging,
which often need to be handled differently for Py2/Py3
compatibility. In order to unify the effort:

 - All logging is changed to the `logging.level(msg, arg1, arg2)` style.
 - A `SafeLogger` is implemented to properly auto-decode exceptions
passed in the args (ref #1403).
 - Custom formatters were overriding useful logging functionality
like traceback outputting (ref #1402). They are refactored to be
more transparent. Traceback information is provided in `--debug`
mode for `read_file` errors in generators.
 - Formatters now auto-format multiline log messages in order
to make them look related. Similarly, tracebacks are formatted in
the same fashion.
 - `pelican.log.LimitFilter` was (ab)using the logging message, which
resulted in awkward syntax for the argumented logging style. This
functionality is moved to the `extra` keyword argument.
 - Levels for errors that result in skipping a file (`read_file`)
changed from `warning` to `error` in order to make them stand out
among other logs.
 - Small consistency changes to log messages (e.g. changing all
to start with an uppercase letter) and quality-of-life improvements
(some log messages were dumping raw object information).
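The style change in the first bullet can be sketched as follows (a minimal illustration, not Pelican code; the file name is made up):

```python
import logging

logger = logging.getLogger(__name__)

# Old style: the message is pre-formatted, so the logging framework
# only ever sees the final string and cannot inspect the arguments.
logger.warning("Could not process {}".format("article.rst"))

# New style: formatting is deferred to the logging framework, which
# lets filters and Logger subclasses preprocess the args first.
logger.warning("Could not process %s", "article.rst")
```

Deferred formatting is also slightly cheaper: the string is only built if the record actually passes the level and filter checks.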
Author: Deniz Turgut
Date:   2014-07-22 11:48:15 -04:00
Commit: ed3209888a

14 changed files with 271 additions and 195 deletions


@@ -142,34 +142,65 @@ Logging tips
 Try to use logging with appropriate levels.
 
-For logging messages that are not repeated, use the usual Python way:
+For logging messages that are not repeated, use the usual Python way::
 
     # at top of file
     import logging
     logger = logging.getLogger(__name__)
 
     # when needed
-    logger.warning("A warning that would usually occur only once")
+    logger.warning("A warning with %s formatting", arg_to_be_formatted)
 
-However, if you want to log messages that may occur several times, instead of
-a string, give a tuple to the logging method, with two arguments:
+Do not format log messages yourself. Use ``%s`` formatting in messages and
+pass arguments to the logger. This is important, because the Pelican logger
+will preprocess some arguments (like Exceptions) for Py2/Py3 compatibility.
 
-1. The message to log for the initial execution
-2. A generic message that will appear if the previous one would occur too many
-   times.
+Limiting extraneous log messages
+--------------------------------
 
-For example, if you want to log missing resources, use the following code:
+If the log message can occur several times, you may want to limit the log to
+prevent flooding. In order to do that, use the ``extra`` keyword argument for
+the logging message in the following format::
+
+    logger.warning("A warning with %s formatting", arg_to_be_formatted,
+                   extra={'limit_msg': 'A generic message for too many warnings'})
+
+Optionally, you can also set ``'limit_args'`` as a tuple of arguments in the
+``extra`` dict if your generic message needs formatting.
+
+The limit is set to ``5``, i.e. the first four logs with the same
+``'limit_msg'`` are output normally, but the fifth one will be logged using
+``'limit_msg'`` (and ``'limit_args'`` if present). After the fifth,
+corresponding log messages will be ignored.
+
+For example, if you want to log missing resources, use the following code::
 
     for resource in resources:
         if resource.is_missing:
-            logger.warning((
-                'The resource {r} is missing'.format(r=resource.name),
-                'Other resources were missing'))
+            logger.warning(
+                'The resource %s is missing', resource.name,
+                extra={'limit_msg': 'Other resources were missing'})
 
-The log messages will be displayed as follows:
+The log messages will be displayed as follows::
 
     WARNING: The resource prettiest_cat.jpg is missing
     WARNING: The resource best_cat_ever.jpg is missing
     WARNING: The resource cutest_cat.jpg is missing
     WARNING: The resource lolcat.jpg is missing
     WARNING: Other resources were missing
+
+Outputting traceback in the logs
+--------------------------------
+
+If you're logging inside an ``except`` block, you may want to provide the
+traceback information as well. You can do that by setting the ``exc_info``
+keyword argument to ``True`` during logging. However, doing so by default can
+be undesired because tracebacks are long and can be confusing to regular
+users. Try to limit them to ``--debug`` mode like the following::
+
+    try:
+        some_action()
+    except Exception as e:
+        logger.error('Exception occurred: %s', e,
+                     exc_info=settings.get('DEBUG', False))
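The limiting behaviour described in the documentation above can be sketched with a stand-alone filter. This is a simplified model of what `pelican.log.LimitFilter` does with `limit_msg`, not the actual implementation (the class name and logger name here are made up):

```python
import logging
from collections import defaultdict

class GroupLimitFilter(logging.Filter):
    """Simplified model: once `threshold` records share a 'limit_msg',
    emit the generic message in place of the fifth record and drop the rest."""
    threshold = 5

    def __init__(self):
        super(GroupLimitFilter, self).__init__()
        self.group_count = defaultdict(int)

    def filter(self, record):
        # 'extra' kwargs are merged into the record's __dict__
        group = record.__dict__.get('limit_msg')
        if not group:
            return True
        key = (record.levelno, group)
        self.group_count[key] += 1
        if self.group_count[key] == self.threshold:
            # swap in the generic message for the threshold-th record
            record.msg = group
            record.args = record.__dict__.get('limit_args', ())
        elif self.group_count[key] > self.threshold:
            return False  # drop everything past the threshold
        return True

logger = logging.getLogger('limit_demo')
logger.addFilter(GroupLimitFilter())
for name in ['a.jpg', 'b.jpg', 'c.jpg', 'd.jpg', 'e.jpg', 'f.jpg']:
    logger.warning('The resource %s is missing', name,
                   extra={'limit_msg': 'Other resources were missing'})
```

With a threshold of 5 this yields four specific warnings, one generic warning, and then silence, matching the behaviour documented above.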


@@ -70,16 +70,16 @@ class Pelican(object):
         for plugin in self.settings['PLUGINS']:
             # if it's a string, then import it
             if isinstance(plugin, six.string_types):
-                logger.debug("Loading plugin `{0}`".format(plugin))
+                logger.debug("Loading plugin `%s`", plugin)
                 try:
                     plugin = __import__(plugin, globals(), locals(),
                                         str('module'))
                 except ImportError as e:
                     logger.error(
-                        "Can't find plugin `{0}`: {1}".format(plugin, e))
+                        "Cannot load plugin `%s`\n%s", plugin, e)
                     continue
 
-            logger.debug("Registering plugin `{0}`".format(plugin.__name__))
+            logger.debug("Registering plugin `%s`", plugin.__name__)
             plugin.register()
             self.plugins.append(plugin)
         logger.debug('Restoring system path')
@@ -99,7 +99,7 @@ class Pelican(object):
             for setting in ('ARTICLE_URL', 'ARTICLE_LANG_URL', 'PAGE_URL',
                             'PAGE_LANG_URL'):
-                logger.warning("%s = '%s'" % (setting, self.settings[setting]))
+                logger.warning("%s = '%s'", setting, self.settings[setting])
 
         if self.settings.get('ARTICLE_PERMALINK_STRUCTURE', False):
             logger.warning('Found deprecated `ARTICLE_PERMALINK_STRUCTURE` in'
@@ -124,7 +124,7 @@ class Pelican(object):
                             'PAGE_SAVE_AS', 'PAGE_LANG_SAVE_AS'):
                 self.settings[setting] = os.path.join(structure,
                                                       self.settings[setting])
-                logger.warning("%s = '%s'" % (setting, self.settings[setting]))
+                logger.warning("%s = '%s'", setting, self.settings[setting])
 
         for new, old in [('FEED', 'FEED_ATOM'), ('TAG_FEED', 'TAG_FEED_ATOM'),
                          ('CATEGORY_FEED', 'CATEGORY_FEED_ATOM'),
@@ -203,7 +203,7 @@ class Pelican(object):
             for v in value:
                 if isinstance(v, type):
-                    logger.debug('Found generator: {0}'.format(v))
+                    logger.debug('Found generator: %s', v)
                     generators.append(v)
 
         return generators
@@ -217,11 +217,11 @@ class Pelican(object):
         else:
             writer = writers[0]
             if writers_found == 1:
-                logger.debug('Found writer: {}'.format(writer))
+                logger.debug('Found writer: %s', writer)
             else:
                 logger.warning(
-                    '{} writers found, using only first one: {}'.format(
-                        writers_found, writer))
+                    '%s writers found, using only first one: %s',
+                    writers_found, writer)
 
         return writer(self.output_path, settings=self.settings)
@@ -308,6 +308,7 @@ def get_config(args):
         config['CACHE_PATH'] = args.cache_path
     if args.selected_paths:
         config['WRITE_SELECTED'] = args.selected_paths.split(',')
+    config['DEBUG'] = args.verbosity == logging.DEBUG
 
     # argparse returns bytes in Py2. There is no definite answer as to which
     # encoding argparse (or sys.argv) uses.
@@ -404,7 +405,7 @@ def main():
                         logger.critical(e.args)
                         raise
                     logger.warning(
-                        'Caught exception "{0}". Reloading.'.format(e))
+                        'Caught exception "%s". Reloading.', e)
                 finally:
                     time.sleep(.5)  # sleep to avoid cpu load
@@ -419,13 +420,7 @@ def main():
             pelican.run()
         except Exception as e:
-            # localized systems have errors in native language if locale is set
-            # so convert the message to unicode with the correct encoding
-            msg = str(e)
-            if not six.PY3:
-                msg = msg.decode(locale.getpreferredencoding())
-            logger.critical(msg)
+            logger.critical('%s', e)
 
             if args.verbosity == logging.DEBUG:
                 raise


@@ -232,10 +232,11 @@ class Content(object):
                         self._context['filenames'][path].url))
                 origin = origin.replace('\\', '/')  # for Windows paths.
             else:
-                logger.warning(("Unable to find {fn}, skipping url"
-                                " replacement".format(fn=value),
-                                "Other resources were not found"
-                                " and their urls not replaced"))
+                logger.warning(
+                    "Unable to find `%s`, skipping url replacement.",
+                    value.geturl(), extra={
+                        'limit_msg': ("Other resources were not found "
+                                      "and their urls not replaced")})
         elif what == 'category':
             origin = Category(path, self.settings).url
         elif what == 'tag':
@@ -358,6 +359,5 @@ def is_valid_content(content, f):
         content.check_properties()
         return True
     except NameError as e:
-        logger.error("Skipping %s: could not find information about "
-                     "'%s'" % (f, e))
+        logger.error("Skipping %s: could not find information about '%s'", f, e)
         return False


@@ -22,13 +22,14 @@ from jinja2 import (Environment, FileSystemLoader, PrefixLoader, ChoiceLoader,
 from pelican.contents import Article, Draft, Page, Static, is_valid_content
 from pelican.readers import Readers
 from pelican.utils import (copy, process_translations, mkdir_p, DateFormatter,
-                           FileStampDataCacher)
+                           FileStampDataCacher, python_2_unicode_compatible)
 from pelican import signals
 
 logger = logging.getLogger(__name__)
 
 
+@python_2_unicode_compatible
 class Generator(object):
     """Baseclass generator"""
@@ -67,7 +68,7 @@ class Generator(object):
             extensions=self.settings['JINJA_EXTENSIONS'],
         )
-        logger.debug('template list: {0}'.format(self.env.list_templates()))
+        logger.debug('Template list: %s', self.env.list_templates())
 
         # provide utils.strftime as a jinja filter
         self.env.filters.update({'strftime': DateFormatter()})
@@ -153,6 +154,10 @@ class Generator(object):
                 value = list(value.items())  # py3k safeguard for iterators
             self.context[item] = value
 
+    def __str__(self):
+        # return the name of the class for logging purposes
+        return self.__class__.__name__
+
 
 class CachingGenerator(Generator, FileStampDataCacher):
     '''Subclass of Generator and FileStampDataCacher classes
@@ -470,7 +475,8 @@ class ArticlesGenerator(CachingGenerator):
                     context_signal=signals.article_generator_context,
                     context_sender=self)
             except Exception as e:
-                logger.warning('Could not process {}\n{}'.format(f, e))
+                logger.error('Could not process %s\n%s', f, e,
+                             exc_info=self.settings.get('DEBUG', False))
                 continue
 
             if not is_valid_content(article, f):
@@ -492,9 +498,8 @@ class ArticlesGenerator(CachingGenerator):
                     context_sender=self)
                 all_drafts.append(draft)
             else:
-                logger.warning("Unknown status %s for file %s, skipping it." %
-                               (repr(article.status),
-                                repr(f)))
+                logger.error("Unknown status '%s' for file %s, skipping it.",
+                             article.status, f)
 
         self.articles, self.translations = process_translations(all_articles)
         self.drafts, self.drafts_translations = \
@@ -594,7 +599,8 @@ class PagesGenerator(CachingGenerator):
                     context_signal=signals.page_generator_context,
                     context_sender=self)
             except Exception as e:
-                logger.warning('Could not process {}\n{}'.format(f, e))
+                logger.error('Could not process %s\n%s', f, e,
+                             exc_info=self.settings.get('DEBUG', False))
                 continue
 
             if not is_valid_content(page, f):
@@ -609,9 +615,8 @@ class PagesGenerator(CachingGenerator):
             elif page.status == "hidden":
                 hidden_pages.append(page)
             else:
-                logger.warning("Unknown status %s for file %s, skipping it." %
-                               (repr(page.status),
-                                repr(f)))
+                logger.error("Unknown status '%s' for file %s, skipping it.",
+                             page.status, f)
 
         self.pages, self.translations = process_translations(all_pages)
         self.hidden_pages, self.hidden_translations = (
@@ -679,7 +684,7 @@ class StaticGenerator(Generator):
             save_as = os.path.join(self.output_path, sc.save_as)
             mkdir_p(os.path.dirname(save_as))
             shutil.copy2(source_path, save_as)
-            logger.info('copying {} to {}'.format(sc.source_path, sc.save_as))
+            logger.info('Copying %s to %s', sc.source_path, sc.save_as)
 
 
 class SourceFileGenerator(Generator):
@@ -694,7 +699,7 @@ class SourceFileGenerator(Generator):
         copy(obj.source_path, dest)
 
     def generate_output(self, writer=None):
-        logger.info(' Generating source files...')
+        logger.info('Generating source files...')
         for obj in chain(self.context['articles'], self.context['pages']):
             self._create_source(obj)
             for obj_trans in obj.translations:


@@ -8,59 +8,76 @@ __all__ = [
 import os
 import sys
 import logging
-from collections import defaultdict
+import locale
+from collections import defaultdict, Mapping
 
-RESET_TERM = '\033[0;m'
+import six
 
-COLOR_CODES = {
-    'red': 31,
-    'yellow': 33,
-    'cyan': 36,
-    'white': 37,
-    'bgred': 41,
-    'bggrey': 100,
-}
 
-def ansi(color, text):
-    """Wrap text in an ansi escape sequence"""
-    code = COLOR_CODES[color]
-    return '\033[1;{0}m{1}{2}'.format(code, text, RESET_TERM)
+class BaseFormatter(logging.Formatter):
+    def __init__(self, fmt=None, datefmt=None):
+        FORMAT = '%(customlevelname)s %(message)s'
+        super(BaseFormatter, self).__init__(fmt=FORMAT, datefmt=datefmt)
 
-
-class ANSIFormatter(logging.Formatter):
-    """Convert a `logging.LogRecord' object into colored text, using ANSI
-    escape sequences.
-    """
     def format(self, record):
-        msg = record.getMessage()
-        if record.levelname == 'INFO':
-            return ansi('cyan', '-> ') + msg
-        elif record.levelname == 'WARNING':
-            return ansi('yellow', record.levelname) + ': ' + msg
-        elif record.levelname == 'ERROR':
-            return ansi('red', record.levelname) + ': ' + msg
-        elif record.levelname == 'CRITICAL':
-            return ansi('bgred', record.levelname) + ': ' + msg
-        elif record.levelname == 'DEBUG':
-            return ansi('bggrey', record.levelname) + ': ' + msg
-        else:
-            return ansi('white', record.levelname) + ': ' + msg
+        record.__dict__['customlevelname'] = self._get_levelname(record.levelname)
+        # format multiline messages 'nicely' to make it clear they are together
+        record.msg = record.msg.replace('\n', '\n | ')
+        return super(BaseFormatter, self).format(record)
+
+    def formatException(self, ei):
+        ''' prefix traceback info for better representation '''
+        # .formatException returns a bytestring in py2 and unicode in py3
+        # since .format will handle unicode conversion,
+        # str() calls are used to normalize formatting string
+        s = super(BaseFormatter, self).formatException(ei)
+        # fancy format traceback
+        s = str('\n').join(str(' | ') + line for line in s.splitlines())
+        # separate the traceback from the preceding lines
+        s = str(' |___\n{}').format(s)
+        return s
+
+    def _get_levelname(self, name):
+        ''' NOOP: overridden by subclasses '''
+        return name
+
+
+class ANSIFormatter(BaseFormatter):
+    ANSI_CODES = {
+        'red': '\033[1;31m',
+        'yellow': '\033[1;33m',
+        'cyan': '\033[1;36m',
+        'white': '\033[1;37m',
+        'bgred': '\033[1;41m',
+        'bggrey': '\033[1;100m',
+        'reset': '\033[0;m'}
+
+    LEVEL_COLORS = {
+        'INFO': 'cyan',
+        'WARNING': 'yellow',
+        'ERROR': 'red',
+        'CRITICAL': 'bgred',
+        'DEBUG': 'bggrey'}
+
+    def _get_levelname(self, name):
+        color = self.ANSI_CODES[self.LEVEL_COLORS.get(name, 'white')]
+        if name == 'INFO':
+            fmt = '{0}->{2}'
+        else:
+            fmt = '{0}{1}{2}:'
+        return fmt.format(color, name, self.ANSI_CODES['reset'])
 
 
-class TextFormatter(logging.Formatter):
+class TextFormatter(BaseFormatter):
     """
     Convert a `logging.LogRecord' object into text.
     """
-    def format(self, record):
-        if not record.levelname or record.levelname == 'INFO':
-            return record.getMessage()
-        else:
-            return record.levelname + ': ' + record.getMessage()
+    def _get_levelname(self, name):
+        if name == 'INFO':
+            return '->'
+        else:
+            return name + ':'
 
 
@@ -73,37 +90,74 @@ class LimitFilter(logging.Filter):
     E.g.: log.warning(('43 is not the answer', 'More erroneous answers'))
     """
-    ignore = set()
-    threshold = 5
-    group_count = defaultdict(int)
+    _ignore = set()
+    _threshold = 5
+    _group_count = defaultdict(int)
 
     def filter(self, record):
         # don't limit log messages for anything above "warning"
         if record.levelno > logging.WARN:
-            return record
+            return True
 
         # extract group
-        group = None
-        if len(record.msg) == 2:
-            record.msg, group = record.msg
+        group = record.__dict__.get('limit_msg', None)
+        group_args = record.__dict__.get('limit_args', ())
 
         # ignore record if it was already raised
         # use .getMessage() and not .msg for string formatting
         ignore_key = (record.levelno, record.getMessage())
-        to_ignore = ignore_key in LimitFilter.ignore
-        LimitFilter.ignore.add(ignore_key)
-        if to_ignore:
+        if ignore_key in self._ignore:
             return False
+        else:
+            self._ignore.add(ignore_key)
 
         # check if we went over threshold
         if group:
             key = (record.levelno, group)
-            LimitFilter.group_count[key] += 1
-            if LimitFilter.group_count[key] == LimitFilter.threshold:
+            self._group_count[key] += 1
+            if self._group_count[key] == self._threshold:
                 record.msg = group
-            if LimitFilter.group_count[key] > LimitFilter.threshold:
+                record.args = group_args
+            elif self._group_count[key] > self._threshold:
                 return False
-        return record
+        return True
 
 
-class LimitLogger(logging.Logger):
+class SafeLogger(logging.Logger):
+    """
+    Base Logger which properly encodes Exceptions in Py2
+    """
+    _exc_encoding = locale.getpreferredencoding()
+
+    def _log(self, level, msg, args, exc_info=None, extra=None):
+        # if the only argument is a Mapping, Logger uses that for formatting
+        # format values for that case
+        if args and len(args) == 1 and isinstance(args[0], Mapping):
+            args = ({k: self._decode_arg(v) for k, v in args[0].items()},)
+        # otherwise, format each arg
+        else:
+            args = tuple(self._decode_arg(arg) for arg in args)
+        super(SafeLogger, self)._log(level, msg, args,
+                                     exc_info=exc_info, extra=extra)
+
+    def _decode_arg(self, arg):
+        '''
+        properly decode an arg for Py2 if it's an Exception
+
+        localized systems have errors in native language if locale is set
+        so convert the message to unicode with the correct encoding
+        '''
+        if isinstance(arg, Exception):
+            text = str(arg)
+            if six.PY2:
+                text = text.decode(self._exc_encoding)
+            return text
+        else:
+            return arg
+
+
+class LimitLogger(SafeLogger):
     """
     A logger which adds LimitFilter automatically
     """


@@ -131,7 +131,7 @@ class Page(object):
             prop_value = getattr(rule, key)
 
             if not isinstance(prop_value, six.string_types):
-                logger.warning('%s is set to %s' % (key, prop_value))
+                logger.warning('%s is set to %s', key, prop_value)
                 return prop_value
 
         # URL or SAVE_AS is a string, format it with a controlled context


@@ -325,18 +325,22 @@ class HTMLReader(BaseReader):
             name = self._attr_value(attrs, 'name')
             if name is None:
                 attr_serialized = ', '.join(['{}="{}"'.format(k, v) for k, v in attrs])
-                logger.warning("Meta tag in file %s does not have a 'name' attribute, skipping. Attributes: %s", self._filename, attr_serialized)
+                logger.warning("Meta tag in file %s does not have a 'name' "
+                               "attribute, skipping. Attributes: %s",
+                               self._filename, attr_serialized)
                 return
             name = name.lower()
             contents = self._attr_value(attrs, 'content', '')
             if not contents:
                 contents = self._attr_value(attrs, 'contents', '')
                 if contents:
-                    logger.warning((
-                        "Meta tag attribute 'contents' used in file {}, should"
-                        " be changed to 'content'".format(self._filename),
-                        "Other files have meta tag attribute 'contents' that"
-                        " should be changed to 'content'"))
+                    logger.warning(
+                        "Meta tag attribute 'contents' used in file %s, should"
+                        " be changed to 'content'",
+                        self._filename,
+                        extra={'limit_msg': ("Other files have meta tag "
                                             "attribute 'contents' that should "
                                             "be changed to 'content'")})
 
             if name == 'keywords':
                 name = 'tags'
@@ -376,8 +380,8 @@ class Readers(FileStampDataCacher):
         for cls in [BaseReader] + BaseReader.__subclasses__():
             if not cls.enabled:
-                logger.debug('Missing dependencies for {}'
-                             .format(', '.join(cls.file_extensions)))
+                logger.debug('Missing dependencies for %s',
+                             ', '.join(cls.file_extensions))
                 continue
 
             for ext in cls.file_extensions:
@@ -414,8 +418,8 @@ class Readers(FileStampDataCacher):
         path = os.path.abspath(os.path.join(base_path, path))
         source_path = os.path.relpath(path, base_path)
-        logger.debug('read file {} -> {}'.format(
-            source_path, content_class.__name__))
+        logger.debug('Read file %s -> %s',
+                     source_path, content_class.__name__)
 
         if not fmt:
             _, ext = os.path.splitext(os.path.basename(path))
@@ -423,11 +427,11 @@ class Readers(FileStampDataCacher):
         if fmt not in self.readers:
             raise TypeError(
-                'Pelican does not know how to parse {}'.format(path))
+                'Pelican does not know how to parse %s', path)
 
         if preread_signal:
-            logger.debug('signal {}.send({})'.format(
-                preread_signal, preread_sender))
+            logger.debug('Signal %s.send(%s)',
+                         preread_signal.name, preread_sender)
             preread_signal.send(preread_sender)
 
         reader = self.readers[fmt]
@@ -463,8 +467,8 @@ class Readers(FileStampDataCacher):
             metadata['summary'] = typogrify(metadata['summary'])
 
         if context_signal:
-            logger.debug('signal {}.send({}, <metadata>)'.format(
-                context_signal, context_sender))
+            logger.debug('Signal %s.send(%s, <metadata>)',
+                         context_signal.name, context_sender)
             context_signal.send(context_sender, metadata=metadata)
 
         return content_class(content=content, metadata=metadata,
@@ -497,9 +501,10 @@ def find_empty_alt(content, path):
         )
         """, re.X)
     for match in re.findall(imgs, content):
-        logger.warning(('Empty alt attribute for image {} in {}'.format(
-            os.path.basename(match[1] + match[5]), path),
-            'Other images have empty alt attributes'))
+        logger.warning(
+            'Empty alt attribute for image %s in %s',
+            os.path.basename(match[1] + match[5]), path,
+            extra={'limit_msg': 'Other images have empty alt attributes'})
 
 
 def default_metadata(settings=None, process=None):


@@ -30,9 +30,9 @@ class ComplexHTTPRequestHandler(srvmod.SimpleHTTPRequestHandler):
                 logging.info("Found: %s" % self.path)
                 found = True
                 break
-            logging.info("Tried to find file %s, but it doesn't exist. " % self.path)
+            logging.info("Tried to find file %s, but it doesn't exist. ", self.path)
         if not found:
-            logging.warning("Unable to find file %s or variations." % self.path)
+            logging.warning("Unable to find file %s or variations.", self.path)
 
 Handler = ComplexHTTPRequestHandler
 
@@ -40,13 +40,13 @@ socketserver.TCPServer.allow_reuse_address = True
 try:
     httpd = socketserver.TCPServer(("", PORT), Handler)
 except OSError as e:
-    logging.error("Could not listen on port %s" % PORT)
+    logging.error("Could not listen on port %s", PORT)
     sys.exit(getattr(e, 'exitcode', 1))
 
-logging.info("serving at port %s" % PORT)
+logging.info("Serving at port %s", PORT)
 try:
     httpd.serve_forever()
 except KeyboardInterrupt as e:
-    logging.info("shutting down server")
+    logging.info("Shutting down server")
     httpd.socket.close()


@@ -152,7 +152,8 @@ def read_settings(path=None, override=None):
             local_settings['PLUGIN_PATHS'] = local_settings['PLUGIN_PATH']
             del local_settings['PLUGIN_PATH']
         if isinstance(local_settings['PLUGIN_PATHS'], six.string_types):
-            logger.warning("Defining %s setting as string has been deprecated (should be a list)" % 'PLUGIN_PATHS')
+            logger.warning("Defining PLUGIN_PATHS setting as string "
+                           "has been deprecated (should be a list)")
             local_settings['PLUGIN_PATHS'] = [local_settings['PLUGIN_PATHS']]
         elif local_settings['PLUGIN_PATHS'] is not None:
             local_settings['PLUGIN_PATHS'] = [os.path.abspath(os.path.normpath(os.path.join(os.path.dirname(path), pluginpath)))
@@ -201,7 +202,7 @@ def configure_settings(settings):
                 ' (see pelican --help for more information)')
 
     # specify the log messages to be ignored
-    LimitFilter.ignore.update(set(settings.get('LOG_FILTER',
+    LimitFilter._ignore.update(set(settings.get('LOG_FILTER',
                               DEFAULT_CONFIG['LOG_FILTER'])))
 
     # lookup the theme in "pelican/themes" if the given one doesn't exist
@@ -243,10 +244,9 @@ def configure_settings(settings):
             ]:
         if key in settings and not isinstance(settings[key], types):
             value = settings.pop(key)
-            logger.warn(
-                'Detected misconfigured {} ({}), '
-                'falling back to the default ({})'.format(
-                    key, value, DEFAULT_CONFIG[key]))
+            logger.warn('Detected misconfigured %s (%s), '
                        'falling back to the default (%s)',
                        key, value, DEFAULT_CONFIG[key])
 
     # try to set the different locales, fallback on the default.
     locales = settings.get('LOCALE', DEFAULT_CONFIG['LOCALE'])
@@ -319,8 +319,8 @@ def configure_settings(settings):
         old_key = key + '_DIR'
         new_key = key + '_PATHS'
         if old_key in settings:
-            logger.warning('Deprecated setting {}, moving it to {} list'.format(
-                old_key, new_key))
+            logger.warning('Deprecated setting %s, moving it to %s list',
+                           old_key, new_key)
             settings[new_key] = [settings[old_key]]  # also make a list
             del settings[old_key]
@@ -343,8 +343,8 @@ def configure_settings(settings):
     for PATH_KEY in filter(lambda k: k in settings, path_keys):
         if isinstance(settings[PATH_KEY], six.string_types):
             logger.warning("Detected misconfiguration with %s setting "
-                           "(must be a list), falling back to the default"
-                           % PATH_KEY)
+                           "(must be a list), falling back to the default",
+                           PATH_KEY)
             settings[PATH_KEY] = DEFAULT_CONFIG[PATH_KEY]
 
     # Add {PAGE,ARTICLE}_PATHS to {ARTICLE,PAGE}_EXCLUDES


@@ -182,5 +182,5 @@ class TestPelican(LoggedTestCase):
         logger.setLevel(orig_level)
         self.assertLogCountEqual(
             count=2,
-            msg="writing .*",
+            msg="Writing .*",
             level=logging.INFO)


@ -20,8 +20,10 @@ from six.moves.urllib.error import URLError
from six.moves.urllib.parse import urlparse from six.moves.urllib.parse import urlparse
from six.moves.urllib.request import urlretrieve from six.moves.urllib.request import urlretrieve
from pelican.utils import slugify, SafeDatetime # pelican.log has to be the first pelican module to be loaded
# because logging.setLoggerClass has to be called before logging.getLogger
from pelican.log import init from pelican.log import init
from pelican.utils import slugify, SafeDatetime
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@@ -131,7 +133,7 @@ def wp2fields(xml, wp_custpost=False):
             title = unescape(item.title.contents[0])
         except IndexError:
             title = 'No title [%s]' % item.find('post_name').string
-            logger.warning('Post "%s" is lacking a proper title' % title)
+            logger.warning('Post "%s" is lacking a proper title', title)

         filename = item.find('post_name').string
         post_id = item.find('post_id').string
@@ -594,24 +596,9 @@ def download_attachments(output_path, urls):
         try:
             urlretrieve(url, os.path.join(full_path, filename))
             locations.append(os.path.join(localpath, filename))
-        except URLError as e:
-            error = ("No file could be downloaded from {}; Error {}"
-                     .format(url, e))
-            logger.warning(error)
-        except IOError as e: #Python 2.7 throws an IOError rather Than URLError
-            # For japanese, the error might look kind of like this:
-            # e = IOError( 'socket error', socket.error(111, u'\u63a5\u7d9a\u3092\u62d2\u5426\u3055\u308c\u307e\u3057\u305f') )
-            # and not be suitable to use in "{}".format(e) , raising UnicodeDecodeError
-            # (This is at least the case on my Fedora running Python 2.7.5
-            # (default, Feb 19 2014, 13:47:28) [GCC 4.8.2 20131212 (Red Hat 4.8.2-7)] on linux2
-            try:
-                error = ("No file could be downloaded from {}; Error {}"
-                         .format(url, e))
-            except UnicodeDecodeError:
-                # For lack of a better log message because we could not decode e, let's use repr(e)
-                error = ("No file could be downloaded from {}; Error {}"
-                         .format(url, repr(e)))
-            logger.warning(error)
+        except (URLError, IOError) as e:
+            # Python 2.7 throws an IOError rather than URLError
+            logger.warning("No file could be downloaded from %s\n%s", url, e)
     return locations
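The whole Py2 Unicode workaround collapses into one call because the exception is now passed as a lazy argument; `SafeLogger` (ref #1403) can decode it when the record is emitted instead of at the call site. A rough sketch of the unified handler shape, with a stubbed failure instead of a real download:

```python
import io
import logging
from urllib.error import URLError  # six.moves.urllib.error in the code above

logger = logging.getLogger("pelican.tools.example")
stream = io.StringIO()
logger.addHandler(logging.StreamHandler(stream))
logger.setLevel(logging.WARNING)

url = "http://example.invalid/attachment.png"  # hypothetical URL
try:
    # Stand-in for urlretrieve(): assume the download fails with an error
    # whose message may contain non-ASCII bytes (the Py2 pitfall above).
    raise URLError("connection refused")
except (URLError, IOError) as e:
    # The exception is passed as a lazy argument; stringifying it is
    # deferred to the logging machinery, where a SafeLogger-style
    # subclass can decode it safely.
    logger.warning("No file could be downloaded from %s\n%s", url, e)

output = stream.getvalue()
```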
View file
@@ -70,7 +70,7 @@ class URLWrapper(object):
             setting = "%s_%s" % (self.__class__.__name__.upper(), key)
             value = self.settings[setting]
             if not isinstance(value, six.string_types):
-                logger.warning('%s is set to %s' % (setting, value))
+                logger.warning('%s is set to %s', setting, value)
                 return value
             else:
                 if get_page_name:
View file
@@ -293,46 +293,46 @@ def copy(source, destination):
         if not os.path.exists(dest_dir):
             os.makedirs(dest_dir)
         shutil.copy2(source_, destination_)
-        logger.info('copying %s to %s' % (source_, destination_))
+        logger.info('Copying %s to %s', source_, destination_)
     else:
-        logger.warning('skipped copy %s to %s' % (source_, destination_))
+        logger.warning('Skipped copy %s to %s', source_, destination_)


 def clean_output_dir(path, retention):
     """Remove all files from output directory except those in retention list"""
     if not os.path.exists(path):
-        logger.debug("Directory already removed: %s" % path)
+        logger.debug("Directory already removed: %s", path)
         return

     if not os.path.isdir(path):
         try:
             os.remove(path)
         except Exception as e:
-            logger.error("Unable to delete file %s; %s" % (path, str(e)))
+            logger.error("Unable to delete file %s; %s", path, e)
         return

     # remove existing content from output folder unless in retention list
     for filename in os.listdir(path):
         file = os.path.join(path, filename)
         if any(filename == retain for retain in retention):
-            logger.debug("Skipping deletion; %s is on retention list: %s" \
-                         % (filename, file))
+            logger.debug("Skipping deletion; %s is on retention list: %s",
+                         filename, file)
         elif os.path.isdir(file):
             try:
                 shutil.rmtree(file)
-                logger.debug("Deleted directory %s" % file)
+                logger.debug("Deleted directory %s", file)
             except Exception as e:
-                logger.error("Unable to delete directory %s; %s" % (
-                    file, str(e)))
+                logger.error("Unable to delete directory %s; %s",
+                             file, e)
         elif os.path.isfile(file) or os.path.islink(file):
             try:
                 os.remove(file)
-                logger.debug("Deleted file/link %s" % file)
+                logger.debug("Deleted file/link %s", file)
             except Exception as e:
-                logger.error("Unable to delete file %s; %s" % (file, str(e)))
+                logger.error("Unable to delete file %s; %s", file, e)
         else:
-            logger.error("Unable to delete %s, file type unknown" % file)
+            logger.error("Unable to delete %s, file type unknown", file)


 def get_relative_path(path):
@@ -455,10 +455,10 @@ def process_translations(content_list):
             lang_items = list(lang_items)
             len_ = len(lang_items)
             if len_ > 1:
-                logger.warning('There are %s variants of "%s" with lang %s' \
-                               % (len_, slug, lang))
+                logger.warning('There are %s variants of "%s" with lang %s',
+                               len_, slug, lang)
                 for x in lang_items:
-                    logger.warning('  %s' % x.source_path)
+                    logger.warning('\t%s', x.source_path)

             # find items with default language
             default_lang_items = list(filter(attrgetter('in_default_lang'),
@@ -469,11 +469,11 @@ def process_translations(content_list):
                 default_lang_items = items[:1]

             if not slug:
-                logger.warning((
-                    'empty slug for {!r}. '
-                    'You can fix this by adding a title or a slug to your '
-                    'content'
-                ).format(default_lang_items[0].source_path))
+                logger.warning(
+                    'empty slug for %s. '
+                    'You can fix this by adding a title or a slug to your '
+                    'content',
+                    default_lang_items[0].source_path)
             index.extend(default_lang_items)
             translations.extend([x for x in items if x not in default_lang_items])
             for a in items:
@@ -499,7 +499,7 @@ def folder_watcher(path, extensions, ignores=[]):
                 try:
                     yield os.stat(os.path.join(root, f)).st_mtime
                 except OSError as e:
-                    logger.warning('Caught Exception: {}'.format(e))
+                    logger.warning('Caught Exception: %s', e)

     LAST_MTIME = 0
     while True:
@@ -522,7 +522,7 @@ def file_watcher(path):
         try:
             mtime = os.stat(path).st_mtime
         except OSError as e:
-            logger.warning('Caught Exception: {}'.format(e))
+            logger.warning('Caught Exception: %s', e)
             continue

         if mtime > LAST_MTIME:
@@ -596,15 +596,15 @@ class FileDataCacher(object):
                 with self._cache_open(self._cache_path, 'rb') as fhandle:
                     self._cache = pickle.load(fhandle)
             except (IOError, OSError) as err:
-                logger.debug(('Cannot load cache {} (this is normal on first '
-                              'run). Proceeding with empty cache.\n{}').format(
-                                  self._cache_path, err))
+                logger.debug('Cannot load cache %s (this is normal on first '
+                             'run). Proceeding with empty cache.\n%s',
+                             self._cache_path, err)
                 self._cache = {}
             except Exception as err:
-                logger.warning(('Cannot unpickle cache {}, cache may be using '
-                                'an incompatible protocol (see pelican caching docs). '
-                                'Proceeding with empty cache.\n{}').format(
-                                    self._cache_path, err))
+                logger.warning(('Cannot unpickle cache %s, cache may be using '
+                                'an incompatible protocol (see pelican caching docs). '
+                                'Proceeding with empty cache.\n%s'),
+                               self._cache_path, err)
                 self._cache = {}
         else:
             self._cache = {}
@@ -629,8 +629,8 @@ class FileDataCacher(object):
             with self._cache_open(self._cache_path, 'wb') as fhandle:
                 pickle.dump(self._cache, fhandle)
         except (IOError, OSError, pickle.PicklingError) as err:
-            logger.warning('Could not save cache {}\n{}'.format(
-                self._cache_path, err))
+            logger.warning('Could not save cache %s\n ... %s',
+                           self._cache_path, err)


 class FileStampDataCacher(FileDataCacher):
@@ -656,8 +656,7 @@ class FileStampDataCacher(FileDataCacher):
                     return hash_func(fhandle.read()).digest()

                 self._filestamp_func = filestamp_func
             except AttributeError as err:
-                logger.warning('Could not get hashing function\n{}'.format(
-                    err))
+                logger.warning('Could not get hashing function\n\t%s', err)
                 self._filestamp_func = None

     def cache_data(self, filename, data):
@@ -677,8 +676,8 @@ class FileStampDataCacher(FileDataCacher):
         try:
             return self._filestamp_func(filename)
         except (IOError, OSError, TypeError) as err:
-            logger.warning('Cannot get modification stamp for {}\n{}'.format(
-                filename, err))
+            logger.warning('Cannot get modification stamp for %s\n\t%s',
+                           filename, err)
             return b''

     def get_cached_data(self, filename, default=None):
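The `\n`-embedded messages that replace the old pre-concatenated strings in these hunks rely on this commit's formatter change: multiline records are reflowed so continuation lines read as part of the same record. A toy sketch of that idea (this is not Pelican's actual formatter; the class name and indent prefix are invented):

```python
import io
import logging

class MultilineFormatter(logging.Formatter):
    """Toy version of the idea only: indent continuation lines so a
    multiline message visually belongs to its first line."""

    def format(self, record):
        first, *rest = super().format(record).split('\n')
        return '\n'.join([first] + ['    | ' + line for line in rest])

logger = logging.getLogger("pelican.utils.example")
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(MultilineFormatter('%(levelname)s: %(message)s'))
logger.addHandler(handler)
logger.setLevel(logging.WARNING)

logger.warning('Could not save cache %s\n ... %s', '/tmp/cache.pickle', 'disk full')
output = stream.getvalue()
```

The same reflow applies naturally to tracebacks, since `Formatter.format` appends exception text to the message before our subclass re-indents it.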
View file
@@ -68,11 +68,11 @@ class Writer(object):
                 raise RuntimeError('File %s is set to be overridden twice'
                                    % filename)
             else:
-                logger.info('skipping %s' % filename)
+                logger.info('Skipping %s', filename)
                 filename = os.devnull
         elif filename in self._written_files:
             if override:
-                logger.info('overwriting %s' % filename)
+                logger.info('Overwriting %s', filename)
             else:
                 raise RuntimeError('File %s is to be overwritten' % filename)

         if override:
@@ -120,7 +120,7 @@ class Writer(object):
             encoding = 'utf-8' if six.PY3 else None
             with self._open_w(complete_path, encoding) as fp:
                 feed.write(fp, 'utf-8')
-            logger.info('writing %s' % complete_path)
+            logger.info('Writing %s', complete_path)
             return feed
         finally:
             locale.setlocale(locale.LC_ALL, old_locale)
@@ -160,7 +160,7 @@ class Writer(object):
             with self._open_w(path, 'utf-8', override=override) as f:
                 f.write(output)
-            logger.info('writing {}'.format(path))
+            logger.info('Writing %s', path)

             # Send a signal to say we're writing a file with some specific
             # local context.
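Finally, the commit message notes that `pelican.log.LimitFilter` now takes its deduplication hints through the `extra` keyword instead of abusing the message itself, which is what made the argumented style workable everywhere above. A hedged sketch of that mechanism (the real implementation in pelican/log.py differs in detail):

```python
import io
import logging

class LimitFilter(logging.Filter):
    """Sketch of the 'extra'-based approach the commit describes -- not the
    exact Pelican implementation. Once the same message template has fired
    `limit` times, emit the record's `limit_msg` once, then drop repeats."""

    def __init__(self, limit=5):
        super().__init__()
        self.limit = limit
        self._seen = {}

    def filter(self, record):
        # Group repeats by the unformatted template, which stays identical
        # across calls in argumented logging style.
        count = self._seen.get(record.msg, 0) + 1
        self._seen[record.msg] = count
        if count < self.limit:
            return True
        if count == self.limit and getattr(record, 'limit_msg', None):
            # Swap in the summary message supplied via the `extra` kwarg.
            record.msg = record.limit_msg
            record.args = getattr(record, 'limit_args', ())
            return True
        return False

logger = logging.getLogger("pelican.log.example")
stream = io.StringIO()
logger.addHandler(logging.StreamHandler(stream))
logger.addFilter(LimitFilter(limit=3))
logger.setLevel(logging.WARNING)

for i in range(5):
    logger.warning('Empty alt attribute for image %s', i,
                   extra={'limit_msg': 'Other images have empty alt attributes'})

lines = stream.getvalue().strip().split('\n')
```

With the old message-based hack, the caller had to pre-format the message just to carry the limit marker; routing it through `extra` keeps the lazy `msg, *args` call signature intact.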