The story: I work with several HTTP service providers that are quite unstable. Yes, I had to keep this in mind during development. But we all assumed the issues were "temporary" and would be gone by the time we went to production. We dutifully added a logging.error call in every risky place and moved on to other things.
Our expectation about the temporary nature of the problem never came true. The services sometimes respond slowly, sometimes return HTTP errors, and so on. We receive tons of exceptions every day. We had to do something about it.
The solution: Here is a
safe_exec decorator that helps solve this problem. You can specify how many times to try executing the function, the delay between attempts, and which exceptions are expected while the decorated function runs. For example,
urllib2.urlopen may raise
urllib2.URLError or
urllib2.HTTPError.
import logging
import time
from functools import wraps
__all__ = ("safe_exec",)
def safe_exec(exceptions, shakes=3, timeout=1, title="", **options):
    """
    Decorator to safely execute a function or method,
    retrying up to `shakes` times with `timeout` seconds
    between attempts.
    If a `default` keyword argument is provided, the last
    exception will not be re-raised and `default` will be
    returned instead.
    """
    def wrap(func):
        if not isinstance(exceptions, tuple):
            raise TypeError(
                "First argument of safe_exec should be a tuple of exceptions"
            )

        @wraps(func)
        def wrapped(*args, **kwargs):
            orig_exception = None
            for shake in range(shakes):
                try:
                    return func(*args, **kwargs)
                except exceptions as exc:
                    # Keep a reference: the `except` target is cleared
                    # when the block exits (in Python 3).
                    orig_exception = exc
                    logging.warning("%s: Sorry, can't execute %s, shake #%d",
                        title,
                        func.__name__,
                        shake,
                        exc_info=True
                    )
                    # No point sleeping after the last failed attempt.
                    if shake < shakes - 1:
                        time.sleep(timeout)
            logging.error(
                "%s: Can't execute `%s` after %d shakes",
                title,
                func.__name__,
                shakes
            )
            # The decorator's extra keyword arguments are collected into
            # `options`, not `kwargs`, so they are not shadowed by the
            # keyword arguments of the decorated call.
            if "default" in options:
                return options["default"]
            raise orig_exception
        return wrapped
    return wrap
Sample usage:
import urllib2
@safe_exec((urllib2.URLError, urllib2.HTTPError), shakes=2)
def download(url):
return urllib2.urlopen(url).read()
download("http://slow-resource.com/")
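The docstring also mentions the `default` argument, which the sample above doesn't show. Here is a minimal sketch of how it behaves, using a fake flaky function in place of a real HTTP call (the decorator is restated in condensed form so the snippet runs on its own; `flaky`, `always_down`, and the failure counts are illustrative):

```python
import time
from functools import wraps

def safe_exec(exceptions, shakes=3, timeout=1, title="", **options):
    # Condensed restatement of the decorator above (logging omitted).
    def wrap(func):
        @wraps(func)
        def wrapped(*args, **kwargs):
            orig_exception = None
            for shake in range(shakes):
                try:
                    return func(*args, **kwargs)
                except exceptions as exc:
                    orig_exception = exc
                    time.sleep(timeout)
            if "default" in options:
                return options["default"]
            raise orig_exception
        return wrapped
    return wrap

calls = {"count": 0}

# Stand-in for an unstable HTTP call: fails twice, then succeeds.
@safe_exec((IOError,), shakes=3, timeout=0)
def flaky():
    calls["count"] += 1
    if calls["count"] < 3:
        raise IOError("service unavailable")
    return "payload"

print(flaky())  # succeeds on the third shake

# With `default`, exhausted shakes return the fallback instead of raising.
@safe_exec((IOError,), shakes=2, timeout=0, default=None)
def always_down():
    raise IOError("service unavailable")

print(always_down())  # returns the fallback, no exception escapes
```

Without `default`, `always_down()` would re-raise the last `IOError` after the second shake, exactly as in the `download` example.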