This is a handy little decorator that lets you annotate function definitions with argument type requirements. These type requirements are automatically checked by the system at function invocation time. The decorator frees you from writing type-checking boilerplate code by hand.
<pre>
def require(arg_name, *allowed_types):
    def make_wrapper(f):
        if hasattr(f, "wrapped_args"):
            wrapped_args = f.wrapped_args
        else:
            code = f.__code__
            wrapped_args = list(code.co_varnames[:code.co_argcount])
        try:
            arg_index = wrapped_args.index(arg_name)
        except ValueError:
            raise NameError(arg_name)
        def wrapper(*args, **kwargs):
            if len(args) > arg_index:
                arg = args[arg_index]
                if not isinstance(arg, allowed_types):
                    type_list = " or ".join(str(allowed_type) for allowed_type in allowed_types)
                    raise TypeError("Expected '%s' to be %s; was %s."
                                    % (arg_name, type_list, type(arg)))
            elif arg_name in kwargs:
                arg = kwargs[arg_name]
                if not isinstance(arg, allowed_types):
                    type_list = " or ".join(str(allowed_type) for allowed_type in allowed_types)
                    raise TypeError("Expected '%s' to be %s; was %s."
                                    % (arg_name, type_list, type(arg)))
            return f(*args, **kwargs)
        wrapper.wrapped_args = wrapped_args
        return wrapper
    return make_wrapper

@require("x", int, float)
@require("y", float)
def foo(x, y):
    return x + y

print(foo(1, 2.5))      # Prints 3.5.
print(foo(2.0, 2.5))    # Prints 4.5.
print(foo("asdf", 2.5)) # Raises TypeError.
print(foo(1, 2))        # Raises TypeError.
</pre>
A small restriction on this decorator is that all uses of @require must sit at the bottom of the decorator stack, i.e., be applied to the function before any other decorators. In other words,
<pre>
@some_decorator
@require("x", int)
@require("y", int)
def f(x, y):
    pass
</pre>
is allowed, whereas
<pre>
@require("x", int)
@require("y", int)
@some_decorator
def f(x, y):
    pass
</pre>
is not allowed. The reason is that the require decorator uses the wrapped_args function attribute to propagate the argument list of the originally decorated function down the decorator stack, and other decorators do not participate in the propagation.
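To make the restriction concrete, here is a small self-contained sketch. The require implementation is repeated in condensed form, and some_decorator is a hypothetical pass-through decorator standing in for any decorator that knows nothing about wrapped_args:

```python
# Condensed version of the require decorator, so this sketch is self-contained.
def require(arg_name, *allowed_types):
    def make_wrapper(f):
        if hasattr(f, "wrapped_args"):
            wrapped_args = f.wrapped_args
        else:
            code = f.__code__
            wrapped_args = list(code.co_varnames[:code.co_argcount])
        try:
            arg_index = wrapped_args.index(arg_name)
        except ValueError:
            raise NameError(arg_name)
        def wrapper(*args, **kwargs):
            if len(args) > arg_index and not isinstance(args[arg_index], allowed_types):
                raise TypeError(arg_name)
            if arg_name in kwargs and not isinstance(kwargs[arg_name], allowed_types):
                raise TypeError(arg_name)
            return f(*args, **kwargs)
        wrapper.wrapped_args = wrapped_args
        return wrapper
    return make_wrapper

# A generic decorator that does not copy the wrapped_args attribute.
def some_decorator(f):
    def inner(*args, **kwargs):
        return f(*args, **kwargs)
    return inner

# Allowed: the @require decorators are innermost, so each one can read
# the argument names from the original function (or from wrapped_args).
@some_decorator
@require("x", int)
@require("y", int)
def f(x, y):
    return x + y

print(f(1, 2))  # Prints 3.

# Not allowed: some_decorator's inner(*args, **kwargs) exposes no named
# parameters and no wrapped_args attribute, so the outer @require cannot
# locate "x" and raises NameError at definition time.
try:
    @require("x", int)
    @some_decorator
    def g(x, y):
        return x + y
except NameError as e:
    print("NameError:", e)
```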
The checking provided by this decorator is fairly basic. I have plans to extend it with more sophisticated contract-checking functionality, but it is usable for type-checking even in its current state.
Type-checking considered harmful? Before anyone jumps on me: in posting this, I am not implicitly advocating that explicit type checks are usually a good idea in Python. I know the potentially detrimental effect such checks can have on polymorphism, and on duck typing in particular. In fact, I almost never use these kinds of up-front checks myself. The recipe is mostly intended as a demonstration of an implementation technique that can be used for more general contract checking. And if you need to write boilerplate type-checking code for whatever reason (I don't judge!), the require decorator is a useful shortcut.
There are previously existing recipes which do this.
More compact. Nice recipe. It might be nice to enhance it so that all the type specs could be specified in a single @require call; seems like it shouldn't be too hard.
Here is a simpler, though less flexible, candidate, using named parameters:
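The candidate itself is not preserved in the source; a plausible reconstruction, which checks only keyword arguments (hence less flexible, since callers must pass every checked argument by name):

```python
# Reconstruction of a simpler, keyword-only checker (my sketch, not the
# commenter's actual code).
def require_named(**types):
    def make_wrapper(f):
        def wrapper(**kwargs):
            for name, expected in types.items():
                if name in kwargs and not isinstance(kwargs[name], expected):
                    raise TypeError(
                        "Expected '%s' to be %s; was %s."
                        % (name, expected, type(kwargs[name])))
            return f(**kwargs)
        return wrapper
    return make_wrapper

@require_named(x=int, y=float)
def foo(x, y):
    return x + y

print(foo(x=1, y=2.5))  # Prints 3.5; foo(1, 2.5) would fail, since the
                        # wrapper accepts no positional arguments at all.
```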
Even though type checking isn't supposed to be Pythonic, I think this could be made more so. For example, using your decorator, a call like foo("2", 2.5) is not allowed.
What I think should happen here is that the string "2" is converted to an int. To be properly Pythonic, I don't think the instruction "require x to be an int" means require it to be an instance of int; it means require it to be an object that can be used as an int. I've modified your decorator slightly so that in the above case, it will call int() on the specified argument and pass the converted object to the function. Obviously a string like "hello" can't be converted to an int, and in that case a normal ValueError will be thrown.
One area in which this is less flexible, though, is that you can't check that "x" is an int OR a float: there is no way of knowing whether you want the object converted to an int or to a float. But since an int will happily convert to a float, you can just check for a float and still pass an int.
I've only tested this with things like ints and floats, but if you've got your own classes with __init__ functions, I see no reason why it shouldn't work for those as well, although obviously the constructor can only take one parameter.
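The modification described above might look roughly like this (a sketch, not the commenter's preserved code; for brevity it drops the wrapped_args propagation, so it supports only one @require per function):

```python
# Coercing variant: a single required type doubles as a converter, so
# require("x", int) turns the string "2" into the int 2 before the call.
def require(arg_name, required_type):
    def make_wrapper(f):
        code = f.__code__
        arg_names = list(code.co_varnames[:code.co_argcount])
        try:
            arg_index = arg_names.index(arg_name)
        except ValueError:
            raise NameError(arg_name)
        def wrapper(*args, **kwargs):
            if len(args) > arg_index and not isinstance(args[arg_index], required_type):
                args = list(args)
                # May raise ValueError, e.g. int("hello").
                args[arg_index] = required_type(args[arg_index])
            elif arg_name in kwargs and not isinstance(kwargs[arg_name], required_type):
                kwargs[arg_name] = required_type(kwargs[arg_name])
            return f(*args, **kwargs)
        return wrapper
    return make_wrapper

@require("x", int)
def foo(x, y):
    return x + y

print(foo("2", 3))  # "2" is coerced to 2; prints 5.
```

foo("hello", 3) raises the plain ValueError from int("hello"), exactly as the comment describes.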