Consider a highly dynamic system in which many objects are created and destroyed. If the objects are complex and carry many default values, the following recipe may improve performance: the more default values a class has, the more you stand to gain.
```python
class A(object):
    """A simple class with some internal default values"""

    def __init__(self, x):
        self.x = x
        self._a = 1000
        self._b = 10000
        self._data = [1, 2, 3, 4]
        self._d = {1: 1, 2: 2, 3: 3}


class A_Metaclass(type):
    """Metaclass setting default values"""

    def __init__(cls, name, bases, dct):
        super(A_Metaclass, cls).__init__(name, bases, dct)
        setattr(cls, "_a", 1000)
        setattr(cls, "_b", 10000)
        setattr(cls, "_data", [1, 2, 3, 4])
        setattr(cls, "_d", {1: 1, 2: 2, 3: 3})


class A_With_Metaclass(object):
    """The same class as A, but internal default values set by metaclass"""

    __metaclass__ = A_Metaclass

    def __init__(self, x):
        self.x = x


from timeit import timeit

n = 10000000
print "metaclass :", timeit("A_With_Metaclass(22)", setup="from __main__ import A_Metaclass, A_With_Metaclass", number=n)
print "normal    :", timeit("A(22)", setup="from __main__ import A", number=n)
```
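Note that the recipe above relies on the Python 2 `__metaclass__` attribute, which Python 3 ignores. A sketch of the same idea in Python 3 syntax follows; the class names and the reduced iteration count are mine, chosen so the comparison runs quickly:

```python
from timeit import timeit


class DefaultsMeta(type):
    """Python 3 metaclass: attach default values to the class object."""

    def __init__(cls, name, bases, dct):
        super(DefaultsMeta, cls).__init__(name, bases, dct)
        cls._a = 1000
        cls._b = 10000
        # Caveat: these mutable defaults live on the class, so they are
        # shared by every instance (unlike the per-instance copies
        # created in A.__init__ below).
        cls._data = [1, 2, 3, 4]
        cls._d = {1: 1, 2: 2, 3: 3}


class AWithMeta(metaclass=DefaultsMeta):
    """Same as A_With_Metaclass, Python 3 `metaclass=` syntax."""

    def __init__(self, x):
        self.x = x


class A(object):
    """Plain version: defaults assigned per instance."""

    def __init__(self, x):
        self.x = x
        self._a = 1000
        self._b = 10000
        self._data = [1, 2, 3, 4]
        self._d = {1: 1, 2: 2, 3: 3}


n = 100000
print("metaclass:", timeit("AWithMeta(22)", globals=globals(), number=n))
print("normal   :", timeit("A(22)", globals=globals(), number=n))
```

The speedup comes from `__init__` doing less work per instantiation; attribute reads still succeed because lookup falls through from the instance to the class.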
Tags: metaclass, performance
I'm sorry. This is neither useful nor reusable.
Adding default values on the class has a performance penalty at access time:
These extra timings:
give the following results:
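A minimal sketch of such an access-time comparison, with hypothetical class and variable names of my own, would be:

```python
from timeit import timeit


class InstanceDefault(object):
    """Default stored on each instance (like A.__init__)."""

    def __init__(self):
        self._a = 1000


class ClassDefault(object):
    """Default stored once on the class (like the metaclass recipe)."""

    _a = 1000


obj_i = InstanceDefault()
obj_c = ClassDefault()

n = 1000000
# Reading obj_c._a misses the instance __dict__ and falls back to the
# class, so it can be marginally slower than reading obj_i._a.
t_instance = timeit("obj_i._a", globals=globals(), number=n)
t_class = timeit("obj_c._a", globals=globals(), number=n)
print("instance attribute:", t_instance)
print("class attribute   :", t_class)
```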
Moreover, you do not need a metaclass for that; just add the defaults to the class namespace: