Objects that use an Event from the standard library's threading module can't be pickled, because the Event's underlying implementation holds an unpicklable lock. Conceptually, though, an Event is just a boolean, so providing reasonable serialization behavior is straightforward.
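To see the problem concretely, try pickling a bare Event (a minimal sketch; the exact error message varies by Python version):

```python
import pickle
import threading

evt = threading.Event()
try:
    pickle.dumps(evt)  # fails: the Event holds a raw thread lock
except TypeError as exc:
    print("cannot pickle Event:", exc)
```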
import threading
import copy

class SerializableEvent(object):
    "A threading.Event that can be serialized."
    def __init__(self):
        self.evt = threading.Event()
    def set(self):
        return self.evt.set()
    def clear(self):
        return self.evt.clear()
    def isSet(self):
        return self.evt.isSet()
    def wait(self, timeout=None):
        # timeout=None blocks indefinitely, matching Event.wait's default
        return self.evt.wait(timeout)
    def __getstate__(self):
        # Copy the instance dict, replacing the unpicklable Event
        # with its boolean state.
        d = copy.copy(self.__dict__)
        d['evt'] = self.evt.isSet()
        return d
    def __setstate__(self, d):
        # Recreate a fresh Event and restore its saved state.
        self.evt = threading.Event()
        if d['evt']:
            self.evt.set()
This recipe takes advantage of the fact that Event's state is nothing more than a boolean value. By overriding __getstate__ and __setstate__, we can serialize it as such, and reset its state appropriately on deserialization.
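A quick round trip shows the idea in action. The sketch below is self-contained, so it repeats a condensed version of the recipe's class, modernized to Python 3's is_set naming:

```python
import copy
import pickle
import threading

class SerializableEvent:
    """Condensed version of the recipe's class, using Python 3 names."""
    def __init__(self):
        self.evt = threading.Event()
    def set(self):
        return self.evt.set()
    def clear(self):
        return self.evt.clear()
    def is_set(self):
        return self.evt.is_set()
    def wait(self, timeout=None):
        return self.evt.wait(timeout)
    def __getstate__(self):
        d = copy.copy(self.__dict__)
        d['evt'] = self.evt.is_set()   # serialize the Event as a bool
        return d
    def __setstate__(self, d):
        self.evt = threading.Event()   # fresh Event on deserialization
        if d['evt']:
            self.evt.set()

e = SerializableEvent()
e.set()
clone = pickle.loads(pickle.dumps(e))
print(clone.is_set())  # → True: the set state survived the round trip
```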
A reasonable first attempt at this would subclass the Event object and override its __getstate__ and __setstate__. However, the implementation of threading gets in the way: Event, it turns out, is a factory function for a private class (_Event, in Python 2.4). We could inherit from this private class, but (besides being bad manners) that might break if the implementation of threading changes in the future. By implementing the wrapper as a proxy, we honor the public interface and buy some insurance against future modifications to threading.
One could use __getattr__ or __getattribute__ to implement a more robust proxy. However, a naive proxy like this one is more readable, and the risk of the interface changing is low. If the proxy needs to be truly transparent, extend this recipe with "double underscore" methods as appropriate, or re-implement with __getattr__ or __getattribute__.
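A __getattr__-based variant might look like the following sketch (SerializableEventProxy is a hypothetical name, not part of the recipe; Python 3 naming is used):

```python
import pickle
import threading

class SerializableEventProxy:
    """Sketch: delegate every Event attribute via __getattr__."""
    def __init__(self):
        self.evt = threading.Event()

    def __getattr__(self, name):
        # Called only for attributes missing on the proxy itself, so
        # set/clear/wait/is_set all fall through to the real Event.
        if name == 'evt':
            # Guard against infinite recursion if 'evt' is not yet set.
            raise AttributeError(name)
        return getattr(self.evt, name)

    def __getstate__(self):
        # Serialize only the Event's boolean state.
        return {'evt': self.evt.is_set()}

    def __setstate__(self, d):
        self.evt = threading.Event()
        if d['evt']:
            self.evt.set()

p = SerializableEventProxy()
p.set()                                 # delegated via __getattr__
clone = pickle.loads(pickle.dumps(p))
print(clone.is_set())  # → True
```

The trade-off is exactly the one noted above: delegation picks up any future Event methods for free, at the cost of making the proxy's actual interface harder to read at a glance.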