I really like the Python 2.5 generator extensions. I have cases where I want a generator to pass a reference to itself to another generator, or to store it in a queue, and so on. Here's how a generator can get its own handle.
# sample generator coroutine getting access to its own handle
import functools

def coroutine(function):
    """
    decorator to create a generator coroutine
    Note that except for the call to 'send()', this comes from PEP 342
    """
    @functools.wraps(function)
    def wrapper(*args, **kw):
        generator = function(*args, **kw)
        result = generator.next()           # one to get ready
        result = generator.send(generator)  # two to store our own handle
        return generator
    return wrapper

process_queue = list()  # queue for some imaginary process

def factory():
    @coroutine
    def test_function():
        """a simple function to test the coroutine decorator"""
        this_generator = (yield "foop")
        process_queue.append(this_generator)
        # now pretend to do some real work
        while True:
            result = (yield 42)
    return test_function()

print "creating generator instance"
factory()
print "obtaining instance from queue"
generator = process_queue.pop()
print "executing one step"
x = generator.send(None)
print "result =", x
The real use case is asynchronous socket (and file) I/O.
A 'pollster' coroutine maintains a dictionary mapping filenos to generator references. After calling poll(), for each fileno that is ready for I/O, the pollster retrieves the corresponding generator instance and calls send() with the event flags returned by poll().
Each I/O coroutine places its socket's fileno and a reference to itself in the pollster dictionary when it initiates an asynchronous I/O request. It then yields, waiting for the pollster to signal that I/O is ready.
I would be interested to see how you could use this to do async I/O. Has anyone implemented this, and do you have sample code?
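To make the idea concrete, here is a rough, untested sketch of the arrangement I have in mind, reusing the self-handle decorator from above. The names 'pollster', 'waiting', and 'reader' are made up for illustration; it assumes a Unix platform (for socket.socketpair() and select.poll()) and only handles a single read so that the event loop stays short.

import select
import socket
import functools

def coroutine(function):
    """same idea as the decorator above: prime the generator, then hand it its own handle"""
    @functools.wraps(function)
    def wrapper(*args, **kw):
        generator = function(*args, **kw)
        generator.next()            # one to get ready
        generator.send(generator)   # two to store our own handle
        return generator
    return wrapper

pollster = select.poll()
waiting = {}    # fileno -> generator reference

@coroutine
def reader(sock):
    """wait until 'sock' is readable, then read one chunk and finish"""
    this_generator = (yield)                  # receive our own handle from the decorator
    waiting[sock.fileno()] = this_generator   # register ourselves for I/O readiness
    pollster.register(sock, select.POLLIN)
    events = (yield)                          # suspend until the event loop wakes us
    print "reader woke up, events =", events
    print "read:", repr(sock.recv(1024))
    pollster.unregister(sock)
    del waiting[sock.fileno()]

# demonstration: one end of a socket pair gets written to by "someone else"
a, b = socket.socketpair()
reader(a)
b.send("hello")

# the event loop: wait for readiness, then resume whichever coroutine owns that fileno
while waiting:
    for fileno, events in pollster.poll():
        generator = waiting[fileno]
        try:
            generator.send(events)
        except StopIteration:
            pass        # the coroutine ran to completion

The point of giving the coroutine its own handle shows up in reader(): it registers itself against its fileno and then just yields, and the event loop resumes it with the poll flags via send().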