
[Python-ideas] Learning from the shell in supporting asyncio background calls

From: Nick Coghlan <ncog...@gmail.com>
Fri, 10 Jul 2015 20:49:31 +1000

Hi folks,

Based on the recent discussions Sven kicked off regarding the
complexity of interacting with asyncio from otherwise synchronous
code, I came up with an API design that I like, inspired by the way
background and foreground tasks work in the POSIX shell.

My blog post about this design is at
http://www.curiousefficiency.org/posts/2015/07/asyncio-background-calls.html,
but the essential components are the following two APIs:

    def run_in_background(target, *, loop=None):
        """Schedules target as a background task

        Returns the scheduled task.

        If target is a future or coroutine, equivalent to asyncio.ensure_future
        If target is a callable, it is scheduled in the default executor
        """
        ...

    def run_in_foreground(task, *, loop=None):
        """Runs event loop in current thread until the given task completes

        Returns the result of the task.
        For more complex conditions, combine with asyncio.wait()
        To include a timeout, combine with asyncio.wait_for()
        """
        ...
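
For concreteness, here's a minimal sketch of how the two helpers
could be layered on top of the existing event loop APIs. It's purely
illustrative and glosses over details (it isn't necessarily the
implementation from the post above):

    import asyncio

    def run_in_background(target, *, loop=None):
        if loop is None:
            loop = asyncio.get_event_loop()
        if asyncio.iscoroutine(target):
            # Coroutines are wrapped in a Task on the given loop, but
            # don't start running until the loop itself runs
            return loop.create_task(target)
        if isinstance(target, asyncio.Future):
            # Futures are assumed to already be bound to the loop
            return target
        if callable(target):
            # Plain callables go to the default executor and start
            # executing immediately in a worker thread
            return loop.run_in_executor(None, target)
        raise TypeError("expected a coroutine, future or callable")

    def run_in_foreground(task, *, loop=None):
        if loop is None:
            loop = asyncio.get_event_loop()
        # Run the event loop in this thread until the task completes
        return loop.run_until_complete(task)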

run_in_background is akin to invoking a shell command with a trailing
"&" - it puts the operation into the background, leaving the current
thread to move on to the next operation (or wait for input at the
REPL). Coroutines scheduled this way won't start running until you
run a task in the foreground, while callables delegated to the
default executor start running immediately.

To actually get the *results* of that task, you have to run it in the
foreground of the current thread using run_in_foreground - this is
akin to bringing a background process to the foreground of a shell
session using "fg".
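
To make the shell analogy concrete, here's a small illustrative
script using the sketch above, with asyncio.sleep() and time.sleep()
standing in for non-blocking and blocking IO (all the names here are
made up for the example):

    import asyncio
    import time

    async def background_tick():
        await asyncio.sleep(0.1)    # non-blocking stand-in for IO
        return "tick"

    def blocking_call():
        time.sleep(0.1)             # blocking stand-in for IO
        return "done"

    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)

    # Like "cmd &": the coroutine is scheduled, but doesn't run yet
    task = run_in_background(background_tick(), loop=loop)

    # Like "fg": run the event loop here until the task completes
    print(run_in_foreground(task, loop=loop))    # prints "tick"

    # A callable goes to the default executor and starts immediately
    future = run_in_background(blocking_call, loop=loop)
    print(run_in_foreground(future, loop=loop))  # prints "done"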

To relate this idea back to some of the examples Sven was discussing,
here's how translating previously serial synchronous code to use
those APIs might look in practice:

    # Serial synchronous data loading
    def load_and_process_data():
        data1 = load_remote_data_set1()
        data2 = load_remote_data_set2()
        return process_data(data1, data2)

    # Parallel asynchronous data loading
    def load_and_process_data():
        future1 = asyncio.run_in_background(load_remote_data_set1_async())
        future2 = asyncio.run_in_background(load_remote_data_set2_async())
        data1 = asyncio.run_in_foreground(future1)
        data2 = asyncio.run_in_foreground(future2)
        return process_data(data1, data2)

The application remains fundamentally synchronous, but the asyncio
event loop is exploited to obtain some local concurrency in waiting
for client IO operations.
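
Filling in the missing pieces with asyncio.sleep() based stand-ins
(see the P.S. below), and using the sketch helpers in place of the
proposed asyncio.* names, a self-contained version of the parallel
example might look like this:

    import asyncio
    import time

    # Illustrative stand-ins for the remote loaders above; real code
    # would perform actual non-blocking IO here
    async def load_remote_data_set1_async():
        await asyncio.sleep(0.5)
        return {"set1": "loaded"}

    async def load_remote_data_set2_async():
        await asyncio.sleep(0.5)
        return {"set2": "loaded"}

    def process_data(data1, data2):
        merged = dict(data1)
        merged.update(data2)
        return merged

    def load_and_process_data():
        future1 = run_in_background(load_remote_data_set1_async())
        future2 = run_in_background(load_remote_data_set2_async())
        data1 = run_in_foreground(future1)
        data2 = run_in_foreground(future2)
        return process_data(data1, data2)

    asyncio.set_event_loop(asyncio.new_event_loop())
    start = time.monotonic()
    print(load_and_process_data())
    print("elapsed:", round(time.monotonic() - start, 2))  # ~0.5, not ~1.0

Since both coroutines are already scheduled by the time the event
loop first runs, the two half-second waits overlap, and the elapsed
time is roughly that of the slower load rather than the sum of the
two.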

Regards,
Nick.

P.S. time.sleep() and asyncio.sleep() are rather handy as stand-ins
for blocking and non-blocking IO operations. I wish I'd remembered
that earlier :)

-- 
Nick Coghlan   |   ncog...@gmail.com   |   Brisbane, Australia
_______________________________________________
Python-ideas mailing list
Pyth...@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/
