How can I wrap a synchronous function in an async coroutine?





I'm using aiohttp to build an API server that sends TCP requests off to a separate server. The module that sends the TCP requests is synchronous and a black box for my purposes, so my problem is that these requests block the entire API. I need a way to wrap the module's requests in an asynchronous coroutine that won't block the rest of the API.



So, just using sleep as a simple example, is there any way to somehow wrap time-consuming synchronous code in a non-blocking coroutine, something like this:



async def sleep_async(delay):
    # After calling sleep, the loop should be released until sleep is done
    yield sleep(delay)  # pseudocode: this is the behaviour I'm after
    return 'I slept asynchronously'














    You always block on I/O. With cooperative multitasking you can't get the desired behaviour, because a blocked coroutine returns control (yields) only after the request is finished.

    – Dmitry Shilyaev
    Apr 5 '17 at 21:03













    aiohttp is good for HTTP. For non-HTTP TCP, plain asyncio is enough.

    – Udi
    Apr 5 '17 at 22:13
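As the comment suggests, asyncio already speaks TCP natively through its streams API, so no wrapping is needed when you control the client code. A minimal self-contained round-trip sketch (the echo handler and ephemeral port are illustrative, not from the question):

```python
import asyncio

async def handle_echo(reader, writer):
    # Echo one message straight back to the client
    data = await reader.read(100)
    writer.write(data)
    await writer.drain()
    writer.close()

async def main():
    # Bind to an ephemeral port so the sketch never collides with a real service
    server = await asyncio.start_server(handle_echo, '127.0.0.1', 0)
    port = server.sockets[0].getsockname()[1]
    reader, writer = await asyncio.open_connection('127.0.0.1', port)
    writer.write(b'ping')
    await writer.drain()
    reply = await reader.read(100)
    writer.close()
    server.close()
    await server.wait_closed()
    return reply

loop = asyncio.new_event_loop()
reply = loop.run_until_complete(main())
loop.close()
```

Both sides of the connection run on one event loop without blocking it, which is exactly what the synchronous black-box module prevents.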


















python python-3.x asynchronous python-asyncio aiohttp






asked Apr 5 '17 at 20:40 by Zac Delventhal · edited Apr 5 '17 at 20:45








3 Answers






Eventually I found an answer in this thread. The method I was looking for is run_in_executor, which allows a synchronous function to be run asynchronously without blocking the event loop.



In the sleep example I posted above, it might look like this:



import asyncio
from time import sleep
from concurrent.futures import ProcessPoolExecutor

async def sleep_async(loop, delay):
    # The executor can be None if a default has been set for the loop
    await loop.run_in_executor(ProcessPoolExecutor(), sleep, delay)
    return 'I slept asynchronously'


Also see the following answer -> How do we call a normal function where a coroutine is expected?
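Driving that coroutine from a plain script might look like the sketch below. The `__main__` guard is an addition: on platforms where process pools use the spawn start method, module-level code would otherwise be re-executed in every worker.

```python
import asyncio
from time import sleep
from concurrent.futures import ProcessPoolExecutor

async def sleep_async(loop, delay):
    # The blocking sleep runs in a worker process; the event loop stays free
    await loop.run_in_executor(ProcessPoolExecutor(), sleep, delay)
    return 'I slept asynchronously'

if __name__ == '__main__':
    loop = asyncio.new_event_loop()
    print(loop.run_until_complete(sleep_async(loop, 0.1)))
    loop.close()
```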






answered Apr 6 '17 at 18:43 by Zac Delventhal · edited Dec 24 '17 at 11:27 by Jonathan





    ProcessPoolExecutor has a high cost because it launches an entire new Python interpreter. It is meant for CPU-intensive tasks that need multiple processors. Consider using ThreadPoolExecutor instead, which uses threads.

    – Oleg
    Jun 11 '17 at 1:38













    Thank you for the additional info. Although the original example used a process pool, ThreadPoolExecutor is what I ended up using after a little more research. It still seems a little janky, but so far it's all holding together.

    – Zac Delventhal
    Jun 14 '17 at 3:34











    Just a note: instead of creating a new executor, it might be simpler to use the default executor by calling loop.run_in_executor(None, func, *args) (see documentation).

    – Amit Kotlovski
    Jan 18 '18 at 8:51
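The default-executor variant from the comment can be sketched like this; passing None as the first argument of run_in_executor delegates to the loop's built-in thread pool, so no executor object has to be created or shut down by hand:

```python
import asyncio
from time import sleep

async def sleep_async(delay):
    loop = asyncio.get_event_loop()
    # None selects the event loop's default ThreadPoolExecutor
    await loop.run_in_executor(None, sleep, delay)
    return 'I slept asynchronously'

loop = asyncio.new_event_loop()
result = loop.run_until_complete(sleep_async(0.1))
loop.close()
```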

































You can use a decorator to wrap the sync version into an async version.



import asyncio
import time
from functools import wraps, partial


def wrap(func):
    @wraps(func)
    async def run(*args, loop=None, executor=None, **kwargs):
        if loop is None:
            loop = asyncio.get_event_loop()
        # partial bakes in the arguments, since run_in_executor takes no kwargs
        pfunc = partial(func, *args, **kwargs)
        return await loop.run_in_executor(executor, pfunc)
    return run


@wrap
def sleep_async(delay):
    time.sleep(delay)
    return 'I slept asynchronously'


Or use the aioify library:

% pip install aioify

then

@aioify
def sleep_async(delay):
    pass  # the blocking body goes here
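To see that the decorated function really frees the loop, two wrapped sleeps can be raced with asyncio.gather. This is a self-contained sketch reusing the `wrap` decorator defined in this answer; if the calls ran serially the total would be about one second, but in the default thread pool they overlap:

```python
import asyncio
import time
from functools import wraps, partial

def wrap(func):
    @wraps(func)
    async def run(*args, loop=None, executor=None, **kwargs):
        if loop is None:
            loop = asyncio.get_event_loop()
        pfunc = partial(func, *args, **kwargs)
        return await loop.run_in_executor(executor, pfunc)
    return run

@wrap
def sleep_async(delay):
    time.sleep(delay)
    return 'I slept asynchronously'

async def main():
    # Both blocking sleeps run in default-executor threads at the same time
    start = time.monotonic()
    results = await asyncio.gather(sleep_async(0.5), sleep_async(0.5))
    return results, time.monotonic() - start

loop = asyncio.new_event_loop()
results, elapsed = loop.run_until_complete(main())
loop.close()
```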





answered May 21 '18 at 13:59 by ospider · edited Jul 13 '18 at 20:11 by DMfll
    Good advice to use aioify; it makes it so easy now to write async functions and modules :)

    – WBAR
    Feb 26 at 23:46

































Not sure if it's too late, but you can also use a decorator to run your function in a thread. Note, though, that the thread still blocks preemptively (the OS schedules it away), unlike async code, which blocks cooperatively.



import asyncio
from concurrent.futures import ThreadPoolExecutor
from functools import wraps


def wrap(func):
    pool = ThreadPoolExecutor()

    @wraps(func)
    async def run(*args, **kwargs):
        future = pool.submit(func, *args, **kwargs)
        # Bridge the concurrent.futures.Future into an awaitable asyncio future
        return await asyncio.wrap_future(future)
    return run


Hope that helps!
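A self-contained sketch of this thread-based decorator, with the bridged future awaited before returning so callers get the value directly (the `add` function is a made-up example, not from the answer):

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor
from functools import wraps

pool = ThreadPoolExecutor()

def wrap(func):
    @wraps(func)
    async def run(*args, **kwargs):
        # Submit to the thread pool, then await the bridged asyncio future
        future = pool.submit(func, *args, **kwargs)
        return await asyncio.wrap_future(future)
    return run

@wrap
def add(a, b):
    return a + b

loop = asyncio.new_event_loop()
result = loop.run_until_complete(add(2, 3))  # → 5
loop.close()
```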






answered Nov 26 '18 at 22:18 by hpca01