Python Requests: Don't wait for request to finish
In bash, it is possible to execute a command in the background by appending & to it. How can I do the same thing in Python?
    while True:
        data = raw_input('enter something: ')
        requests.post(url, data=data)  # Don't wait for it to finish.
        print('sending post request...')  # This should appear immediately.
I use multiprocessing.dummy.Pool. I create a singleton thread pool at module level, and then use pool.apply_async(requests.get, [params]) to launch the task.
This call gives me a future, which I can add to a list with other futures indefinitely, until I would like to collect all or some of the results.
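If you only want the results that are already done, the AsyncResult objects returned by apply_async expose ready(), so you can collect finished futures without blocking on the rest. A minimal sketch of that idea (my own, not part of the original answer), using http://example.com/ as a placeholder URL:

    import time
    from multiprocessing.dummy import Pool

    import requests

    pool = Pool(10)

    # Launch ten requests; each apply_async call returns an AsyncResult immediately.
    futures = [pool.apply_async(requests.get, ['http://example.com/'])
               for _ in range(10)]

    time.sleep(2)  # Give the requests a moment; some (or all) may have finished by now.

    # Split the futures into those that have finished and those still in flight.
    finished = [f for f in futures if f.ready()]
    pending = [f for f in futures if not f.ready()]

    for f in finished:
        print(f.get())  # get() returns immediately because the future is ready.

The pending futures can be polled again later, or collected with get(), which blocks until they complete.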
multiprocessing.dummy.Pool is, against all logic and reason, a thread pool and not a process pool.
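To see that for yourself, here is a small sketch (mine, not from the answer) that reports the process ID and thread name from inside a pool task; every task reports the same PID as the main script while running on one of the pool's worker threads:

    import os
    import threading
    from multiprocessing.dummy import Pool


    def whoami(_):
        # Same PID as the main script, but executed on a worker thread.
        return os.getpid(), threading.current_thread().name


    pool = Pool(3)
    print('main process:', os.getpid())
    for pid, thread_name in pool.map(whoami, range(3)):
        print('task ran in pid', pid, 'on thread', thread_name)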
Example (works in both Python 2 and 3, as long as requests is installed):
    from multiprocessing.dummy import Pool

    import requests

    pool = Pool(10)  # Creates a pool with ten threads; more threads = more concurrency.
    # "pool" is a module attribute; you can be sure there will
    # only be one of them in your application,
    # as modules are cached after initialization.

    if __name__ == '__main__':
        futures = []
        for x in range(10):
            futures.append(pool.apply_async(requests.get, ['http://example.com/']))
        # futures is now a list of 10 futures.
        for future in futures:
            print(future.get())  # For each future, wait until the request is
                                 # finished and then print the response object.
The requests are executed concurrently, so running all ten of them should take no longer than the longest one. This strategy will only use one CPU core, but that shouldn't be an issue, because almost all of the time is spent waiting for I/O.
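Applied to the loop from the question, a minimal sketch (url is a placeholder, and input stands in for Python 2's raw_input): apply_async returns immediately, so the print statement runs without waiting for the POST to complete.

    from multiprocessing.dummy import Pool

    import requests

    pool = Pool(10)
    url = 'http://example.com/'  # Placeholder; substitute your real endpoint.

    while True:
        data = input('enter something: ')  # raw_input on Python 2.
        # Fire and forget: the POST runs on a worker thread.
        pool.apply_async(requests.post, [url], {'data': data})
        print('sending post request...')  # Appears immediately.

Note that discarding the returned AsyncResult also discards any exception the request raises; keep the future (or pass an error_callback on Python 3) if you need to know about failures.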