multithreading - Communication between threads in Python (without using global variables)


Let's say the main thread launches 2 threads for test modules: "test_a" and "test_b". Both test module threads maintain their state: whether they are done performing the test, whether they encountered an error or a warning, or whether they want to report other information.

How can the main thread access this information and act accordingly? For example, if "test_a" raised an error flag, how does "main" know about it and stop the rest of the tests before exiting with an error?

One way is to use global variables, but that gets ugly... fast.

The obvious solution is to share some kind of mutable variable, by passing it in to the thread objects/functions at construction/start.

The clean way to do this is to build a class with appropriate instance attributes. If you're using a threading.Thread subclass, instead of a plain thread function, you can usually use the subclass itself as the place to stick those attributes. I'll show it with a list, because it's shorter:

    import threading

    def test_a_func(thread_state):
        # ...
        thread_state[0] = my_error_state
        # ...

    def main_thread():
        test_states = [None]
        test_a = threading.Thread(target=test_a_func, args=(test_states,))
        test_a.start()
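For completeness, here is a minimal sketch of the subclass alternative mentioned above: the thread object itself carries the status attributes, so the main thread can just read them off the object. The class name, attributes, and the empty work body are all hypothetical placeholders.

```python
import threading

class TestThread(threading.Thread):
    """Hypothetical test thread that stores its own status as attributes."""
    def __init__(self):
        super().__init__()
        self.error = None   # set to the exception if the test fails
        self.done = False   # set once the test finishes either way

    def run(self):
        try:
            pass  # ... the actual test work would go here ...
        except Exception as e:
            self.error = e
        finally:
            self.done = True

t = TestThread()
t.start()
t.join()
# the main thread inspects the attributes after (or during) the run
```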

You can (and will want to) pack a Lock or Condition into the mutable state object, so you can synchronize between main_thread and test_a.

(Another option is to use a queue.Queue, an os.pipe, etc. to pass information around, but you still need to get that queue or pipe to the child thread, which is done in exactly the same way as above.)


However, it's worth considering whether you need any of this. If you think of test_a and test_b as "jobs", rather than "thread functions", you can just execute the jobs on a pool, and let the pool handle passing the results or errors back.

for example:

    import concurrent.futures

    try:
        with concurrent.futures.ThreadPoolExecutor(max_workers=2) as executor:
            tests = [executor.submit(job) for job in (test_a, test_b)]
            for test in concurrent.futures.as_completed(tests):
                result = test.result()
    except Exception as e:
        # do stuff

Now, if the test_a function raises an exception, the main thread will get that exception, and, because that means exiting the with block, all of the other jobs get cancelled and thrown away, and the worker threads shut down.

If you're using Python 2.5-3.1, you don't have concurrent.futures built in, but you can install the backport off PyPI, or you can rewrite things around multiprocessing.dummy.Pool. (It's slightly more complicated that way, because you have to create a sequence of jobs and call map_async to get back an iterator over AsyncResult-like objects... but it's still pretty simple.)

