multithreading - why doesn't a simple Python producer/consumer multi-threading program speed up when adding more workers?


The code below is nearly identical to the official Python Queue example at http://docs.python.org/2/library/queue.html

    from Queue import Queue
    from threading import Thread
    from time import time
    import sys

    num_worker_threads = int(sys.argv[1])
    source = xrange(10000)

    def do_work(item):
        # Simulated CPU-bound work
        for i in xrange(100000):
            pass

    def worker():
        while True:
            item = q.get()
            do_work(item)
            q.task_done()

    q = Queue()
    for item in source:
        q.put(item)

    start = time()

    for i in range(num_worker_threads):
        t = Thread(target=worker)
        t.daemon = True
        t.start()

    q.join()

    end = time()

    print(end - start)

These are the results on a 12-core Xeon processor:

    $ ./speed.py 1
    12.0873839855
    $ ./speed.py 2
    15.9101941586
    $ ./speed.py 4
    27.5713479519

I expected that increasing the number of workers would reduce the response time, but instead it keeps increasing. I ran the experiment again and again, and the result didn't change.

Am I missing something obvious? Or does Python's queue/threading simply not work well?

Python is rather poor at multi-threading for CPU-bound work like this. Due to the Global Interpreter Lock (GIL), only one thread executes Python bytecode at a time, so extra threads add lock contention and context-switching overhead without adding parallelism - which is why your timings get worse as you add workers. See http://wiki.python.org/moin/globalinterpreterlock
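The usual workaround for CPU-bound work is to use processes instead of threads, since each process gets its own interpreter and its own GIL. A minimal sketch using the standard multiprocessing module (Python 3 syntax; the worker function, item counts, and pool size here are arbitrary stand-ins, not taken from the question):

```python
# Processes sidestep the GIL: each worker runs in its own interpreter,
# so CPU-bound Python code can execute truly in parallel.
from multiprocessing import Pool

def do_work(item):
    # Stand-in for the CPU-bound busy loop in the question.
    total = 0
    for i in range(10000):
        total += i * item
    return total

def run_parallel(items, workers):
    # Distribute items across `workers` processes and collect results
    # in order, analogous to the thread pool in the question.
    with Pool(processes=workers) as pool:
        return pool.map(do_work, list(items))

if __name__ == "__main__":
    results = run_parallel(range(100), workers=4)
    print(len(results))
```

With this structure, timing runs with 1, 2, and 4 workers should actually scale on a multi-core machine, because the per-item work no longer serializes on a single interpreter lock.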

