python - "execution_count" error when running a job on a remote IPython cluster


I am running an IPython cluster (over SSH) on a remote Linux machine, and I use IPython on Mac OS X to work with the cluster. In IPython on the Mac I write:

from numpy import arange
from IPython.parallel import Client

c = Client('~/ipcontroller-client.json', sshserver="me@remote_linux_machine")
dview = c[:]
dview.scatter('m', arange(100))

where '~/ipcontroller-client.json' is the file copied over from remote_linux_machine. Everything works up to this point.
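For intuition, `scatter` partitions the sequence into roughly equal contiguous chunks, one per engine. A minimal local sketch of that partitioning (plain Python, no cluster required):

```python
def scatter(seq, n_engines):
    """Partition seq into n_engines contiguous chunks, earlier
    engines receiving one extra element when the split is uneven."""
    q, r = divmod(len(seq), n_engines)
    chunks, start = [], 0
    for i in range(n_engines):
        size = q + (1 if i < r else 0)
        chunks.append(seq[start:start + size])
        start += size
    return chunks

parts = scatter(list(range(100)), 4)
print([len(p) for p in parts])  # -> [25, 25, 25, 25]
```

This only illustrates the chunking; the real `dview.scatter` additionally pushes each chunk to the corresponding engine's namespace under the given name.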

When I try to use the parallel magic %px I get an error:

/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/IPython/parallel/client/client.pyc in __init__(self, msg_id, content, metadata)
     80         self.msg_id = msg_id
     81         self._content = content
---> 82         self.execution_count = content['execution_count']
     83         self.metadata = metadata
     84
KeyError: 'execution_count'

With the same setup, when I run the cluster on localhost everything works perfectly.

Should the parallel magic work in the remote SSH cluster case?

The problem is fixed: one needs to make sure the IPython versions are the same (mine is 0.13.2) on the cluster and on the machine using it.

On the Linux machine I had to specify the version explicitly, because a plain apt-get install of ipython gave version 0.12.1:

sudo apt-get install ipython=0.13.2-1~ubuntu12.04.1 
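The fix above amounts to pinning matching IPython versions on both sides. A minimal sketch of a sanity check one could run before using %px; the client version is `IPython.__version__` locally, and the engine version would come from the cluster (e.g. via `dview.apply_sync`) — the hardcoded values below are just the versions from this question:

```python
def versions_match(client_version, engine_version):
    """Compare IPython version strings exactly; any mismatch
    (e.g. 0.13.2 on the client vs 0.12.1 on the engines) can
    break the message format that %px relies on."""
    return client_version == engine_version

# Client side:  import IPython; IPython.__version__
# Engine side (over the cluster): dview.apply_sync(lambda: IPython.__version__)
print(versions_match("0.13.2", "0.12.1"))  # -> False: %px may fail
print(versions_match("0.13.2", "0.13.2"))  # -> True
```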
