Send data to multiple processes in Python 2

I want to send data to multiple processes. Each process will do something different with the data and then wait for the next item. I have something like this:

from multiprocessing import Process, Manager

def do_work1(in_queue):
    while True:
        item = in_queue.get()
        # exit signal
        if item is None:
            return
        print "worker 1 : {}".format(item)

def do_work2(in_queue):
    while True:
        item = in_queue.get()
        # exit signal
        if item is None:
            return
        print "worker 2 : {}".format(item)


if __name__ == "__main__":
    num_workers = 2

    manager = Manager()

    work = manager.Queue(num_workers)

    # start the workers
    pool = []
    p = Process(target=do_work1, args=(work,))
    p.start()
    pool.append(p)
    p2 = Process(target=do_work2, args=(work,))
    p2.start()
    pool.append(p2)

    work.put("1")
    work.put("2")

    # exit signal for each worker
    work.put(None)
    work.put(None)

    for p in pool:
        p.join()

But after running this code I get:

worker 1 : 1
worker 1 : 2

or:

worker 2 : 1
worker 1 : 2

I would like something like this:

worker 1 : 1
worker 2 : 1
worker 1 : 2
worker 2 : 2

What should I change to get the result shown above?

Answers


You could use multiprocessing Pipes to send the data to each process:

https://docs.python.org/2/library/multiprocessing.html#multiprocessing.Pipe

Assuming the data can be pickled:

https://docs.python.org/2/library/pickle.html#what-can-be-pickled-and-unpickled

You can then have each process wait for data on its end of a pipe, and loop over the list of all the pipes whenever you want to send something.
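A minimal sketch of that approach, with one Pipe per worker; the worker name and the None exit signal here are illustrative, not taken from the answer:

from multiprocessing import Process, Pipe

def worker(name, conn):
    # wait for data on this worker's end of the pipe
    while True:
        item = conn.recv()
        if item is None:  # exit signal
            return
        print "{} : {}".format(name, item)

if __name__ == "__main__":
    pipes = []
    workers = []
    for name in ["worker 1", "worker 2"]:
        parent_end, child_end = Pipe()
        p = Process(target=worker, args=(name, child_end))
        p.start()
        pipes.append(parent_end)
        workers.append(p)

    # send every item to every worker by looping over all the pipes
    for item in ["1", "2"]:
        for conn in pipes:
            conn.send(item)

    # tell each worker to stop, then wait for them
    for conn in pipes:
        conn.send(None)
    for p in workers:
        p.join()

This should print each item once per worker, which is the output the question asks for.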


Hmm, this gets rid of the manual queue handling, but numpy.ndarray doesn't seem to be picklable – John


It may not be listed in the docs because it isn't part of Python's default library, but it can be pickled. – njoosse


It works with ndarray. Thanks a lot – John
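For reference, a tiny round-trip sketch of that point, assuming numpy is installed (it is a third-party package, not part of the standard library):

import numpy as np
from multiprocessing import Pipe

if __name__ == "__main__":
    parent_end, child_end = Pipe()
    arr = np.arange(6).reshape(2, 3)
    parent_end.send(arr)                 # the array is pickled when sent
    received = child_end.recv()          # and unpickled on the other end
    print np.array_equal(arr, received)  # True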


You can also do this with multiprocessing.Pool:

import multiprocessing 


def do_work1(item): 
    print "worker 1: {}".format(item) 


def do_work2(item): 
    print "worker 2: {}".format(item) 


if __name__ == '__main__':
    pool = multiprocessing.Pool(2)

    # submit each item to both worker functions
    pool.apply_async(do_work1, (1,))
    pool.apply_async(do_work2, (1,))
    pool.apply_async(do_work1, (2,))
    pool.apply_async(do_work2, (2,))

    # no more tasks; wait for all of them to finish
    pool.close()
    pool.join()