Some thoughts about error (Exception) handling in python when using Processes.
Python process-based parallelism is something I've been learning as part of an application our group is developing. In this (as in many applications), we need to launch an underlying computation that is somewhat complex, without locking up the GUI. Hence the multiprocessing library, which sidesteps the global interpreter lock (GIL) by running the work in a separate process.
For this application I'm essentially writing the framework, while the computationally intensive tasks to be launched are written by others. That means I can't guarantee their modules will be bullet-proof. Furthermore, when others are writing their modules, debugging info surfaced in the GUI is always handy (some in my group are probably not familiar with good debugging tools like pdb).
Anyway, the problem is set up as something like:
```python
from multiprocessing import Process

def run():
    # actual code
    ...
    raise Exception('foo')

p = Process(target=run)
```
and then the naive thing to do is:
```python
try:
    p.start()
except:
    print('caught!')
```
which of course doesn't work: p.start() returns as soon as the child process is spawned, so it executes just fine without raising an Exception. The exception is raised later, in the child process, where the parent's try/except can't see it.
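A quick way to convince yourself of this (a minimal sketch of mine, not from the original post): the child's exception only appears as a traceback printed by the child itself, and the parent's sole clue is a non-zero exit code.

```python
from multiprocessing import Process

def run():
    # raised in the child process, invisible to the parent's try/except
    raise Exception('foo')

if __name__ == '__main__':
    p = Process(target=run)
    try:
        p.start()   # returns immediately once the child is spawned
    except Exception:
        print('caught!')  # never reached
    p.join()
    print(p.exitcode)  # non-zero: the only hint the child failed
```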
The Simple Solution
Of course the solution is to use a Pipe to wrap Exception handling:
```python
from multiprocessing import Process, Pipe

def run(conn):
    assert hasattr(conn, 'send')
    try:
        # actual code
        ...
    except Exception as e:
        conn.send(e)

parent_conn, child_conn = Pipe()
p = Process(target=run, args=(child_conn,))
p.start()

# Callback loop to monitor process status:
def callback():
    if parent_conn.poll():
        obj = parent_conn.recv()
        if isinstance(obj, Exception):
            # Custom error handling here
            ...
    if p.is_alive():
        app.after(50, callback)

# Initiate callback loop:
app.after(50, callback)
```
where I'm using tkinter, so app is a tk Toplevel, and the .after method drives a callback loop that watches for any Exception sent by the child process.
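One caveat worth noting (my addition, not part of the original recipe): a pickled exception loses its traceback on the trip through the Pipe, so the GUI can only show the message, not where the failure happened. A sketch of a workaround is to send the formatted traceback string along with the exception:

```python
import traceback
from multiprocessing import Process, Pipe

def run(conn):
    try:
        # stand-in for the actual computation
        raise Exception('foo')
    except Exception as e:
        # Send the exception plus its formatted traceback, since the
        # traceback object itself doesn't survive pickling.
        conn.send((e, traceback.format_exc()))

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=run, args=(child_conn,))
    p.start()
    p.join()
    if parent_conn.poll():
        exc, tb = parent_conn.recv()
        print(tb)  # full traceback text, ready for a GUI log window
```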
In hindsight this is really obvious, but it didn't occur to me at first. Maybe it'll be useful to others if it gets picked up by Google. Good luck!