Copied from my README at Image processing helper class for Octopi · GitHub. This feature is not yet in the master branch, but in relevant branches of catalli/octopi-research
(needs to be consolidated and merged into master).
ProcessingHandler class
An arbitrary per-XY-coordinate post-acquisition image processing helper class for Octopi. `ProcessingHandler` is initialized as an attribute of the `MultiPointController` (and aliased as an attribute of the `MultiPointWorker` object as `MultiPointWorker.processingHandler`), and maintains two queues which have work started on them in separate threads whenever a multipoint acquisition is started. The work is automatically ended by enqueueing termination signals when the multipoint acquisition ends, and the threads run to completion. Both queues are Python `queue.Queue` objects, which have tasks added to them via their `put` methods. While the multipoint acquisition is running, the user has the option of enqueueing processing tasks (this documentation assumes `multipoint_custom_script_entry` is being used to add custom processing, and thus the relevant `MultiPointWorker` object is accessed as `multiPointWorker`).
Enqueueing tasks
The handler thread for each queue expects objects in the queue to be dicts of the form `{'function': (a callable), 'args': (list of positional arguments), 'kwargs': (dict of keyword arguments)}`, and executes each task represented by a dict `c` by running `c['function'](*c['args'], **c['kwargs'])`. The upload queue handler places no requirements on what its functions return; it is assumed that the user will have the function pass the data to be uploaded (supplied as an arg in the dictionary) to some internal or external data handler accessible from the `multiPointWorker`. However, the function in any task queued into `processingHandler.processing_queue` must return a task, i.e. a dictionary of the form described above. This returned task is assumed to contain the data to upload and a method for uploading it, and it will be automatically enqueued in the upload queue. Thus:
- End users should ordinarily only be directly queueing tasks in `multiPointWorker.processingHandler.processing_queue`, by, given a task dict `task`, running `multiPointWorker.processingHandler.processing_queue.put(task)`.
- The function (the callable at the key `'function'` in the task) in any task queued in the `processing_queue` should return a task dict of the form `{'function': (callable), 'args': (list of positional arguments), 'kwargs': (dict of keyword arguments)}` to be enqueued in the `upload_queue`.
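To make this contract concrete, here is a minimal sketch of the queue/thread pattern described above. The method names, the use of `None` as the termination signal, and other internals are assumptions for illustration only; the actual implementation in the relevant branches may differ, and only the queue names `processing_queue` and `upload_queue` are taken from this document.

```python
import queue
import threading

class ProcessingHandler:
    """Illustrative sketch: two task queues, each drained by its own handler thread."""

    def __init__(self):
        self.processing_queue = queue.Queue()  # tasks whose functions return upload tasks
        self.upload_queue = queue.Queue()      # tasks whose functions push data out

    def _run_processing(self):
        while True:
            task = self.processing_queue.get()
            if task is None:                   # assumed termination signal
                self.upload_queue.put(None)    # propagate shutdown to the upload thread
                break
            # a processing task must return another task dict,
            # which is automatically enqueued for the upload handler
            upload_task = task['function'](*task['args'], **task['kwargs'])
            self.upload_queue.put(upload_task)

    def _run_upload(self):
        while True:
            task = self.upload_queue.get()
            if task is None:
                break
            task['function'](*task['args'], **task['kwargs'])

    def start(self):
        # called when a multipoint acquisition starts
        self.processing_thread = threading.Thread(target=self._run_processing)
        self.upload_thread = threading.Thread(target=self._run_upload)
        self.processing_thread.start()
        self.upload_thread.start()

    def finish(self):
        # called when the acquisition ends: enqueue the termination signal
        # and let both threads run to completion
        self.processing_queue.put(None)
        self.processing_thread.join()
        self.upload_thread.join()
```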
Example use in a multipoint custom script entry function
Suppose we have a function `process_image` that takes an image `ndarray` `I` as its sole positional argument and returns a scalar indicating the probability of the presence of a malaria parasite in the image (this is an oversimplified model). Also suppose we have a function `upload` that takes a scalar as its sole positional argument and passes it on to some data handler or cloud service.

To use the processing handler in the custom script, the user will first have to write a function `process_image_wrapper` as follows (to make sure a task is returned):
```python
def process_image_wrapper(I):
    score = process_image(I)
    return {'function': upload, 'args': [score], 'kwargs': {}}
```
Then, in their custom script entry, after whatever step it is in which they acquire a FOV `I` at a given Z-level in their XY-coordinates, the user can simply add the code
```python
task_dict = {'function': process_image_wrapper, 'args': [I.copy()], 'kwargs': {}}
multiPointWorker.processingHandler.processing_queue.put(task_dict)
```
and this will result in the processing and uploading taking place in the background while the microscope moves on to the next acquisition. Note: if working with image ndarrays, remember to pass them using `ndarray.copy()` to prevent them from being overwritten before processing.
Note: these handler threads are joined by the `MultiPointWorker` once it finishes acquisitions, so it should be safe to enqueue tasks that call the worker object's functions (such as emitting images to be displayed post-processing) in the version of this functionality that will be merged into production.
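As a hedged illustration of that point, a task along the following lines could hand a processed image back to the worker for display once this lands in production. Both the `annotate_image` helper and the `image_to_display` signal name are assumptions made for this sketch; substitute whatever display hook the worker actually exposes.

```python
def display_wrapper(I, worker):
    annotated = annotate_image(I)  # hypothetical helper returning a processed image ndarray
    # the returned upload task calls a worker method; this should be safe because
    # the MultiPointWorker joins the handler threads after acquisitions finish
    return {'function': worker.image_to_display.emit, 'args': [annotated], 'kwargs': {}}

multiPointWorker.processingHandler.processing_queue.put(
    {'function': display_wrapper, 'args': [I.copy(), multiPointWorker], 'kwargs': {}})
```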