piscat does not work with parallel processing
As soon as I enable the parallel loop in piscat functions, e.g.,
# Github Issues Example for Piscat
import numpy as np
from piscat.Visualization import *
from piscat.Preproccessing import *
from piscat.BackgroundCorrection import *
from piscat.InputOutput import *
import matplotlib.pyplot as plt
videoraw_path = r"C:\write\the\path\to\the\video.raw"
# load the raw video with piscat
video = video_reader(file_name=videoraw_path, type='binary', img_width=256, img_height=256,
                     image_type=np.dtype('<u2'), s_frame=0, e_frame=-1)
video_pn, power = Normalization(video).power_normalized(inter_flag_parallel_active=True)
I immediately obtain errors of the form:
"...\joblib\parallel.py", line 763, in _return_or_raise
    raise self._result
UnboundLocalError: local variable 'sum_img_pixels' referenced before assignment
Similarly, with differential averaging:
video_dr = DifferentialRollingAverage(video, batchSize=batch_size, mode_FPN=cFPN)
video_dra, _ = video_dr.differential_rolling(FPN_flag=True, select_correction_axis='Both', FFT_flag=True)
I have also obtained memory errors even though there is enough RAM (32 GB) for this process. Furthermore, if I try to perform fFPN with FFT_flag=False, it fails anyway, because that filter is itself computed in a parallel loop.
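One way to narrow this down is to repeat the same pipeline on a shorter frame range; a sketch reusing the reader parameters from above (the 500-frame cutoff is arbitrary, and I assume s_frame/e_frame simply select a contiguous block of frames):
# Load only the first 500 frames to check whether the parallel DRA step
# still exhausts memory on a much smaller working set.
video_small = video_reader(file_name=videoraw_path, type='binary', img_width=256, img_height=256,
                           image_type=np.dtype('<u2'), s_frame=0, e_frame=500)
video_dr_small = DifferentialRollingAverage(video_small, batchSize=batch_size, mode_FPN=cFPN)
video_dra_small, _ = video_dr_small.differential_rolling(FPN_flag=True, select_correction_axis='Both', FFT_flag=True)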
PSF detection functions also fail unless I manually deactivate the parallel loop, e.g.,
display_psf = DisplayDataFramePSFsLocalization(dra_video, df_PSFs, 0.1, False, save_path)
# display_psf.cpu.parallel_active = False   # the call only succeeds if this line is uncommented
display_psf.run()
It gives the error that the object "DisplayDataFramePSFsLocalization" cannot be pickled.
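For context, a minimal standalone illustration of this class of error (not piscat code; the Worker class and its lock attribute are hypothetical) showing how joblib's process-based backend fails when the object shipped to the workers carries an unpicklable attribute:
import threading
from joblib import Parallel, delayed

class Worker:
    def __init__(self):
        self._lock = threading.Lock()   # lock objects cannot be serialized

    def square(self, x):
        return x * x

w = Worker()
# delayed(w.square) ships the bound method, and with it the whole instance,
# to each worker process; pickling then fails on the lock attribute.
results = Parallel(n_jobs=2)(delayed(w.square)(i) for i in range(4))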
Thank you for reporting these issues. Let me address your concerns point by point:
Memory Usage in Parallel Processing: Running code in parallel generally requires more memory, typically because each worker process holds its own copy of the data it operates on. Even with 32 GB of RAM, parallel execution can consume a significant portion of it and lead to memory errors. Please refer to our documentation on memory requirements for the different tutorials here; a rough back-of-the-envelope estimate is also sketched after these points. This should give you a better idea of the memory usage during parallel execution.
Jupyter Tutorials: Have you run through the provided Jupyter tutorials to ensure everything works correctly? These tutorials help troubleshoot and verify that all functions work as expected. Running the tutorials with and without parallel processing activated can help isolate the issue.
Testing Parallel Functionality: Are all the problems you encounter related to parallel processing? Please try running the Jupyter tutorials with and without parallel activation, and let us know if the issues persist.
Dependencies: Could you confirm that all dependencies were installed with the recommended versions? Using incompatible versions might lead to unexpected errors during execution. Double-checking the installation steps might help resolve some of these issues.
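As a rough illustration of the first point, a back-of-the-envelope estimate of the parallel memory footprint, assuming 256x256 frames of 2-byte pixels as in the reader call above, a hypothetical 100,000-frame recording, and (worst case) each worker holding its own copy of the data:
import os

n_frames = 100_000            # hypothetical recording length
frame_bytes = 256 * 256 * 2   # one uint16 frame, as in the video_reader call
n_workers = os.cpu_count() or 1

video_gb = n_frames * frame_bytes / 1e9
print(f"raw video in RAM: ~{video_gb:.1f} GB")
print(f"worst case if {n_workers} workers each copy it: ~{video_gb * n_workers:.1f} GB")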
Please let us know the results after testing the above steps, and feel free to provide any additional details if the issues persist.
@marcoheisig: Would you please check these errors?
The pickle error is probably due to changes in the internals of newer Python versions, or to our audacious use of pickle in the first place. Either way, I first need to reproduce the problem before I can address it.
@mnotrin Can you send me the output of pip freeze and python --version so that I can try to reproduce your problem?
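For completeness, the same information can be collected from within Python (a small convenience sketch, equivalent to running python --version and pip freeze in a terminal):
import subprocess
import sys

# Print the interpreter version and the installed package list, equivalent
# to `python --version` and `pip freeze` on the command line.
print(sys.version)
print(subprocess.run([sys.executable, "-m", "pip", "freeze"],
                     capture_output=True, text=True).stdout)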