[Error] Running seg2explorer
Dear segger team,
Hi all! Hope you've been well, and thanks for sharing this tool with the community.
I am running the most recent segger_dev repo with a Xenium_v2 dataset and hit the following error when running seg2explorer.
I also saw the other posts (#119, #83), but it seems that running the segment function did not generate cells.zarr.zip, only segger_transcripts.parquet. Is there any chance I can use this parquet file to visualise the segmentation result in Xenium Explorer?
please refer to #121
@jpark27 could you check the main branch now? The bug should be fixed.
Hi @EliHei2! Thank you so much for updating the scripts. seg2explorer seems to run this time with those changes, but I came across the following error:
###########################################################
from segger.validation.xenium_explorer import seg2explorer
import dask.dataframe as dd
from shapely.geometry import Polygon

ddf = dd.read_parquet('/project/simmons_hts/jpark/0_tools/segger_dev/data_segger/RUNTRexBio_SLIDE1/benchmarks/segger_output_0.5_False_4_12_15_3_20250730/segger_transcripts.parquet').compute()
ddf = ddf.dropna()
ddf = ddf[ddf.segger_cell_id != "None"]
ddf = ddf.sort_values("segger_cell_id")

seg2explorer(
    seg_df=ddf,
    source_path=xenium_data_dir,  # path to the raw Xenium output bundle, defined earlier in my session
    output_dir="/project/simmons_hts/jpark/0_tools/segger_dev/data_segger/RUNTRexBio_SLIDE1/benchmarks/segger_output_0.5_False_4_12_15_3_20250730",
    cells_filename="seg_cells",
    analysis_filename="seg_analysis",
    xenium_filename="seg_experiment.xenium",
    analysis_df=None,
    cell_id_columns="segger_cell_id",
    area_low=10,
    area_high=100,
)
###########################################################
Could you help me figure out how to fix it?
(Update)
It seems that even though seg2explorer failed, generation of segger_cell_boundaries.parquet works with the corrected scripts. One burning question: since the morphology image and the boundary image have a different orientation and scale, is there a function that can easily overlay them?
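In case it helps explain what I mean, here is the rough overlay I have in mind (not a segger function; the boundary column names, the morphology filename, and the 0.2125 µm/pixel full-resolution scale are assumptions based on a standard Xenium output bundle): rescale the micron-space boundary vertices to pixels and draw them on top of the morphology image.

###########################################################
import matplotlib.pyplot as plt
import pandas as pd
import tifffile

# Assumed full-resolution Xenium pixel size in microns; confirm against the experiment metadata.
PIXEL_SIZE_UM = 0.2125

# The morphology image is indexed in pixels, while boundary vertices are stored in microns.
# xenium_data_dir as in the seg2explorer call above; adjust the filename to whichever
# morphology image your output bundle contains.
morphology = tifffile.imread(f"{xenium_data_dir}/morphology_mip.ome.tif")

# Assumed columns: cell_id, vertex_x, vertex_y (in microns).
boundaries = pd.read_parquet("segger_cell_boundaries.parquet")

fig, ax = plt.subplots(figsize=(10, 10))
ax.imshow(morphology, cmap="gray")  # imshow puts the origin at the top-left, matching the micron frame

# Rescale each cell outline from microns to pixels and draw it over the image.
for _, cell in boundaries.groupby("cell_id"):
    ax.plot(cell["vertex_x"] / PIXEL_SIZE_UM,
            cell["vertex_y"] / PIXEL_SIZE_UM,
            linewidth=0.3, color="red")

ax.set_axis_off()
fig.savefig("boundaries_on_morphology.png", dpi=300)
###########################################################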
@jpark27 the most recent commit should fix all bugs with Xenium Explorer; could you please check?