
Discussion about hdf5 integration

picca opened this issue 12 years ago • 5 comments

Hello Jerome,

I would like to know your plans for the integration of hdf5/NeXuS files in your tools.

Maybe it would be nice to provide a sort of URI system to describe the data in these files.

pyFAI-saxs -p Si.poni hdf5://:/path/to/the/data[slice]

where the slice allows extracting part of the data.

Are you aware of an hdf5 URI system?
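
For illustration, a minimal sketch of how such an hdf5:// URI could be parsed and sliced, assuming h5py; the scheme and the helper name are hypothetical, taken from the proposal above rather than from any existing pyFAI API:

    import re
    import h5py

    def read_hdf5_uri(uri):
        # Split "hdf5://file.nxs:/group/data[0]" into file, dataset and index
        match = re.match(r"hdf5://(.+?):(/[^\[]+)(?:\[(\d+)\])?$", uri)
        if match is None:
            raise ValueError("Not a valid hdf5:// URI: %s" % uri)
        filename, dataset_path, index = match.groups()
        with h5py.File(filename, "r") as h5:
            dataset = h5[dataset_path]
            # Apply the optional index to extract one frame from a stack
            return dataset[int(index)] if index is not None else dataset[()]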

picca avatar Jul 12 '13 20:07 picca

On Fri, 12 Jul 2013 13:49:54 -0700 picca [email protected] wrote:

> Hello Jerome,
>
> I would like to know your plans for the integration of hdf5/NeXuS files in your tools.

Plans ... you have spotted some bugs induced by the introduction of code for HDF5.

This has a very low priority because no scientist is asking for it, so no project will be requested for it and hence no manpower allocated.

So this can only be done in my spare time.

> Maybe it would be nice to provide a sort of URI system to describe the data in these files.
>
> pyFAI-saxs -p Si.poni hdf5://:/path/to/the/data[slice]

Looks interesting.

> where the slice allows extracting part of the data.
>
> Are you aware of an hdf5 URI system?

In fullfield we are using path.h5:/group and then rely on NeXus tags to retrieve the dataset, but we have no slicing issue.
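
As a rough sketch of that file.h5:/group plus NeXus-tag lookup, assuming h5py and the old-style NeXus signal attribute (the helper name is made up, not pyFAI code):

    import h5py

    def find_nexus_signal(file_and_group):
        # Split "path.h5:/group" at the first colon
        filename, _, group_path = file_and_group.partition(":")
        with h5py.File(filename, "r") as h5:
            group = h5[group_path or "/"]
            # Old-style NeXus marks the plottable dataset with a "signal" attribute
            for item in group.values():
                if isinstance(item, h5py.Dataset) and "signal" in item.attrs:
                    return item[()]
        raise KeyError("no NeXus signal dataset under %s" % group_path)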

Cheers,

Jerome Kieffer [email protected]

kif avatar Jul 15 '13 09:07 kif

The development version of FabIO now includes the ability to read hdf5://:/path/to/the/data[slice]... integration into pyFAI remains to be done, but it is ongoing.
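
Assuming that syntax, usage would presumably look something like this (file and dataset names below are placeholders, not a tested example):

    import fabio

    # URL shape taken from this thread: file, internal path, frame index
    img = fabio.open("hdf5:///data/sample.nxs:/scan_1/scan_data/data_15[0]")
    print(img.data.shape)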

kif avatar Nov 20 '13 08:11 kif

Hello Jerome, so I just tested pyFAI with fabio 0.1.4 and the hdf5 files.

Here is the command I am using:

./bootstrap.py pyFAI-calib -l 0.39 -w 0.652 -D Xpad_flat -S Si hdf5:///nfs/ruche-diffabs/diffabs-users/20120966/2013/Run3/2013-07-11/silicium_1298.nxs:/scan_1311/scan_data/data_15[0]

With the current pyFAI, I got this error message:

    Traceback (most recent call last):
      File "./bootstrap.py", line 99, in <module>
        execfile(os.path.join(SCRIPTSPATH, script))
      File "./build/scripts-2.7/pyFAI-calib", line 58, in <module>
        c.parse()
      File "./build/lib.linux-i686-2.7/pyFAI/calibration.py", line 778, in parse
        (options, _) = self.analyse_options()
      File "./build/lib.linux-i686-2.7/pyFAI/calibration.py", line 364, in analyse_options
        raise RuntimeError("Please provide some calibration images ... "
    RuntimeError: Please provide some calibration images ... if you want to analyze them. Try also the --help option to see all options!

Looking at the code, it seems that the problem is in the utils module, in the expand_args function:

    def expand_args(args):
        """
        Takes an argv and expands it (under Windows, cmd does not convert
        *.tif into a list of files). Keeps only valid files (thanks to glob).

        @param args: list of files or wildcards
        @return: list of actual args
        """
        new = []
        for afile in args:
            print afile
            if os.path.exists(afile):
                new.append(afile)
            else:
                new += glob.glob(afile)
        return new

Indeed, afile is no longer a valid file path when it is a URI. I am wondering whether this validation should not be delegated to fabio, which could say: hey, this URI is a valid URI for me.

Maybe fabio should contain a way to build a list of valid URIs from the command line: instead of doing this work in pyFAI, fabio should provide something that returns a list of valid URIs.
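
One possible shape for that delegation, as a hedged sketch: a hypothetical helper (not an actual fabio API) that keeps an argument whenever fabio can open it, whether it is a plain file or an hdf5:// URI:

    import glob
    import fabio

    def expand_args_via_fabio(args):
        valid = []
        for arg in args:
            try:
                fabio.open(arg)  # let fabio decide whether the string is readable
                valid.append(arg)
            except Exception:
                valid += glob.glob(arg)  # fall back to wildcard expansion
        return valid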

picca avatar Apr 08 '14 14:04 picca

Another problem observed with the hdf5 URI:

    Traceback (most recent call last):
      File "./bootstrap.py", line 99, in <module>
        execfile(os.path.join(SCRIPTSPATH, script))
      File "./build/scripts-2.7/pyFAI-calib", line 63, in <module>
        c.gui_peakPicker()
      File "./build/lib.linux-i686-2.7/pyFAI/calibration.py", line 811, in gui_peakPicker
        self.data = self.peakPicker.finish(self.pointfile)
      File "./build/lib.linux-i686-2.7/pyFAI/peakPicker.py", line 299, in finish
        self.points.save(filename)
      File "./build/lib.linux-i686-2.7/pyFAI/peakPicker.py", line 504, in save
        with open(filename, "w") as f:
    IOError: [Errno 2] No such file or directory: 'hdf5:///nfs/ruche-diffabs/diffabs-users/20120966/2013/Run3/2013-07-11/silicium_1298.nxs:/scan_1311/scan_data/data_15[0].npt'

Indeed, the peak picker tries to open an invalid .npt file.
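
A hedged sketch of how the control-point file name could be derived so that it stays writable when the input is an hdf5:// URI (the helper is hypothetical, not pyFAI's actual fix):

    import os
    import re

    def npt_filename(source):
        # Strip a leading scheme such as "hdf5://"
        path = re.sub(r"^[a-z0-9]+://", "", source)
        # Keep only the part up to the container file extension,
        # dropping the in-file dataset path and slice
        match = re.match(r"(.+?\.(?:nxs|h5|hdf5))", path)
        if match:
            path = match.group(1)
        return os.path.splitext(path)[0] + ".npt"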

picca avatar Apr 08 '14 16:04 picca

It is now possible to use HDF5 files with most applications: average, calibration, integration.

We are using 2 types of URLs:

  • The one supported by fabio: foo.h5::blahblah
  • The ones supported by silx: silx://, fabio://

Sometimes it is one or the other, because it was easier to implement. Here is an overview: https://github.com/silx-kit/pyFAI/pull/1175
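
For reference, reading a frame with each URL flavour could look like the sketch below, assuming recent fabio and silx; paths and dataset names are placeholders:

    import fabio
    from silx.io import get_data

    # fabio-style URL: file path and internal dataset joined by "::"
    frame_fabio = fabio.open("foo.h5::/entry/data").data

    # silx-style URL: scheme plus a query describing file, dataset and slice
    frame_silx = get_data("silx:///tmp/foo.h5?path=/entry/data&slice=0")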

Let us know if it is enough to close this issue.

vallsv avatar May 07 '19 08:05 vallsv