Gaia can't load files if they are in a directory whose name has a space in it.
Reported by a visiting observer, and verified here:
Gaia can't load data if it's in a directory with a space in its name:
a) If you try and open a file from the command line that is in a directory with a space in it, you get an HDS_OPEN error:
$ gaia Test\ Directory/scuba2_generic_bothmaps.sdf
GAIA_DIR = /stardev/bin/gaia
!! Error accessing file '/home/sgraves/Test.sdf' - No such file or directory
! HDS_OPEN: Error opening an HDS container file.
b) The Select File window doesn't list the directory. If you manually type the directory name in, it will correctly list the file, but you will then get the same HDS_OPEN error as above.
This was verified on the EAO Hilo /stardev build, which claims a starversion of: master @ 5c8256a47fa4a9361dd5f4739857d7381490b61f (2016-01-23T05:39:31)
Not sure if this would be an awful lot of work to fix, but it did seem to be very annoying for the users (and initially confusing, until they'd worked out what was causing the problem).
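For reference, a minimal reproduction looks like this (sketch only; any .sdf file will do in place of the SCUBA-2 map, and quoting the path rather than backslash-escaping it makes no difference, since the shell passes the same string either way):
$ mkdir "Test Directory"
$ cp scuba2_generic_bothmaps.sdf "Test Directory/"
$ gaia "Test Directory/scuba2_generic_bothmaps.sdf"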
Spaces are fun. Do any kappa commands work with the space in? I'm trying to remember if HDS itself has problems with spaces in directory paths or file names. I think it might in some cases (it used to be really bad in the cases where NDG and HDS would use a shell for wild card matching [which HDS still does]).
It doesn't look like I can run Kappa's stats on a file name with a space in it either.
stats "'Test\ Directory/scuba2_generic_bothmaps.sdf'"
!! Cannot access Test\ Directory/scuba2_generic_bothmaps.sdf
! Please give a new value for parameter NDF
I think the backslash is not really meant to be part of the file name. I think step one is seeing if HDS can work by, for example, using hdsdump. Second is getting the space through the parameter system. I forget how to do that, but File parameters have some trickery with @ signs to allow things like a different file name to be passed through. That technique is needed for DST format files. HDSTRACE is a good test of that because NDG is not in the way.
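i.e. presumably something along these lines as the first test (sketch only, reusing the reporter's example path):
$ hdsdump Test\ Directory/scuba2_generic_bothmaps.sdf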
This works for me:
$ stats "'a test/comwest.sdf'"
Pixel statistics for the NDF structure /Users/timj/work/lsst/dm_dev_guide/a test/comwest
Title : Comet West, low resolution
NDF array analysed : DATA
Pixel sum : 1.11964e+07
Pixel mean : 170.844
Standard deviation : 63.4728
Skewness : -1.06336
Kurtosis : -0.298918
Minimum pixel value : 3.89062
At pixel : (59, 83)
Co-ordinate : (58.5, 82.5)
Maximum pixel value : 245.938
At pixel : (248, 45)
Co-ordinate : (247.5, 44.5)
Total number of pixels : 65536
Number of pixels used : 65536 (100.0%)
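Note the double layer of quoting there: the shell strips the outer double quotes, so the parameter system sees the single-quoted string and keeps the space as part of the NDF name. The same pattern should, in principle, work for any KAPPA command that takes an NDF parameter, e.g. (untested sketch):
$ ndftrace "'a test/comwest'"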
KAPPA:CREFRAME allows me to make an NDF called "star map". But hdstrace "'star map'" or hdstrace "'star map.sdf'" gives this result:
HDSTRACE.OBJECT <_CHAR*132>
OBJECT 'star map.sdf'
Trying the approach used for DST files (cf. HDS files section in SUN/86) with the ampersand prefix also didn't work.
hdsdump does work on files with spaces in the name though.
Does this imply HDSTRACE has an issue?
I tried hdstrace "'space the final frontier/starmap.sdf'"
HDSTRACE.OBJECT <_CHAR*132>
OBJECT 'space the final frontier/starmap.sdf'
Without the ampersand it's not taken as a file.
Right, I have HDSDUMP working with this syntax:
$ hdsdump space\ the\ final\ frontier/starmap.sdf
That would imply the parameter system is getting in the way for HDSTRACE.
HDSTRACE uses a different routine for finding HDS files than is used by Kappa.
Yes, HDSTRACE uses DAT_ASSOC.
In turn, NDF_ASSOC calls SUBPAR_GETNAME, while DAT_ASSOC calls SUBPAR_ASSOC (which interprets the string associated with the parameter "as a VMS filename").
@dsberry does NDG call NDF_ASSOC itself? I didn't think it did.
NDG uses SUBPAR_GETNAME to get the text of the NDF path, and then uses NDF_OPEN to open it.
Traced this in GAIA and the full name is passed to NDF_OPEN(), which fails.
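In the meantime, the least painful workaround for observers is probably just to avoid the space altogether, e.g. via a symlink (untested sketch; the point is that no space ever reaches the parameter system or NDF_OPEN):
$ ln -s Test\ Directory TestDirectory
$ gaia TestDirectory/scuba2_generic_bothmaps.sdf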