Set specific start and end times for channels 3A and 3B.
- [x] Closes #1652
- [x] Closes #1654
- [x] Tests added
- [ ] Fully documented
Codecov Report
Merging #1653 (45d0971) into main (9149430) will increase coverage by 0.02%. The diff coverage is 99.30%.
:exclamation: Current head 45d0971 differs from pull request most recent head 93ec12b. Consider uploading reports for the commit 93ec12b to get more accurate results.
```diff
@@            Coverage Diff             @@
##             main    #1653      +/-   ##
==========================================
+ Coverage   92.63%   92.66%   +0.02%
==========================================
  Files         258      258
  Lines       37949    37960      +11
==========================================
+ Hits        35154    35174      +20
+ Misses       2795     2786       -9
```
Flag | Coverage Δ |
---|---|
behaviourtests | 4.83% <0.69%> (+<0.01%) :arrow_up: |
unittests | 92.79% <99.30%> (+0.02%) :arrow_up: |
Flags with carried forward coverage won't be shown.
Impacted Files | Coverage Δ |
---|---|
satpy/readers/aapp_l1b.py | 91.73% <94.44%> (+1.91%) :arrow_up: |
satpy/readers/file_handlers.py | 97.22% <100.00%> (+0.03%) :arrow_up: |
satpy/readers/yaml_reader.py | 96.51% <100.00%> (ø) |
satpy/tests/reader_tests/test_aapp_l1b.py | 100.00% <100.00%> (ø) |
satpy/tests/test_file_handlers.py | 100.00% <100.00%> (ø) |
satpy/readers/generic_image.py | 93.33% <0.00%> (-4.23%) :arrow_down: |
satpy/readers/mirs.py | 72.54% <0.00%> (-0.84%) :arrow_down: |
satpy/tests/reader_tests/test_generic_image.py | 98.33% <0.00%> (-0.58%) :arrow_down: |
satpy/dataset/metadata.py | 98.66% <0.00%> (-0.02%) :arrow_down: |
satpy/tests/test_dataset.py | 100.00% <0.00%> (ø) |
... and 6 more |
Continue to review the full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 9149430...93ec12b.
Although tests appear to pass, this still fails in practice:
```
Traceback (most recent call last):
  File "/home/gholl/checkouts/protocode/avhrr-a-or-b.py", line 13, in <module>
    sc.load(["3a", "3b"])
  File "/home/gholl/checkouts/satpy/satpy/scene.py", line 1154, in load
    self._read_datasets_from_storage(**kwargs)
  File "/home/gholl/checkouts/satpy/satpy/scene.py", line 1174, in _read_datasets_from_storage
    return self._read_dataset_nodes_from_storage(nodes, **kwargs)
  File "/home/gholl/checkouts/satpy/satpy/scene.py", line 1180, in _read_dataset_nodes_from_storage
    loaded_datasets = self._load_datasets_by_readers(reader_datasets, **kwargs)
  File "/home/gholl/checkouts/satpy/satpy/scene.py", line 1205, in _load_datasets_by_readers
    new_datasets = reader_instance.load(ds_ids, **kwargs)
  File "/home/gholl/checkouts/satpy/satpy/readers/yaml_reader.py", line 945, in load
    ds = self._load_dataset_with_area(dsid, coords, **kwargs)
  File "/home/gholl/checkouts/satpy/satpy/readers/yaml_reader.py", line 841, in _load_dataset_with_area
    ds = self._load_dataset_data(file_handlers, dsid, **kwargs)
  File "/home/gholl/checkouts/satpy/satpy/readers/yaml_reader.py", line 713, in _load_dataset_data
    proj = self._load_dataset(dsid, ds_info, file_handlers, **kwargs)
  File "/home/gholl/checkouts/satpy/satpy/readers/yaml_reader.py", line 689, in _load_dataset
    projectable = fh.get_dataset(dsid, ds_info)
  File "/home/gholl/checkouts/satpy/satpy/readers/aapp_l1b.py", line 136, in get_dataset
    dataset = self.calibrate(key)
  File "/home/gholl/checkouts/satpy/satpy/readers/aapp_l1b.py", line 286, in calibrate
    extra_attrs.update(self._get_ch3_times(dataset_id["name"]))
  File "/home/gholl/checkouts/satpy/satpy/readers/aapp_l1b.py", line 261, in _get_ch3_times
    "start_time": self._get_time_for_idx(idx[0]),
IndexError: index 0 is out of bounds for axis 0 with size 0
```
I have a case where `self.active_channels == {'1': True, '2': True, '3a': True, '3b': True, '4': True, '5': True}` and `self._is3a.sum() == 0`. This crashes my code (see above) but is not covered by the unit tests.
Is this a real case?
Yes. `hrpt_noaa19_20210423_1411_62891.l1b`
Oh, that's the transition period, right? When both bits are set, I think it means neither is valid. I thought we had fixed this.
Maybe it got fixed and I unfixed it in this PR…
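For reference, a guard along the following lines would avoid the `IndexError` when no scanline is flagged for the channel. This is only a sketch: the function and callback names mirror `_get_ch3_times` and `_get_time_for_idx` from the traceback above, but the signatures here are hypothetical.

```python
import numpy as np


def get_ch3_times(is3x, get_time_for_idx):
    """Return start/end times for channel 3a or 3b, or no attributes at all
    if the channel is never active in this file (e.g. during a 3a/3b
    transition period where the flags are unusable)."""
    idx = np.flatnonzero(is3x)
    if idx.size == 0:
        # No valid scanlines: skip the time attributes instead of
        # indexing into an empty array
        return {}
    return {"start_time": get_time_for_idx(idx[0]),
            "end_time": get_time_for_idx(idx[-1])}
```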
It is pointless to set `start_time` and `end_time` in the file handler, because the attributes are overwritten in `yaml_reader.FileYAMLReader._load_dataset_data`:
https://github.com/pytroll/satpy/blob/eb2036ed7edf9c82d554c68437ed20ea28012439/satpy/readers/yaml_reader.py#L711-L718
Why is the `FileYAMLReader` doing this?
If I remove lines 716 and 717, four unit tests fail. If I make them conditional upon those attributes being not yet present, all unit tests still pass.
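The conditional version I mean would be something like the following sketch; the helper name is hypothetical, and the idea is simply to fill in only what the file handler did not already provide instead of unconditionally overwriting it.

```python
def set_times_if_missing(attrs, handler_start, handler_end):
    """Fill in start/end time attributes only when they are not yet
    present, so values computed by the file handler survive."""
    attrs.setdefault("start_time", handler_start)
    attrs.setdefault("end_time", handler_end)
    return attrs
```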
It still fails on multiple files. Although the `start_time` and `end_time` attributes get calculated correctly here:
https://github.com/pytroll/satpy/blob/eb2036ed7edf9c82d554c68437ed20ea28012439/satpy/readers/file_handlers.py#L97-L98
they get overwritten with the wrong information later in the same function:
https://github.com/pytroll/satpy/blob/eb2036ed7edf9c82d554c68437ed20ea28012439/satpy/readers/file_handlers.py#L128
Meh. It appears to work now (as far as the data in the file are correct), but for the purposes for which I wrote https://github.com/pytroll/trollflow2/pull/112 it's still not quite good enough: the `start_time` and `end_time` are of course not updated after resampling…
This PR triggers a `struct.error` in pyninjotiff under some circumstances:
MCVE
```python
from satpy import Scene
from satpy.utils import debug_on; debug_on()
fn = ["/media/nas/x21308/scratch/AAPP-processed/hrpt_noaa19_20210508_1435_63129.l1b"]
fn_out = "{start_time:%Y%m%d%H%M}-{platform_name}_{sensor}_{area.area_id}_{name}.tiff"
sc = Scene(filenames=fn, reader=["avhrr_l1b_aapp"])
sc.load(["3a"])
ls = sc.resample("nqeuro1km")
ls.save_dataset("3a",
                writer="ninjotiff",
                physic_unit="%",
                ch_min_measurement_unit=0,
                ch_max_measurement_unit=125,
                filename=fn_out,
                sat_id=0,
                chan_id=0,
                data_cat="PORN",
                nbits=8,
                data_source="HRPT")
```
Expected result
Not sure yet, but the corresponding GeoTIFF is all black when I view it.
Actual result
It produces an empty file `195001010000-NOAA-19_avhrr-3_nqeuro1km_3a.tiff`, and an exception `struct.error` is raised:
```
[DEBUG: 2021-05-10 12:07:06 : satpy.readers.yaml_reader] Reading ('/home/gholl/checkouts/satpy/satpy/etc/readers/avhrr_l1b_aapp.yaml',)
[DEBUG: 2021-05-10 12:07:06 : satpy.readers.yaml_reader] Assigning to avhrr_l1b_aapp: ['/media/nas/x21308/scratch/AAPP-processed/hrpt_noaa19_20210508_1435_63129.l1b']
[DEBUG: 2021-05-10 12:07:06 : satpy.readers.aapp_l1b] Reading time 0:00:00.003673
[DEBUG: 2021-05-10 12:07:06 : satpy.composites.config_loader] Looking for composites config file avhrr-3.yaml
[DEBUG: 2021-05-10 12:07:06 : satpy.composites.config_loader] Looking for composites config file visir.yaml
[DEBUG: 2021-05-10 12:07:06 : satpy.readers.yaml_reader] No coordinates found for DataID(name='longitude', resolution=1050, modifiers=())
[DEBUG: 2021-05-10 12:07:06 : satpy.readers.yaml_reader] No coordinates found for DataID(name='latitude', resolution=1050, modifiers=())
[INFO: 2021-05-10 12:07:06 : satpy.readers.aapp_l1b] No valid operational coefficients, fall back to pre-launch
[DEBUG: 2021-05-10 12:07:06 : pyproj] PROJ_ERROR: proj_crs_get_sub_crs: Object is not a CompoundCRS
[DEBUG: 2021-05-10 12:07:06 : pyproj] PROJ_ERROR: proj_crs_get_sub_crs: Object is not a CompoundCRS
[DEBUG: 2021-05-10 12:07:06 : satpy.scene] Resampling DataID(name='3a', wavelength=WavelengthRange(min=1.58, central=1.61, max=1.64, unit='µm'), resolution=1050, calibration=<calibration.reflectance>, modifiers=())
[INFO: 2021-05-10 12:07:06 : satpy.scene] Not reducing data before resampling.
[INFO: 2021-05-10 12:07:06 : satpy.resample] Using default KDTree resampler
[DEBUG: 2021-05-10 12:07:07 : satpy.resample] Check if ./resample_lut-804d3daba9ceafc721d1968df8e06836cdd1080a.npz exists
[DEBUG: 2021-05-10 12:07:07 : satpy.resample] Computing kd-tree parameters
/home/gholl/checkouts/pyresample/pyresample/kd_tree.py:1025: DeprecationWarning: `np.float` is a deprecated alias for the builtin `float`. To silence this warning, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  input_coords = input_coords.astype(np.float)
[DEBUG: 2021-05-10 12:07:07 : pyproj] PROJ_ERROR: proj_crs_get_sub_crs: Object is not a CompoundCRS
[DEBUG: 2021-05-10 12:07:07 : pyproj] PROJ_ERROR: proj_crs_get_sub_crs: Object is not a CompoundCRS
/home/gholl/checkouts/pyresample/pyresample/kd_tree.py:1047: DeprecationWarning: `np.int` is a deprecated alias for the builtin `int`. To silence this warning, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  radius=self.radius_of_influence, dtype=np.int,
[DEBUG: 2021-05-10 12:07:07 : satpy.resample] Resampling where-bbdb931ad47efe6421d69d61a825cc34
[DEBUG: 2021-05-10 12:07:07 : satpy.writers] Reading ['/home/gholl/checkouts/satpy/satpy/etc/writers/ninjotiff.yaml']
/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/pyninjotiff/tifffile.py:154: UserWarning: failed to import the optional _tifffile C extension module.
Loading of some compressed images will be slow.
Tifffile.c can be obtained at http://www.lfd.uci.edu/~gohlke/
  warnings.warn(
[DEBUG: 2021-05-10 12:07:07 : satpy.writers] Enhancement configuration options: [{'name': 'linear_stretch', 'method': <function stretch at 0x7f99d2842670>, 'kwargs': {'stretch': 'crude', 'min_stretch': 0.0, 'max_stretch': 100.0}}, {'name': 'gamma', 'method': <function gamma at 0x7f99d24a7280>, 'kwargs': {'gamma': 1.5}}]
[DEBUG: 2021-05-10 12:07:07 : trollimage.xrimage] Applying stretch crude with parameters {'min_stretch': 0.0, 'max_stretch': 100.0}
[DEBUG: 2021-05-10 12:07:07 : trollimage.xrimage] Applying gamma 1.5
[INFO: 2021-05-10 12:07:07 : pyninjotiff.ninjotiff] Will generate single band product
/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/pyproj/crs/crs.py:543: UserWarning: You will likely lose important projection information when converting to a PROJ string from another format. See: https://proj.org/faq.html#what-is-the-best-format-for-describing-coordinate-reference-systems
  proj_string = self.to_proj4()
/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/pyproj/crs/crs.py:543: UserWarning: You will likely lose important projection information when converting to a PROJ string from another format. See: https://proj.org/faq.html#what-is-the-best-format-for-describing-coordinate-reference-systems
  proj_string = self.to_proj4()
/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/pyproj/crs/crs.py:543: UserWarning: You will likely lose important projection information when converting to a PROJ string from another format. See: https://proj.org/faq.html#what-is-the-best-format-for-describing-coordinate-reference-systems
  proj_string = self.to_proj4()
[INFO: 2021-05-10 12:07:07 : pyninjotiff.ninjotiff] Creating output file '/media/nas/x21308/plots_and_maps/2021/05/10/195001010000-NOAA-19_avhrr-3_nqeuro1km_3a.tiff'
[INFO: 2021-05-10 12:07:07 : pyninjotiff.ninjotiff] creating tags and data for a resolution 6000x12000
[DEBUG: 2021-05-10 12:07:07 : pyproj] PROJ_ERROR: proj_crs_get_sub_crs: Object is not a CompoundCRS
[DEBUG: 2021-05-10 12:07:07 : pyproj] PROJ_ERROR: proj_crs_get_sub_crs: Object is not a CompoundCRS
[DEBUG: 2021-05-10 12:07:07 : pyproj] PROJ_ERROR: proj_crs_get_sub_crs: Object is not a CompoundCRS
[DEBUG: 2021-05-10 12:07:07 : pyproj] PROJ_ERROR: proj_crs_get_sub_crs: Object is not a CompoundCRS
[DEBUG: 2021-05-10 12:07:07 : pyproj] PROJ_ERROR: proj_crs_get_sub_crs: Object is not a CompoundCRS
[DEBUG: 2021-05-10 12:07:07 : pyproj] PROJ_ERROR: proj_crs_get_sub_crs: Object is not a CompoundCRS
Traceback (most recent call last):
  File "/home/gholl/checkouts/protocode/mwe/avhrr-struct-error.py", line 13, in <module>
    ls.save_dataset("3a",
  File "/home/gholl/checkouts/satpy/satpy/scene.py", line 986, in save_dataset
    return writer.save_dataset(self[dataset_id],
  File "/home/gholl/checkouts/satpy/satpy/writers/ninjotiff.py", line 204, in save_dataset
    return super(NinjoTIFFWriter, self).save_dataset(
  File "/home/gholl/checkouts/satpy/satpy/writers/__init__.py", line 811, in save_dataset
    return self.save_image(img, filename=filename, compute=compute, fill_value=fill_value, **kwargs)
  File "/home/gholl/checkouts/satpy/satpy/writers/ninjotiff.py", line 176, in save_image
    return nt.save(img, filename, data_is_scaled_01=True, compute=compute, **kwargs)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/pyninjotiff/ninjotiff.py", line 560, in save
    return write(data, filename, area_def, ninjo_product_name, **kwargs)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/pyninjotiff/ninjotiff.py", line 667, in write
    return _write(image_data, output_fn, write_rgb=write_rgb, **options)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/pyninjotiff/ninjotiff.py", line 1062, in _write
    return tiffwrite(output_fn, image_data, args, tifargs, factors)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/pyninjotiff/ninjotiff.py", line 1071, in tiffwrite
    tif.save(image_data, **args)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/pyninjotiff/tifffile.py", line 671, in save
    addtag(*t)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/pyninjotiff/tifffile.py", line 569, in addtag
    ifdentry.append(pack(val_format, pack(dtype, value)))
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/pyninjotiff/tifffile.py", line 545, in pack
    return struct.pack(byteorder+fmt, *val)
struct.error: argument out of range
```
Additional context
This failure occurs with the `avhrr-3a-3b-times` branch, but succeeds with the `main` branch.
For the file I just described, we have `ls["3a"].attrs["start_time"] = datetime.datetime(1950, 1, 1, 0, 0)`. This leads to pyninjotiff trying to write a negative value to a tag whose dtype is an unsigned integer, which triggers the struct error. I consider this start time a bug in this PR, which I will fix.
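The mechanism is easy to reproduce in isolation: a 1950 start time predates the Unix epoch, so any seconds-since-epoch value derived from it is negative, and packing a negative number into an unsigned field fails. This is an illustration only, not pyninjotiff's exact code path:

```python
import struct
from datetime import datetime, timezone

# Seconds since the epoch for 1950-01-01 is negative
ts = int(datetime(1950, 1, 1, tzinfo=timezone.utc).timestamp())
print(ts)  # -631152000
try:
    struct.pack("<I", ts)  # unsigned 32-bit field
except struct.error as err:
    print(err)  # argument out of range
```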
It is in the data... what do we do here then?
```
ipdb> self._data["scnlinyr"][4017]
1950
ipdb> self._data["scnlindy"][4017]
1
ipdb> self._data["scnlintime"][4017]
0
```
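Decoding those three fields (assuming `scnlintime` is milliseconds of day, which is how I read the AAPP l1b format) indeed yields the bogus 1950-01-01 timestamp:

```python
from datetime import datetime, timedelta


def scanline_time(year, day_of_year, msec_of_day):
    """Build a datetime from AAPP scanline year / day-of-year /
    milliseconds-of-day fields."""
    return (datetime(int(year), 1, 1)
            + timedelta(days=int(day_of_year) - 1,
                        milliseconds=int(msec_of_day)))


print(scanline_time(1950, 1, 0))  # 1950-01-01 00:00:00
```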
Should this PR handle this and put something sensible in the `start_time` and `end_time` attributes?
what do you see in the header? specifically `startdatajd`, `startdatayr`, `startdatady`, `startdatatime` and the `enddata*` counterparts?
The header is normal:
```
In : print([(x, self._header[x]) for x in self._header.dtype.names if x.startswith("start") or x.startswith("end")])
[('startdatajd', memmap([26060], dtype=int32)), ('startdatayr', memmap([2021], dtype=int16)), ('startdatady', memmap([128], dtype=int16)), ('startdatatime', memmap([52532877], dtype=int32)), ('enddatajd', memmap([26060], dtype=int32)), ('enddatayr', memmap([2021], dtype=int16)), ('enddatady', memmap([128], dtype=int16)), ('enddatatime', memmap([53231043], dtype=int32))]
```
I could add a check and reject any start/end date that isn't strictly between the header start/end date.
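Concretely, something like the following hypothetical helper, using the header values printed above (day 128 of 2021 is May 8, and `startdatatime`/`enddatatime` in milliseconds of day give 14:35:32.877 and 14:47:11.043):

```python
from datetime import datetime


def valid_scanline_time(t, header_start, header_end):
    """Reject any scanline-derived time outside the header's range."""
    return header_start <= t <= header_end


header_start = datetime(2021, 5, 8, 14, 35, 32, 877000)
header_end = datetime(2021, 5, 8, 14, 47, 11, 43000)
print(valid_scanline_time(datetime(1950, 1, 1), header_start, header_end))    # False
print(valid_scanline_time(datetime(2021, 5, 8, 14, 40), header_start, header_end))  # True
```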