
[Help]: sbas.compute_align(n_jobs=1, joblib_aligning_backend='threading') failed

ajkumar96 opened this issue 10 months ago · 1 comment

Hi Alexey,

I am running the Imperial_Valley_2015.ipynb notebook with your Docker container, and it works perfectly.

I am trying to run it on different tiles in Ireland, and the code fails at the following line (under the 'Align a Stack of Images' section):

```python
if os.path.exists('/.dockerenv') and not 'google.colab' in sys.modules:
    # use special joblib backend in Docker containers
    sbas.compute_align(n_jobs=1, joblib_aligning_backend='threading')
```

The scene IDs are:

```python
SCENES = ['S1A_IW_SLC__1SDV_20230610T064759_20230610T064826_048920_05E207_2462',
          'S1A_IW_SLC__1SDV_20230622T064759_20230622T064826_049095_05E755_0B96',
          'S1A_IW_SLC__1SDV_20230704T064800_20230704T064827_049270_05ECB2_5CFF',
          'S1A_IW_SLC__1SDV_20230716T064801_20230716T064828_049445_05F21D_2BA8',
          'S1A_IW_SLC__1SDV_20230728T064802_20230728T064829_049620_05F77C_A982']
SUBSWATH = 1
POLARIZATION = 'VV'
```

I am using provider='SRTM' to download the DSM because the default source had gaps in it.
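A quick check along these lines (a rough sketch, assuming the DEM has already been loaded into the `sbas` stack as in the notebook) shows whether the downloaded DEM still contains gaps:

```python
# rough sketch: count NaN cells in the loaded DEM to confirm whether it has gaps;
# sbas.get_dem() is the same accessor PyGMTSAR itself uses in get_extent_ra()
dem = sbas.get_dem()
print('DEM shape:', dem.shape)
print('NaN cells:', int(dem.isnull().sum()))
```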

Here is the error message:

```
InvalidIndexError                          Traceback (most recent call last)
Cell In[24], line 3
      1 if os.path.exists('/.dockerenv') and not 'google.colab' in sys.modules:
      2     # use special joblib backend in Docker containers
----> 3     sbas.compute_align(n_jobs=1, joblib_aligning_backend='threading')
      4 else:
      5     sbas.compute_align()

File /opt/conda/lib/python3.11/site-packages/pygmtsar/Stack_align.py:884, in Stack_align.compute_align(self, geometry, dates, n_jobs, degrees, joblib_aligning_backend, debug)
    880     joblib.Parallel(n_jobs=n_jobs, backend=joblib_backend)(joblib.delayed(self._merge_subswaths)(date, offsets, minx, miny, maxx, maxy, debug=debug)
    881                                                            for date in dates)
    882 else:
    883     # DEM extent in radar coordinates, merged reference PRM required
--> 884     extent_ra = self.get_extent_ra()
    885     minx, miny, maxx, maxy = np.round(extent_ra.bounds).astype(int)
    886     #print ('minx, miny, maxx, maxy', minx, miny, maxx, maxy)
    887     # in case of a single subswath only convert SLC to NetCDF grid

File /opt/conda/lib/python3.11/site-packages/pygmtsar/Stack_dem.py:26, in Stack_dem.get_extent_ra(self)
     24 dem = self.get_dem()
     25 df = dem.isel(lon=[0,-1]).to_dataframe().reset_index()
---> 26 geom = self.geocode(LineString(np.column_stack([df.lon, df.lat])))
     27 return geom

File /opt/conda/lib/python3.11/site-packages/pygmtsar/Stack_geocode.py:92, in Stack_geocode.geocode(self, geometry, z_offset)
     90 coords = np.asarray(geom.coords[:])
     91 #print (len(coords))
---> 92 ele = dem.interp(lat=xr.DataArray(coords[:,1]),
     93                  lon=xr.DataArray(coords[:,0]), method='linear').compute()
     94 if z_offset is None:
     95     z = coords[:,2] if coords.shape[1]==3 else 0

File /opt/conda/lib/python3.11/site-packages/xarray/core/dataarray.py:2293, in DataArray.interp(self, coords, method, assume_sorted, kwargs, **coords_kwargs)
   2289 if self.dtype.kind not in "uifc":
   2290     raise TypeError(
   2291         f"interp only works for a numeric type array. Given {self.dtype}."
   2292     )
-> 2293 ds = self._to_temp_dataset().interp(
   2294     coords,
   2295     method=method,
   2296     kwargs=kwargs,
   2297     assume_sorted=assume_sorted,
   2298     **coords_kwargs,
   2299 )
   2300 return self._from_temp_dataset(ds)

File /opt/conda/lib/python3.11/site-packages/xarray/core/dataset.py:3970, in Dataset.interp(self, coords, method, assume_sorted, kwargs, method_non_numeric, **coords_kwargs)
   3968 if method in ["linear", "nearest"]:
   3969     for k, v in validated_indexers.items():
-> 3970         obj, newidx = missing._localize(obj, {k: v})
   3971         validated_indexers[k] = newidx[k]
   3973 # optimization: create dask coordinate arrays once per Dataset
   3974 # rather than once per Variable when dask.array.unify_chunks is called later
   3975 # GH4739

File /opt/conda/lib/python3.11/site-packages/xarray/core/missing.py:559, in _localize(var, indexes_coords)
    557 maxval = np.nanmax(new_x.values)
    558 index = x.to_index()
--> 559 imin = index.get_indexer([minval], method="nearest").item()
    560 imax = index.get_indexer([maxval], method="nearest").item()
    561 indexes[dim] = slice(max(imin - 2, 0), imax + 2)

File /opt/conda/lib/python3.11/site-packages/pandas/core/indexes/base.py:3885, in Index.get_indexer(self, target, method, limit, tolerance)
   3882 self._check_indexing_method(method, limit, tolerance)
   3884 if not self._index_as_unique:
-> 3885     raise InvalidIndexError(self._requires_unique_msg)
   3887 if len(target) == 0:
   3888     return np.array([], dtype=np.intp)

InvalidIndexError: Reindexing only valid with uniquely valued Index objects
```

ajkumar96 · Apr 24 '24

It appears that your DEM may not cover the required area or your orbit files are incomplete. Please check the DEM used and consider reloading the orbit files to resolve the issue.
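As a quick diagnostic (a rough sketch, not part of the library; it assumes the `sbas` stack from your notebook), you can also verify that the DEM covers your scenes and that its lat/lon coordinates are unique and monotonic, since the InvalidIndexError above is raised by pandas when xarray's interp() hits a non-unique coordinate index:

```python
import numpy as np

# rough diagnostic sketch: the traceback ends in pandas rejecting a non-unique
# index during dem.interp(), so check the DEM extent and its coordinate axes
dem = sbas.get_dem()  # same accessor used in Stack_dem.get_extent_ra()
print('DEM extent (lon/lat):',
      float(dem.lon.min()), float(dem.lon.max()),
      float(dem.lat.min()), float(dem.lat.max()))
for dim in ('lat', 'lon'):
    values = dem[dim].values
    duplicates = len(values) - len(np.unique(values))
    monotonic = bool(np.all(np.diff(values) > 0) or np.all(np.diff(values) < 0))
    print(f'{dim}: {len(values)} points, {duplicates} duplicates, monotonic={monotonic}')
```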

AlexeyPechnikov · Apr 24 '24