pygmtsar
Issue while creating the baseline in gedit and generating the interferogram in a Jupyter notebook
Hi, I'm currently using PyGMTSAR on my Ubuntu 20.04 system. Initially, I successfully ran PyGMTSAR in a Jupyter notebook. I processed 180 scenes using 3 subswaths and was able to create the baseline plot. However, when attempting to generate the interferogram, I encountered an error, which I have attached an image of below.
Subsequently, I attempted to run pygmtsar using gedit with the same setup of 180 scenes and 3 subswaths. Unfortunately, I faced an issue in creating the baseline plot, and I have attached the error message below.
I would greatly appreciate your assistance in resolving these issues. Your help would be invaluable to me. Thank you.
The first screenshot shows commands that do not exist in the actual PyGMTSAR code, so you are using an old version. Please upgrade your PyGMTSAR library and restart the script/notebook.
The second screenshot, from your Jupyter notebook page, does not include any error messages, only warnings about high memory usage. That is not a problem as long as it works, because InSAR processing really requires a lot of computational resources.
Thank you so much. I just wanted to ask: for the latest version of PyGMTSAR, what additional libraries would be required?
All the libraries are installed automatically, and you can see the full dependency list for the PyGMTSAR library here: https://pypistats.org/packages/pygmtsar
Sir, how do I solve this error?
You need to check the arrays' dimensions and unify them before the command call.
Can I reshape the dimensions?
Yes, it can be done as array1.reindex_like(array2). But it is better to find the source of the difference and fix it (probably you mixed up the variables).
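As a hedged illustration (the names detrend and corr below are placeholders, assuming both arrays are xarray DataArrays as PyGMTSAR typically returns), inspecting and aligning the mismatched grids could look like this:

```python
# Illustrative sketch: aligning two xarray DataArrays of different shapes.
# The names 'detrend' and 'corr' are placeholders, not actual PyGMTSAR outputs.
import numpy as np
import xarray as xr

detrend = xr.DataArray(np.zeros((10, 12)), dims=('y', 'x'),
                       coords={'y': np.arange(10), 'x': np.arange(12)})
corr = xr.DataArray(np.zeros((8, 12)), dims=('y', 'x'),
                    coords={'y': np.arange(8), 'x': np.arange(12)})

# Inspect the mismatch first
print(detrend.shape, corr.shape)  # (10, 12) (8, 12)

# Option 1: reindex one array onto the other's grid (missing cells become NaN)
corr_like = corr.reindex_like(detrend)
print(corr_like.shape)  # (10, 12)

# Option 2: keep only the coordinates common to both arrays
a, b = xr.align(detrend, corr, join='inner')
print(a.shape, b.shape)  # (8, 12) (8, 12)
```

Note that reindexing fills the extra rows with NaN, so finding why the shapes differ in the first place is usually the better fix.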
Sir, can I do it by making the detrend and corr shapes the same?
Check them first. What dimensions differ and why?
Thank you, Sir, for your help.
Sir, could you please explain how to make the velocity map using the cumulative displacement?
Velocity calculation and mapping are illustrated in the example notebooks provided.
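As a rough sketch of the underlying idea (this is not PyGMTSAR's own API, and the displacement cube below is synthetic), the velocity map is the per-pixel linear trend of the cumulative displacement over time:

```python
# Illustrative sketch: per-pixel linear velocity from a cumulative
# displacement cube with dims (date, y, x). The data here are synthetic.
import numpy as np
import pandas as pd
import xarray as xr

dates = pd.date_range('2020-01-01', periods=24, freq='MS')
# synthetic displacement growing ~1 unit per month at every pixel
disp = xr.DataArray(np.arange(24, dtype=float)[:, None, None]
                    * np.ones((24, 5, 5)),
                    dims=('date', 'y', 'x'), coords={'date': dates})

# least-squares linear fit along the time axis;
# for datetime64 coordinates the slope comes out per nanosecond
fit = disp.polyfit(dim='date', deg=1)
ns_per_year = 365.25 * 24 * 3600 * 1e9
velocity = fit.polyfit_coefficients.sel(degree=1) * ns_per_year
print(float(velocity.mean()))  # about 12 units per year
```

The example notebooks do the equivalent on real PyGMTSAR output, so prefer the workflow shown there.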
I have been facing problems while downloading orbit files
Use the recommended approach to download the orbit files as illustrated in all the examples.
hi,
My system has 128 GB of RAM. While generating interferograms for two subswaths, I faced a memory issue during stack processing as well. The data span 2017 to 2024, with a total of 500 interferograms to be generated. During stack processing of the interferograms, the processing stopped partway through. How can I resolve this? I have uploaded my interferogram generation step.
The code looks unusual and includes non-existent option ‘mmap.’ Also, how are you calculating the interferograms?
I am extremely sorry, I have posted the wrong code
This was my exact code while I was trying to generate the interferogram for subswaths 2 and 3, descending, but I am getting an error. The process stopped partway through generation.
Use the functions compute_interferogram_multilook and compute_interferogram_singlelook for processing multiple interferograms, as demonstrated in most examples at https://insar.dev. Technically, both functions follow the same logic, but they split the processing stack into chunks (the default queue size is 16) and clean up memory between processing each chunk. The code you applied is intended for small educational and research examples where you want to closely understand every operation performed in InSAR processing or carefully fine-tune your processing parameters.
Hi, I wanted to ask: if I want to apply a multilook factor of 2x8, is the code that I have used correct? Because the compute_interferogram_multilook line of code is showing an issue.
That’s ok; empty (no data) areas, such as the top-right rectangular region on the plot above, are expected. The processing for these areas should return no data pixel values as well. The message is intended to indicate this case, allowing you to verify if something has gone wrong.
I have been processing 7 years of data, consisting of 177 scenes, for two-subswath descending processing. During this, I generated a total of 500 interferograms. However, during the unwrapping step, my Jupyter kernel crashes or stops unexpectedly. As a result, I have to restart the kernel and rerun the entire workflow from the beginning, including the interferogram generation, which is very time-consuming.
This recurring issue makes it difficult to progress with my analysis. I would like to understand why this unwrapping step is causing the kernel to crash and if there is an alternative approach that would allow me to resume from where the process stopped, without having to rerun all the previous steps.
Sure, we can split the analysis into independent parts. In your case, when the interferogram processing is completed, you might dump the current state using:
sbas.dump()
Then, create a new notebook where, right after initializing the Dask scheduler, you restore the actual state:
sbas = Stack.restore(WORKDIR)
sbas.to_dataframe()
You will then have the same sbas object to continue the processing. The second notebook can be restarted multiple times without affecting your interferograms.
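Stripped of the PyGMTSAR specifics, the same checkpoint-and-resume pattern looks like this (plain pickle is used purely for illustration; sbas.dump() and Stack.restore(WORKDIR) handle the actual processing state for you):

```python
# Illustrative sketch of the checkpoint/resume pattern with plain pickle;
# PyGMTSAR's sbas.dump() / Stack.restore(WORKDIR) do the equivalent
# for the InSAR processing state.
import os
import pickle
import tempfile

workdir = tempfile.mkdtemp()
path = os.path.join(workdir, 'state.pickle')

# Notebook 1: persist the state after the expensive step completes
state = {'step': 'interferograms_done', 'pairs': 500}
with open(path, 'wb') as f:
    pickle.dump(state, f)

# Notebook 2: restore the state and continue; this part can be
# restarted any number of times without redoing the expensive step
with open(path, 'rb') as f:
    restored = pickle.load(f)
print(restored['step'])  # interferograms_done
```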
Regarding your SNAPHU issue, you have large interferograms of size 13067x12117 pixels, while SNAPHU is suitable for unwrapping grids of about 2000x2000 pixels. It is possible to apply tiled SNAPHU unwrapping; see the PyGMTSAR example notebook titled “CENTRAL Türkiye Mw 7.8 & 7.5 Earthquakes Co-Seismic Interferogram, 2023” for details. You can also use lower-resolution grids. For reference, a common SNAPHU unwrapping resolution is 60 meters, which works well for large areas. Usually, there is no benefit to applying SNAPHU unwrapping to single-look grids because these are often too noisy to be unwrapped correctly. PyGMTSAR allows unwrapping even large single-look grids using tiled unwrapping, but that often does not make sense. Use well-multilooked interferograms for SBAS analysis to compute phase screens, topography residuals, etc., and run PSI analysis on single-look interferograms by applying the detected phase corrections.
Hi,
I tried to unwrap using tiles with the help of the example you mentioned above. As my pixel size is large, I tried to make the tiles 500x500, but an error still occurs. Could you please help me solve the problem? I have attached my errors below.
For the interferogram generation step: sbas.compute_interferogram_multilook(baseline_pairs, 'intf_mlook', wavelength=400, weight=sbas.psfunction()). Where exactly in this line can I apply the multilook factor of 2x8?
You’re applying unwrapping twice. Also, a 500x500 tile size might be too small, which could cause tile merging artifacts and slow down processing.