
Higher Resolution (30m)

KirkMiller opened this issue 6 years ago • 22 comments

Would it be possible to update the "Host your own" documentation to provide instructions/code for downloading the SRTM 30m files for use with open-elevation? P.S. I've been using your server heavily, so I sent a donation. I'm in the process of installing open-elevation on my own server.

KirkMiller avatar May 01 '18 01:05 KirkMiller

Completed the install using your world dataset; everything works (10x faster on the dedicated server). Since I am only interested in the area around the Juneau Icefield, I will be archiving the world dataset and replacing it with a small subset for the area of interest.

KirkMiller avatar May 01 '18 13:05 KirkMiller

I'm also interested in this. Were you able to get open-elevation to work with the 30m data?

Also, does anyone know what the disk size requirements would be for the 30m data of the whole world?

heyman avatar May 03 '18 13:05 heyman

Hi, I will be coming back with a reply soon. I've been very busy with my work :)

Jorl17 avatar May 03 '18 14:05 Jorl17

Any updates on this issue? I'm also interested in hosting the 30 m SRTM data

ghost avatar Jul 20 '18 00:07 ghost

I have moved the open-elevation software onto a new server and am able to get 9.5K elevations / second with a symmetric load.

KirkMiller avatar Sep 09 '18 23:09 KirkMiller

Seriously interested in accurate elevation data (I feel a new hobby arising ;)

From what I read here, the elevation data currently contains the 90m SRTM data? It would be awesome if it contained the 30m data! Is there a way I (and others?) can help with this?

Also, in ancient history I made several GPS traces and uploaded them to OSM (and improved the maps that way). I expect there are many GPS traces in OSM. Is there a way to use that data to improve the elevation data?

Concluding.. before I found this (happy that I did!) I was thinking about a similar system.. but I was also thinking: wouldn't it be better to somehow incorporate the data into OSM? (https://wiki.openstreetmap.org/wiki/Altitude)

mrAceT avatar Sep 26 '18 11:09 mrAceT

@mrAceT ,

How big is the area that you want? What resolution would you be happy with? I see you are offering to host data, what storage size are you willing to provide? Do you have basic programming knowledge?

helios-hyperion avatar Sep 28 '18 02:09 helios-hyperion

How big is the area that you want?

I want it all ;)

What resolution would you be happy with?

The 30m SRTM would be a good start, since I understand that dataset covers the complete globe (I have found a way to access the raw data.. now for the conversion of the HGT files..). I also found the websites of NOAA (National Oceanic and Atmospheric Administration) and ArcGIS, where I see there is also quite a bit of (seriously) high-resolution data. I'm afraid that would be a next phase (I think the space required will become extreme...)
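Something like this is what I have in mind for the HGT conversion (a sketch only: SRTM1 30m .hgt tiles are raw big-endian 16-bit integers, 3601×3601 samples, named after their south-west corner; numpy and rasterio are my choice here, not anything prescribed by open-elevation):

```python
import os
import numpy as np
import rasterio
from rasterio.transform import from_origin

def hgt_to_geotiff(hgt_path, tif_path):
    # SRTM1 tiles: 3601x3601 big-endian signed 16-bit samples; the file
    # name (e.g. N58W134.hgt) gives the tile's south-west corner.
    name = os.path.basename(hgt_path)
    lat = int(name[1:3]) * (1 if name[0] == "N" else -1)
    lon = int(name[4:7]) * (1 if name[3] == "E" else -1)
    data = np.fromfile(hgt_path, dtype=">i2").reshape(3601, 3601)
    # Top-left corner is (lon, lat + 1); pixels are 1 arc-second square.
    transform = from_origin(lon, lat + 1, 1 / 3600, 1 / 3600)
    with rasterio.open(
        tif_path, "w", driver="GTiff", height=3601, width=3601, count=1,
        dtype="int16", crs="EPSG:4326", transform=transform,
        compress="deflate", nodata=-32768,  # -32768 marks SRTM voids
    ) as dst:
        dst.write(data.astype("int16"), 1)

hgt_to_geotiff("N58W134.hgt", "N58W134.tif")
```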

I see you are offering to host data, what storage size are you willing to provide?

I think (hope?) the space will not be the problem; bandwidth and CPU power more so.. But first I need to get things going ;) Next month I will be quite busy relocating my company... then I want to start fiddling ;)

Do you have basic programming knowledge?

I own an ICT company. I can manage (moderate to expert) PHP, SQL, Perl, JS, CSS, SH, HTML(5)

Ehm, that answers all your questions ;) Are you offering to help?

mrAceT avatar Sep 28 '18 07:09 mrAceT

Well, it's just that I don't understand why everyone is messing around with these 30m datasets (such an ugly resolution) instead of just going directly to Google... [which is what I did]. Of course, 30m sets are free, but so is walking around with cataracts.

What do you want to use the data for?

Regarding storage size, if you think about it like this:

1 point = 3 coords, so let's say you have 1 point: [6754802.134486316,703953.4065683074,1008.034790039062]. Taking that a point in this form costs 51 [number] + 2 [comma] bytes, you can easily calculate the amount of space needed for a grid spanning the globe... depending on your resolution, of course. And this is still inefficient CSV file storage, but I did not care about compression because, well, it has not become a problem yet. And do you really need 9-decimal accuracy?
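If you want to play with those numbers, here is a quick back-of-envelope sketch (the ~53 bytes/point comes from the example point above; the resolution is whatever grid spacing you choose):

```python
# Rough storage estimate for a global grid of CSV points like the example
# above: ~53 bytes per point (51 bytes of digits plus 2 commas).
EARTH_SURFACE_KM2 = 510_072_000  # total surface area of the Earth

def csv_storage_tb(resolution_m, bytes_per_point=53):
    """Approximate terabytes for one point per resolution_m x resolution_m cell."""
    cell_km2 = (resolution_m / 1000.0) ** 2
    points = EARTH_SURFACE_KM2 / cell_km2
    return points * bytes_per_point / 1e12

for res in (90, 30, 10):
    print(f"{res:>2} m grid: ~{csv_storage_tb(res):,.0f} TB of raw CSV")
# 30 m comes out around 30 TB -- binary formats at 2 bytes per sample
# (like .hgt or int16 GeoTIFF) cut that by a factor of ~25 before compression.
```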

Coding this in Python is pretty damn easy.... it took me about 2 weeks... and I just dabble in coding.

Also, a little secret: it did not cost me anything :P But this results in unbelievably "slow" point acquisition. But you know... it's not like you need to "cross the Atlantic in a canoe with a dipstick" slow.

Anyway, contact me by email if you are interested.

helios-hyperion avatar Sep 28 '18 08:09 helios-hyperion

Ehm, do I understand correctly that you dug up (or are still digging?) the data from uncle Google? The nasty thing is, it is processed data, not original, and also most likely copyrighted..

For personal usage, great! (I once had a hobby building geo-zipcode databases from all kinds of online sources; I have some 40 countries lying around somewhere.) When it is for personal usage, copyright is much less of an issue ;) But if one wants to add something to the online community, copyright can be "a b.tch".. The base data is quite often open source.. so I am getting a bit of an "OSM feeling". Google has the data, and probably has it quite good too.. but OSM is "of the people" and I really, really like that.. map data should not be owned by some company..

Now I am discovering it is the same with elevation data.. the basics are public domain (the surveys have been done by public domain institutions, or the root data has been made public) but the gathering and interpretation of the data seems to be "owned" data.

Moreover, when I observe the data at NOAA and compare it to Google, I seriously get the impression that "not even" Google has the best data..

Starting to wonder.. how many terabytes would I need.....

All in all, I want open source data, from the source.. I thought it would be a good idea to start with the 30m data and see where "it ends" (ALOS Global DSM looks quite nice too ;)

mrAceT avatar Sep 28 '18 12:09 mrAceT

From here, I understood that all is fair in love and warcraft. If you are not selling Google's grain and cows as your own, Google will not be angered.

Any service listed online uses this. Hence the:

NOTE: To look up elevations using this page, you'll need to get your own Google Maps API key.

I just did not want to use their interface...

When you are walking around with a handheld GPS, the quality of the GPS data depends on the quality of the GPS service you are subscribed to... if you pay for only 3 satellites, you will be hopping around like a space invader. Google does not have 8-bit graphics.

But: I see your point [hue hue hue]. At the risk of this turning into a philosophical discussion about the direction of humanity's current heading, I can only say this: if you want something done, it's much better to build an army of autonomous robots to do it for you... which, of course, can be done, but funding is always a b.tch :)

In short: if you want elevation data that has higher resolution than 30m, you need to think why someone else has not done it yet. Then, you need to think why you are not the first to think this. Then, you need to think if you are going to be the first to do this. Then, we can seriously think of how to do it.

EDIT: changed "more accurate" to "higher resolution" on account of @KirkMiller 's following comment

helios-hyperion avatar Sep 28 '18 14:09 helios-hyperion

I found that Google elevation is NOT as accurate as The National Map. I have downloaded the USGS dataset(s) and have a paper in prep comparing the USGS NED to SRTM-30 and Google. I will post it in another 30 days, along with the script required to load the NED into Open-Elevation.

KirkMiller avatar Sep 30 '18 18:09 KirkMiller

I think I might have mistaken the direction of this project... I was under the impression that we were trying to achieve something of the magnitude showcased by Pinky and the Brain... and not a Minecraft project.

Upon reviewing the OP's posts, I now see that my input is irrelevant to this project.

Eh. For my needs, Google was best. There was nothing else for the area :) Good luck to thee, as [I] have no time for verbal bric-à-brac.

helios-hyperion avatar Oct 01 '18 04:10 helios-hyperion

@KirkMiller, can you link to that paper (when you publish it)? Interesting! I'm especially curious how you compared that data. The USGS data (1) covers "only" the North American continent; SRTM-30 covers the whole world.

If I read you correctly, I get the impression that you have made a script to load the SRTM-30 data into Open-Elevation? Would you be willing to share that script?

@helios-hyperion, your last post, ehm, I'm not quite sure what you meant? (I'm Dutch, maybe that is it ;)

In short: if you want elevation data that has higher resolution than 30m, you need to think why someone else has not done it yet.

I think the sheer magnitude of the amount of data is mind-boggling..

Then, you need to think why you are not the first to think this.

Why? NOAA (2) is pretty much complete, I think, in regard to oceanic data (it is missing the MH370 data though?). I also found 'vterrain' (3); some interesting reading there.. Neither covers the whole world, and neither has the "OSM idea".

Then, you need to think if you are going to be the first to do this. Then, we can seriously think of how to do it.

For just one person I think it's going to be nearly impossible.. When I find the time I'll start with implementing the open-elevation server. I think the right basis is the SRTM 30m data, to get a grasp of and feeling for the project, and then build upon that...

(1) USGS data: https://viewer.nationalmap.gov/basic/
(2) NOAA bathymetry: https://maps.ngdc.noaa.gov/viewers/bathymetry/
(3) vterrain: http://vterrain.org/

mrAceT avatar Oct 01 '18 10:10 mrAceT

@mrAceT (translated from Afrikaans:) It is good to hear that you are a Dutchman... a very beautiful country. Very flat contours. My last letter was about a certain gentleman splitting hairs between "precise" and "resolution"... and then I lost my temper. That is probably just because I am Afrikaans :)

ok, back to God's language:

Well, basically, yes. The sheer magnitude of the data is not just mind-boggling; I think it would be close to impossible to host for mere human beings such as us... although, I have been exposed to an idea with regards to biological compression, but it's not there yet... My knowledge of compression and biological systems is pretty limited at the moment. [If you have any study material that is worth the time, I would appreciate it.]

Which is why I was thinking: if you start with your 30m maps, Google can fill in the small areas that you are interested in... but yeah...

The problem faced (in my mind) when looking for "OSM", is as follows:

  1. you need a device [think smartphone, but then, who owns the accelerometer, the GPRS, the the the... data?]
  2. you need software [when I worked with smartphones, I found that if you do not write your own, you are very limited in application... and you can't trust the data... cause of the unknown calibrations done]
  3. you need users [or robots hehehehe]
  4. and of course, all the interconnecting fluff...

Now, in today's world, this means that somewhere someone has their dirty little stamp over it. Unless, of course, you create your own answer to the above-mentioned needs. I am not saying this or that, I am just spitballing and brainstorming here...

And "nearly impossible" is good enough for me to attempt...

helios-hyperion avatar Oct 02 '18 05:10 helios-hyperion

@helios-hyperion (warning: Dutch, translated ;) With my knowledge of the Dutch language I managed quite well to read your first sentence ;) Fascinating to see how the Dutch language could develop in completely different directions over a few hundred years!

(warning: Frisian, translated :P ) But when you see my avatar, the flag of Fryslân, you can guess that I'm actually deeply Frisian as well ;)

But back to a more universal language :P

I think you are a few steps further. I'm thinking along these lines:

  1. install the open-elevation server and play around with it (see the lookup sketch just after this list)
  2. get SRTM 30m data in there as a basis and get a feel for data..
  3. figure out a multiple database structure
  4. maybe use a completely different database?
  5. find a way to integrate the usage of multiple databases
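For step 1, "playing around" could be as simple as this once the server is up (a sketch assuming a local instance on port 8080, which I believe is the default in the open-elevation docs, and its documented /api/v1/lookup endpoint):

```python
import requests

# Query a locally hosted open-elevation instance; multiple points are
# separated by "|" in the documented locations parameter.
resp = requests.get(
    "http://localhost:8080/api/v1/lookup",
    params={"locations": "58.57,-134.35|52.37,4.90"},  # Juneau Icefield, Amsterdam
    timeout=10,
)
resp.raise_for_status()
for r in resp.json()["results"]:
    print(f'({r["latitude"]}, {r["longitude"]}) -> {r["elevation"]} m')
```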

Mind you, I haven't even started yet ;) I found QGIS (looks nice).. but for now I'm stuck on getting my hands on TIF files for open-elevation..

What data format would be best (no, or only marginal, data loss) relative to the storage requirements? (I was researching some data formats; wow, there are a lot!)

Talking about impossible.. made me think of: https://www.youtube.com/watch?time_continue=8&v=Pw2sex1mJNI Well, that wasn't.. maybe this isn't either ;)

mrAceT avatar Oct 02 '18 10:10 mrAceT

I've been working on an API that's pretty stable now for my uses, it has 30m SRTM, 10m NED, and some other open datasets.

Code: ajnisbet/opentopodata Docs & public API: www.opentopodata.org

I'd appreciate any feedback!

ajnisbet avatar May 22 '20 23:05 ajnisbet

I've been working on an API that's pretty stable now for my uses, it has 30m SRTM, 10m NED, and some other open datasets.

Code: ajnisbet/opentopodata Docs & public API: www.opentopodata.org

I'd appreciate any feedback!

Thanks, OpenTopoData as a resource and API is great!

I started trying a few APIs I could host myself, and Open-Elevation was pretty nice for testing since the SRTM 30m dataset is only 18.5 GB. But now I need a better dataset. I downloaded NED 1/3 arc-second ~10m (but just US + Alaska, fine for now) and Mapzen (various datasets compiled into worldwide 10-250m coverage, but I think it's a bit outdated).

I got NED ~10m somewhat working with this running locally, and am using it as a hybrid system: direct calls to the modules in Python for the added performance (imported as libraries), plus the API for web stuff. I'm fully generating elevation maps, and I wrote some code for generating line flow paths using elevation data (which I was quite proud of, but am still refining). I am having tons of fun with this; if anyone is interested in sharing code and/or knowledge, let me know.

aliasfoxkde avatar Jun 13 '23 04:06 aliasfoxkde

@aliasfoxkde Would love to pick your brain. How do we connect? We're working on hosting our own elevation API, starting with Open-Elevation and replacing it with better US data. It sounds like you're much further down the path.

troutinsights avatar Dec 19 '23 04:12 troutinsights

@troutinsights The 10m NED dataset is currently the highest-resolution covering the US.

There is also 1m lidar data covering about half the US, constantly updated and expanded! But it's made up of lots of datasets which can vary somewhat in quality and occasionally don't merge seamlessly. It's also huge: 10+ TB!

And finally there's a 3m dataset, but it's no longer being developed and it's not great quality.

If you're working in Python, I'd recommend working with compressed GeoTIFFs (both the 10m and 1m US datasets are in this format), and looking into the rasterio library for reading them quickly and correctly.
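For example, a minimal lookup might look like this (the file name is hypothetical; note that rasterio's sample() takes (x, y) pairs, i.e. (lon, lat) for a dataset in EPSG:4326):

```python
import rasterio

# Sample elevations straight out of a compressed GeoTIFF. rasterio reads
# only the internal blocks it needs, so this stays fast on large files.
points = [(-134.35, 58.57), (-112.08, 33.45)]  # (lon, lat) pairs

with rasterio.open("ned_10m_tile.tif") as src:  # hypothetical file name
    for (lon, lat), values in zip(points, src.sample(points)):
        print(f"({lat}, {lon}) -> {values[0]} m")  # one value per band
```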

ajnisbet avatar Dec 19 '23 05:12 ajnisbet

You can call me at 716 870 7755 to discuss what we are doing with the 10m DEM and ESA Sentinel-2.


KirkMiller avatar Dec 19 '23 12:12 KirkMiller

@aliasfoxkde Would love to pick your brain. How do we connect? We're working on hosting our own elevation API, starting with Open-Elevation and replacing it with better US data. It sounds like you're much further down the path.

@troutinsights sorry, I never saw this reply. My use case was pretty unique/specific: I was actually making elevation tools for where I work (and learning on the side). I got what I needed working and moved on to different projects.

What I did was basically use the gdal_interface.py from Open-Elevation as-is, and modify part of the API scripts to create an elevation.py library I can call from my own code. I found the API itself a little slow and heavy for my needs, but the little library works great through Python: it's about 100x faster, and I've used it to create several elevation-based ArcGIS Pro Toolbox tools for my GIS team. I'm a GIS Developer for a response company.
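The shape of that wrapper is roughly this (a sketch from memory: it assumes the GDALTileInterface class and the lookup(lat, lng) call that open-elevation's server.py uses, with a data folder laid out by the repo's create-dataset scripts):

```python
# elevation.py -- thin library wrapper around open-elevation's GDAL layer,
# skipping the HTTP server entirely.
from gdal_interface import GDALTileInterface  # from the open-elevation repo

_interface = GDALTileInterface('data/', 'data/summary.json')
_interface.create_summary_json()  # build (or load) the tile index once

def elevation(lat, lng):
    """Return the elevation in metres at (lat, lng) from the local tiles."""
    return _interface.lookup(lat, lng)

if __name__ == '__main__':
    print(elevation(58.57, -134.35))
```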

Getting higher-resolution elevation data was honestly kind of a pain, and formatting it took up a lot of space, because the files come as an archive (copy #1), you have to extract them (copy #2), process them into GeoTIFFs (copy #3), and then split them out into chunks to be used with the elevation library (4th and final copy). The USGS 1/3 arc-second ~10m dataset was about 450 GB and took ~2 TB to process.
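The final chunking step can be scripted with GDAL's Python bindings instead of the gdal_translate/gdal_retile.py shell tools; a rough sketch, with a hypothetical mosaic name and an arbitrary chunk size:

```python
from osgeo import gdal

SRC, CHUNK = "ned_10m_mosaic.tif", 3600  # hypothetical mosaic; 3600x3600-pixel chunks
src = gdal.Open(SRC)
for row in range(0, src.RasterYSize, CHUNK):
    for col in range(0, src.RasterXSize, CHUNK):
        gdal.Translate(
            f"chunks/{row}_{col}.tif", src,
            # srcWin = [x offset, y offset, width, height] in pixels
            srcWin=[col, row,
                    min(CHUNK, src.RasterXSize - col),
                    min(CHUNK, src.RasterYSize - row)],
            creationOptions=["COMPRESS=DEFLATE", "TILED=YES"],
        )
```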

Due to poor documentation, no straightforward or free API (other than ones to download DEMs by grid, etc.), and the just plain cumbersome process... I wanted to modify a fork of Open-Elevation or OpenTopo to make it easy to self-host any dataset as an API or through Python (and simply pre-process the data into the format that is needed, without all the other steps). I have an unlimited webhost where I can put the final chunked GeoTIFFs (and/or torrent them and point to a download script, idk), and I could even host an API (but I would prefer a pool of hosts behind a DNS round robin or something).

I'm getting back into this all again because I want the latest data and I am creating a webtool/widget, so I will likely need the API after all (though I might just rewrite the elevation library in C#/.NET/ASPX to serve DEMs web-facing: it can connect locally to the DEMs, it's lightweight, and it's the backend my company already has set up). I'm really only in the planning stages now, but I figured I could get the API running to benchmark performance (officially this time). If I remember correctly, though, its performance is kind of poor, at least for the primary thing I need it for, which is flow pathing (passing a bunch of points to a heapq algorithm to figure out the path of least resistance by elevation; for the whole thing, pure Python takes ~1 sec).
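For anyone curious, the flow-pathing part is essentially a Dijkstra search over the elevation grid using heapq; a toy version (the grid, step cost, and 4-neighbourhood are illustrative, not my actual tool):

```python
import heapq
import numpy as np

def flow_path(elev, start, goal):
    """Dijkstra over a grid where a step costs the climb (descents are free),
    so the result follows the path of least resistance by elevation."""
    rows, cols = elev.shape
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist[(r, c)]:
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                climb = max(0.0, float(elev[nr, nc] - elev[r, c]))
                nd = d + climb + 1e-6  # tiny base cost keeps flat paths short
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

demo = np.random.rand(50, 50) * 100.0  # fake 50x50 elevation tile
print(len(flow_path(demo, (0, 0), (49, 49))), "cells in the path")
```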

So with all that said, if you're still interested in getting something working, I would like to discuss and sorry I missed your initial inquiry.

aliasfoxkde avatar Apr 27 '24 03:04 aliasfoxkde