pyrosm
Segmentation Fault when attempting `get_buildings` and `get_network`
Hello! Thanks for making this library. I am really enjoying using it so far.
I am running into segmentation faults and wanted to ask how to resolve them. My computer has higher specs than the benchmark machine, including more RAM, so I am unsure why it fails to run at all.
To make a test case I picked a very small area and ran the following on my machine with 32 GB of RAM.
from pyrosm import OSM, get_data
# Get map data
map_data = get_data('london')
osm = OSM(map_data, bounding_box=[-0.071654, 51.501862, -0.048006, 51.511343])
osm.get_network()
This gives a segfault after about 5 seconds, with RAM use climbing to 9 GB. I also ran it without the bounding box; it still segfaults, but RAM use reaches about 19 GB and it takes roughly 10 seconds to fail.
Do you have any idea what could be wrong?
Upon further research, this is the exact issue I'm having in https://github.com/HTenkanen/pyrosm/issues/166. Jupyter doesn't provide logging as verbose as running it normally does.
Hmm, interesting. Thanks for raising this! This might have something to do with a new version of Cython; I need to investigate a bit more. Could you share which version of Cython you are using and which operating system?
A possibly relevant issue related to this: https://github.com/scipy/scipy/issues/14732#issuecomment-934611750
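In case it helps gather that information, here is a minimal sketch for reporting the environment details asked about above (it assumes Cython and pyrosm both expose a __version__ attribute, which I believe they do):

import sys
import platform

import Cython
import pyrosm

# Report the interpreter, OS, and library versions relevant to this issue
print("Python :", sys.version.split()[0])
print("OS     :", platform.platform())
print("Cython :", Cython.__version__)
print("pyrosm :", pyrosm.__version__)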
It's working on my Mac with the following environment: Cython 0.29.15, Python 3.7.6, pyrosm 0.6.0.
On Ubuntu: Cython 0.29.24, Python 3.9.7, and the newest pyrosm.
I just reinstalled everything and now things seem to be working in the same environment with all of the Helsinki data. This is the Cython version:
Cython==0.29.26
in a Debian VM.
The same / a similar issue is also happening for me with `get_pois`.
Fedora 34 (in a Qubes VM), pyrosm installed for the first time today (11th Jan 22), Python 3.9.9, Cython 0.29.26.
Watching memory usage with htop shows consumption growing by anywhere from 0.02 GB (slow) to 0.8 GB (fast) between htop updates.
Code used:
from pyrosm import OSM, get_data

fp = get_data("Karlsruhe", directory="Downloads")
osm = OSM(fp)
my_filter = {"nodes": ["1989098258"]}  # not sure if this works, can't find any info on this...
pois = osm.get_pois(my_filter)
--> "Killed" once memory runs out
I have this problem when installing the package with pip. I tried with Python 3.8, 3.9 and 3.10. The problem does not occur with conda.
Also having this issue. Any updates?
The only solution I've found is using Conda instead of pip.
The latest version in conda didn't work for me. I got a "truncated error" message when running the osm.get_buildings() / osm.get_network() functions:
from pyrosm import OSM, get_data
# Get map data
map_data = get_data('london')
osm = OSM(map_data, bounding_box=[-0.071654, 51.501862, -0.048006, 51.511343])
osm.get_network()
I was able to get it working in Conda with:
python==3.8.12
pyrosm==0.6.1
cython==0.29.28
cykhash==1.0.2
pygeos==0.10
It seems that pygeos 0.11 and newer cause the segmentation fault.
I can confirm that pygeos>0.10 seems to cause the issue with get_buildings():
pygeos v0.12.0 -> segfault
pygeos v0.10 -> no segfault
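For anyone else triaging, a quick sanity check of which pygeos version is actually imported before calling the heavy functions can help confirm this (a minimal sketch; the bounding box is just the London example from earlier in this thread):

import pygeos
from pyrosm import OSM, get_data

# Verify which pygeos is loaded in this environment;
# reports above suggest versions newer than 0.10 trigger the segfault.
print("pygeos:", pygeos.__version__)

map_data = get_data("london")
osm = OSM(map_data, bounding_box=[-0.071654, 51.501862, -0.048006, 51.511343])
buildings = osm.get_buildings()
print(len(buildings))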
Commenting to confirm the issue still persists in a Jupyter environment with 25.5 GB of RAM and 107 GB of disk space when running the osm.get_network("driving") example from the documentation with the "new_york" PBF data, even with pygeos==0.10.