overpass-api-python-wrapper
timeout value should be for the Overpass query, not for the connection
OverpassQL supports a [timeout:N] setting in the global settings of a query. The timeout value in the API class, however, is used for the TCP connection instead. This doesn't give the Overpass server a chance to reject ridiculously large timeouts, and it also doesn't allow queries longer than the default Overpass timeout of 180 seconds. So if you set timeout to 300 on a complex query, it will still fail at 180 seconds. (Note: I haven't verified this, but am assuming this is the behavior from reading the code.)
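To illustrate the difference I mean (just a sketch, not the wrapper's actual code; the endpoint URL and helper names here are my own): a connection timeout is enforced locally by the HTTP client, while [timeout:N] travels with the query so the server itself can honour or reject it.

import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"  # assumed public endpoint

def run_with_connection_timeout(query, timeout=300):
    # Only the local HTTP client gives up after `timeout` seconds;
    # the server still applies its own default of 180 s to the query.
    return requests.post(OVERPASS_URL, data={"data": query}, timeout=timeout)

def run_with_query_timeout(query_body, timeout=300):
    # [timeout:N] is part of the query itself, so the server knows to
    # allow (or explicitly reject) a longer run time.
    query = "[out:json][timeout:%d];%s" % (timeout, query_body)
    return requests.post(OVERPASS_URL, data={"data": query}, timeout=timeout + 30)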
I can corroborate this from hands-on observation while using the overpass library. For example, even after setting the timeout to 600 seconds when constructing the API, i.e. api = overpass.API(timeout=600),
a heavyweight query still fails with an error message indicating that the Overpass query timed out at around 180 seconds.
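For reference, a minimal version of what I am doing (the bounding-box query here is just a stand-in for my actual, much heavier query):

import overpass

api = overpass.API(timeout=600)
# Even with timeout=600, the query is cut off around the server's
# default limit of ~180 seconds, because no [timeout:...] is sent with it.
result = api.get('way["highway"](51.5,-1.5,52.0,-1.0)')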
I would suggest a workaround for anyone who faces the same issue as I did. First, api = overpass.API(build=False),
which avoids supplying the timeout parameter that the API would otherwise use in the unintended way described above. Then api.get('YOUR_FULL_QUERY'),
where your full query includes the [timeout:...] setting itself and looks something like:
[out:json]
[timeout:600];
{{geocodeArea:Oxfordshire}}->.searchArea;
(
node[name="McDonald's"][amenity=fast_food](area.searchArea);
way[name="McDonald's"][amenity=fast_food](area.searchArea);
relation[name="McDonald's"][amenity=fast_food](area.searchArea);
);
out body;
That way you can specify a long timeout period for your heavyweight query, and it works in the intended way; a sketch of the full flow follows below.
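Putting the steps together in Python (only a sketch; note that the {{geocodeArea:Oxfordshire}} shortcut above is Overpass Turbo syntax, so for a raw API call I substitute a plain area filter here):

import overpass

# build=False makes the wrapper send the query verbatim, so the
# [timeout:...] setting actually reaches the Overpass server.
api = overpass.API(build=False)

query = """
[out:json][timeout:600];
area["name"="Oxfordshire"]->.searchArea;
(
  node["name"="McDonald's"]["amenity"="fast_food"](area.searchArea);
  way["name"="McDonald's"]["amenity"="fast_food"](area.searchArea);
  relation["name"="McDonald's"]["amenity"="fast_food"](area.searchArea);
);
out body;
"""

response = api.get(query)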
Yes, good point. I’ve only recently discovered the build=False
option after reading the source code and have added some docstrings and created a pull request so that others might discover it as well.
Maybe I’ll write up an example, or copy yours, and get it added to the README.