Fatal error during test
Aggregate test results
Pass: 107
Improvements: 2
Fail: 46
Placeholders: 0
Regressions: 208
Took 18224ms
Test success rate 42.38%
FATAL ERROR: 208 regression(s) detected.
ivan@pelias:/code/docker/projects/north-america$
You can see the full traceback here; it is very large.
What could be the reason?
The pelias compose logs and pelias elastic stats commands should provide more information.
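For example, you can capture both outputs to files so they can be attached here (the file names below are just a suggestion):

# dump the container logs and the Elasticsearch document stats to files
pelias compose logs > compose_logs.txt
pelias elastic stats > elastic_stats.txt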
The pelias compose logs and pelias elastic stats commands should provide more information.
pelias elastic stats:
{
  "took" : 141,
  "timed_out" : false,
  "_shards" : {
    "total" : 4,
    "successful" : 4,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 0,
      "relation" : "eq"
    },
    "max_score" : null,
    "hits" : [ ]
  },
  "aggregations" : {
    "sources" : {
      "doc_count_error_upper_bound" : 0,
      "sum_other_doc_count" : 0,
      "buckets" : [ ]
    }
  }
}
And I attached the compose logs: compose_logs.txt
There are a lot of ECONNABORTED errors in the logs, and the stats command shows there are no documents in the Elasticsearch index.
Something is seriously wrong with your setup: either you didn't follow the documentation correctly, or your system is incorrectly configured for this scale of data.
Did you try out a smaller region first as suggested in the docs?
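If you want to double-check the index contents independently of the pelias wrapper, you can query Elasticsearch directly. This is just a sketch, assuming the default pelias index name and Elasticsearch published on localhost:9200 as in the standard pelias/docker compose file:

# count documents in the pelias index; after a successful North America build this should be far above zero
curl -s 'http://localhost:9200/pelias/_count?pretty'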
Something is seriously wrong with your setup: either you didn't follow the documentation correctly, or your system is incorrectly configured for this scale of data.
I followed step by step, command by command.
The only thing that was strange: after I made a polyline file from Valhalla and transferred it to the "/data/polylines" folder, I ran the "pelias prepare all" command, but in the console I saw:
ivan@pelias:/code/docker/projects/north-america$ pelias prepare all
Creating extract at /data/placeholder/wof.extract
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
/data/openstreetmap/north-america-latest.osm.pbf is very large.
You will likely experience memory issues working with large extracts like this.
We strongly recommend using Valhalla to produce extracts for large PBF extracts.
see: https://github.com/pelias/polylines#download!data
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
converting /data/openstreetmap/north-america-latest.osm.pbf to /data/polylines/extract.0sv
^CDone!
But I am sure that this file was there; I checked before starting the installation:
ivan@pelias:/code/docker/projects/north-america$ ls /data/polylines
extract.0sv
Something is seriously wrong with your setup: either you didn't follow the documentation correctly, or your system is incorrectly configured for this scale of data.
Otherwise, I followed step by step, command by command. The VM on which I did this fully meets the requirements; there is even a margin of performance and memory.
Any suggestions, or should I try to start all over again?
Hi @kshnkvn, I'm going to need some specifics on your environment before I can even comment.
Like what OS are you running? How is Docker configured? What CPU/RAM/disk have you allocated? etc.
Which smaller extract did you run and what was your experience with that?
@missinglink
Azure VM instance
Ubuntu 18.04.4 LTS
4 CPU
348.73 GB disk
32 GB RAM
Docker version 19.03.6
docker-compose version 1.25.4
Completely clean system, nothing is running.
Which smaller extract did you run and what was your experience with that?
I tried step by step on my PC and everything worked. All I did on the VM was change the config, as you advised, and also use Valhalla, but there was an error with it that I wrote about earlier. Now the Pelias container seems to be working, but I don't know what features are not available to me. In fact, I probably only need search... You can check for yourself: http://52.186.30.15:4100/demo/#eng
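If it is easier to test without the demo page, I believe the forward search endpoint can also be queried directly, something like this (assuming the API is published on port 4000, the pelias/docker default; adjust the host and port to whatever your compose file exposes):

# example query against the Pelias /v1/search endpoint
curl -s 'http://localhost:4000/v1/search?text=New+York'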
@missinglink I completely reinstalled Pelias for North America, step by step, every action. Now it's better, but there are still errors. I loaded the logs into separate files and attached them here: test_log.txt elastic_stats.txt compose_logs.txt
I followed step by step, command by command.
The only thing that was strange: after I made a polyline file from Valhalla and transferred it to the "/data/polylines" folder, I ran the "pelias prepare all" command, but in the console I saw:
ivan@pelias:/code/docker/projects/north-america$ ls /data/polylines
extract.0sv
If you created polyline data using Valhalla and have the resulting extract.0sv, you don't need the polyline prepare step (which extracts polyline data from the OSM file, and which runs when you issue the "pelias prepare all" command). Just issue the other "pelias prepare ..." commands separately, then import.
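Roughly, with the Valhalla-generated file already sitting at /data/polylines/extract.0sv, the remaining steps would look something like this (a sketch; the exact list of prepare commands depends on which importers you have enabled in pelias.json):

# skip "pelias prepare polylines" / "pelias prepare all", the extract.0sv from Valhalla is already in place
pelias prepare placeholder
pelias prepare interpolation
pelias import all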
@w0pr Yes, I know it, so the second time I did not do it, just ran pelias prepare placeholder.