Local File List for Project 'XY' empty
I'm playing around with a self-hosted instance of QFieldCloud. As far as I can tell it's working: I can log in to the Django interface, and with QFieldSync I can upload a project. However, when I want to download that same project in QField, I always get a "File Not Found" error:
Output:
::<<<::33755800-6ec9-4f57-a995-586c01bad2cf Start QGIS Application
14:33:06.149 root INFO Starting QGIS app version 33601 (3e589453264)...
14:33:07.871 root INFO QGIS app started!
::>>>::33755800-6ec9-4f57-a995-586c01bad2cf 2
::<<<::d3352db8-2f9b-4d1a-bc81-5568732179bc Download Project Directory
14:33:07.871 root INFO Preparing a temporary directory for project files…
14:33:08.231 root INFO Downloading project files…
14:33:08.235 root INFO Downloading project files finished!
14:33:08.235 root INFO Getting project files list…
14:33:08.239 root INFO Local files list for project "a9507894-5dd5-4731-9493-4bbed4da54ac": empty!
::>>>::d3352db8-2f9b-4d1a-bc81-5568732179bc 2
::<<<::766b5187-249c-4fff-b1b1-13eda7494542 QGIS Layers Data
14:33:08.239 ENTRYPNT INFO Extracting QGIS project layer data…
14:33:08.243 root INFO Loading QGIS project "/tmp/tmp4a9la3l7/files/XYZ.qgs"…
::>>>::766b5187-249c-4fff-b1b1-13eda7494542 1
14:33:08.372 root INFO Stopping QGIS app…
14:33:08.639 root INFO Deleted QGIS app!
Feedback:
{
  "container_exit_code": 0,
  "error": "/tmp/tmp4a9la3l7/files/XYZ.qgs",
  "error_class": "FileNotFoundError",
  "error_origin": "container",
  "error_stack": [
    " File \"/usr/src/app/qfc_worker/utils.py\", line 667, in run_workflow\n return_values = step.method(**arguments)\n",
    " File \"/usr/src/app/entrypoint.py\", line 137, in _extract_layer_data\n project = open_qgis_project(str(project_filename))\n",
    " File \"/usr/src/app/qfc_worker/utils.py\", line 226, in open_qgis_project\n raise FileNotFoundError(project_filename)\n"
  ],
  "error_type": "FILE_NOT_FOUND",
  "feedback_version": "2.0",
  "outputs": {
    "download_project_directory": {},
    "start_qgis_app": {}
  },
  "steps": [
    {
      "id": "start_qgis_app",
      "name": "Start QGIS Application",
      "returns": {},
      "stage": 2
    },
    {
      "id": "download_project_directory",
      "name": "Download Project Directory",
      "returns": {
        "tmp_project_dir": "<non-serializable: PosixPath /tmp/tmp4a9la3l7>"
      },
      "stage": 2
    },
    {
      "id": "qgis_layers_data",
      "name": "QGIS Layers Data",
      "returns": {},
      "stage": 1
    },
    {
      "id": "package_project",
      "name": "Package Project",
      "returns": {},
      "stage": 0
    },
    {
      "id": "qfield_layer_data",
      "name": "Packaged Layers Data",
      "returns": {},
      "stage": 0
    },
    {
      "id": "stop_qgis_app",
      "name": "Stop QGIS Application",
      "returns": {},
      "stage": 0
    },
    {
      "id": "upload_packaged_project",
      "name": "Upload Packaged Project",
      "returns": {},
      "stage": 0
    }
  ],
  "workflow_id": "package_project",
  "workflow_name": "Package Project",
  "workflow_version": "2.0"
}
Via the admin interface I can check that all the files are there. Any idea in which direction I could investigate? Maybe a MinIO issue?
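To rule the storage layer in or out, the MinIO bucket can also be inspected directly. This is only a minimal sketch using boto3; the endpoint, credentials, bucket name, and key prefix below are placeholders that have to be replaced with the values from the self-hosted configuration, and the actual key layout inside the bucket may differ.

```python
import boto3

# All connection details are placeholders; take them from your
# QFieldCloud .env / docker-compose configuration.
s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:9000",     # MinIO endpoint (placeholder)
    aws_access_key_id="minioadmin",            # placeholder
    aws_secret_access_key="minioadmin",        # placeholder
)

BUCKET = "qfieldcloud-local"                             # placeholder bucket name
PREFIX = "a9507894-5dd5-4731-9493-4bbed4da54ac"          # project id from the log; real prefix may differ

# List every object under the assumed prefix (drop Prefix to list the whole
# bucket) and check whether the .qgs file really exists in object storage.
resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```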
Update: After the initial upload to the cloud, I have to save the project once more and upload the .qgs file again; then it works. My guess is that the cloud somehow doesn't wait until the initial upload has finished before calling the worker.
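As a stopgap, that manual "upload the .qgs again" step could also be scripted. This is purely a hypothetical sketch: the files/<project-id>/<filename>/ push endpoint, the Token auth header, and the multipart field name "file" are assumptions about the API and should be verified against the instance before relying on it.

```python
import requests

BASE_URL = "https://qfieldcloud.example.com"            # your instance (placeholder)
TOKEN = "..."                                            # API token (placeholder)
PROJECT_ID = "a9507894-5dd5-4731-9493-4bbed4da54ac"      # project id from the log
QGS_NAME = "XYZ.qgs"                                     # local project file (placeholder)

# Re-push the .qgs file; endpoint path and multipart field name are assumptions.
with open(QGS_NAME, "rb") as fh:
    resp = requests.post(
        f"{BASE_URL}/api/v1/files/{PROJECT_ID}/{QGS_NAME}/",
        headers={"Authorization": f"Token {TOKEN}"},
        files={"file": fh},
        timeout=60,
    )
resp.raise_for_status()
print("re-uploaded", QGS_NAME, resp.status_code)
```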
It would be interesting to know what list of files is returned from the remote. Whatever the issue is, it should be somewhere around this snippet:
https://github.com/opengisch/qfieldcloud/blob/master/docker-app/qfieldcloud/core/views/files_views.py#L285-L324
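One way to see that from the client side is to dump the project's file list straight from the API and compare it with what the admin interface shows. A minimal sketch; the /api/v1/files/<project-id>/ path and the Token auth header are assumptions and may need adjusting for the instance.

```python
import json
import requests

BASE_URL = "https://qfieldcloud.example.com"             # self-hosted instance (placeholder)
TOKEN = "..."                                             # API token (placeholder)
PROJECT_ID = "a9507894-5dd5-4731-9493-4bbed4da54ac"       # project id from the log

# Fetch and pretty-print whatever the remote reports for this project,
# so it can be compared with the files visible in the admin interface.
resp = requests.get(
    f"{BASE_URL}/api/v1/files/{PROJECT_ID}/",
    headers={"Authorization": f"Token {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
print(json.dumps(resp.json(), indent=2))
```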