
[BUG] Rebuild of Dashy in ProxMox cluster v 8.0.4 / LinuxTurnkey / docker with ZFS storage is crashing.

Open LuxBibi opened this issue 2 years ago • 33 comments

Environment

Self-Hosted (Docker)

System

ProxMox Ver 8.0.4 / 3 node Cluster - LXC - Debian GNU/Linux 11 (bullseye) - docker Version: 24.0.5

Version

Dashy version 2.1.1

Describe the problem

When I start a rebuild of the Dashy dashboard via the icon/menu (because I changed some *.yml files) -> Config -> Update Configuration -> Rebuild Application, the rebuild process "crashes" and Dashy is no longer usable at all.

Dashy works (using and rebuilding) like a charm on a Synology NAS and on ProxMox with the container on local-lvm storage, with the exact same image version. Problems only appear when the LXC (the Linux container where Docker is running) is on cluster-wide ZFS storage (shared between all the ProxMox nodes).

I am able to reproduce this error 100% of the time: every time I recreate a working Dashy and rebuild, it fails!

As soon as I move the docker container to local storage (local-lvm) on the ProxMox cluster, no problem at all.

Workaround, if someone encounters the same problem: move the LXC container disk to local storage (e.g. local-lvm), make all the changes you need, redeploy a new Dashy image and rebuild Dashy. Once Dashy is up and running again (Healthy status in Portainer), rebuild and validate that the actual set-up in the *.yml file(s) matches your needs. When that is OK, move the LXC container back to the cluster-wide ZFS storage to regain ProxMox cluster redundancy for Dashy. (Not very user-friendly and quite time consuming, but it works.)
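For reference, a rough sketch of this workaround as CLI commands on the Proxmox host; the CT ID (600) and storage names (local-lvm, cluZFS-1) are taken from later in this thread and may differ on other setups:

# stop the LXC that hosts Docker/Dashy
pct shutdown 600
# move its root disk to local storage, then start it again
pct move-volume 600 rootfs local-lvm
pct start 600
# ... redeploy / rebuild Dashy here, then move the disk back to the shared ZFS storage ...
pct shutdown 600
pct move-volume 600 rootfs cluZFS-1
pct start 600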

Additional info

dashyRebuild-BuildOperationFailed _Dashy-it-4-Home_logs.txt


LuxBibi avatar Oct 01 '23 14:10 LuxBibi

If you're enjoying Dashy, consider dropping us a ⭐
🤖 I'm a bot, and this message was automated

liss-bot avatar Oct 01 '23 14:10 liss-bot

Just to mention that Dashy is for me by far the best dashboard solution to meet my needs, mainly because of the multi-page and easy search functionality. Thanks Liss ;-)

LuxBibi avatar Oct 01 '23 14:10 LuxBibi

Hi, can you share your Dashy logs? And your Proxmox LXC container config? It is located in /etc/pve/lxc/<CTID>.conf (on the Proxmox host).

CrazyWolf13 avatar Oct 19 '23 17:10 CrazyWolf13

Hi, here is the 600.conf (prio-2 docker host, CT 600 = dashy):

arch: amd64
cores: 2
features: keyctl=1,nesting=1
hostname: dockerPrio2Services
memory: 2048
net0: name=eth0,bridge=vmbr0,gw=192.168.178.1,hwaddr=3E:CB:DD:D1:4C:DB,ip=192.168.178.202/24,type=veth
ostype: debian
rootfs: cluZFS-1:subvol-600-disk-0,size=8G
swap: 512
tags: docker;ha
unprivileged: 1
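(For reference, the same config can be dumped on the Proxmox host itself with either of the following; CT ID 600 as above:)

pct config 600
cat /etc/pve/lxc/600.conf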

I've attached the Dashy logs downloaded from Portainer. Let me know if you need another log file, as I am not able to shell into the Dashy container. (I looked on Google some months ago; this seems to be normal behavior.)

Steps done for your log files:

  • Stopped container in portainer

  • restarted dashy container

  • checked log file to be "normal"

    See file: "2023-10-20_Dashy-it-4-Home_logs after RESTART.txt" 2023-10-20_Dashy-it-4-Home_logs after RESTART.txt

  • tested dashy. Worked as expected

  • Rebuild of dashy without changing any *.yml file. "Just" started a rebuild:

    1. "Config"
    2. "Update Configuration"
    3. "Rebuild Application"
    4. "Start Build"
  • dashy error while rebuilding. dashy dashboard not displayed anymore (Chrome refresh)

image

image

  • downloaded log file of dashy-container via portainer

See file: "2023-10-20 [email protected] build NO CONFIG FILE CHANGE = BAD.txt" 2023-10-20 [email protected] build NO CONFIG FILE CHANGE = BAD.txt

Attached is the log file of the same container (restored in ProxMox), after moving it to local-lvm storage and doing a rebuild: "Dashy@2.1.1 build NO CONFIG FILE CHANGE = OK.txt"

Let me know if I can help to solve issue by providing more information.

Thanks so far, Luc

LuxBibi avatar Oct 20 '23 00:10 LuxBibi

That is indeed very confusing. Can you share why you use /home/home after the port in the URL of Dashy?

And can you share your browser console when dashy says "not found" ?

(Depending on your browser: right-click anywhere, then developer options)

Some random guide I found: https://balsamiq.com/support/faqs/browserconsole/#:~:text=To%20open%20the%20developer%20console,(on%20Windows%2FLinux) .

CrazyWolf13 avatar Oct 20 '23 04:10 CrazyWolf13

Hi CrazyWolf13 ;-) Hi all,

Took some time, as I wanted to make some tests by moving my ProxMox from ZFS to ceph ...

So before digging into the responses I was asked for: ceph does not give any problem whilst rebuilding the config in Dashy! I will switch back to ZFS and redo the same rebuild process by tomorrow at the latest, and see if the same problem still applies. I'll post my results.

Here are the answers to the questions. /home/home is because I am running Dashy with multiple pages ... I admit that the name of my first page does not help to understand this URL ;-) .. But this is OK for me .. so for the other pages I have /home/it-4-home ...

Here is the conf.yml file:

pageInfo:
  title: '| it-4-Home |'
  description: Access all the systems/services hosted @ | it-4-Home |
  footerText: ''
pages:
  - name: Home
    path: conf.yml
  - name: it-4-Home
    path: it-4-Home.yml
  - name: SmartHome
    path: SmartHome.yml
  - name: Networking
    path: Networking.yml
  - name: Monitoring
    path: Monitoring.yml
  - name: Music and Videos
    path: MusicAndVideos.yml
  - name: Documentation
    path: Documentation.yml
sections:

Just my point of view ... (I am not a developer at all ;-)): I presume that this bug is more related to how Dashy accesses files on different filesystems: ZFS versus local-lvm versus ceph. Other containers have behaved without any problem so far, while also running on any of these filesystems, but mainly on ZFS.

The only way to understand what is happening will be to have someone on your side debug the rebuild process step by step on ZFS, especially as this happens on every rebuild, be it with a minimalistic *.yml file or with my more complex multi-page *.yml set-up.

Developer console in the next post, as I have switched back to ZFS ..

Thanks so far Luc

LuxBibi avatar Oct 21 '23 13:10 LuxBibi

Hi guys,

I switched back to ZFS .. and the problem reoccurs exactly the same way as before .. So local-lvm and ceph are working ... Local-lvm is not an option, as I run a ProxMox cluster .. ceph consumes too many resources for this configuration .. So ZFS is the only possible option for me ..

Hope you'll find where this problem comes from ...

Hereafter the Chrome console while refreshing the page ..

image

Thanks so far ..

Luc

LuxBibi avatar Oct 21 '23 14:10 LuxBibi

Please look again through the guide I sent you and send me the output of the Browser Console.

CrazyWolf13 avatar Oct 22 '23 08:10 CrazyWolf13

Hello,

I was not at home for some days ... therefore some delay ..

Hereafter the information you asked for. Last time I sent the Network part, as the Console view only showed a few lines ..

This time I activated "All levels" in the menu and the previously unselected "Verbose" mode .. which finally showed more information. Screenshot and log files (preserved mode) are attached.

First Console file:

  • Navigated to my Dashy page.
  • Rebuild of Dashy with "Build operation failed"

image dashy.local-1698136920861.log

Second Console file:

  • Start of new Google Chrome (Microsoft Edge exact same behaving)
  • Activated the Console (Checked Verbose is still activated)
  • Navigated to my Dashy

This time the Dashy page appeared, showing all the different steps as it was trying to build its environment: "Initializing" / "Running Checks" / "Building" / "Finishing Off" / "Almost Done" / "Not Long Left" / "Taking Longer than Expected" 2023-10-24_002 PicPick

and it ended up on the Dashy error page ... with nothing displayed in the console. I've also tested in Edge, but it behaves exactly the same. image

Just some information which might be important for you: I stopped my Docker LXC on ProxMox, moved the root disk to local-lvm storage, restarted the LXC and navigated to my Dashy page. There was the exact same sequence of Dashy startup as described above after the rebuild, and it ended up in the same error. So the only way to get my config running again is to redeploy a new clean Dashy container via Portainer, as after this rebuild all the pages show the exact same layout! I had to rebuild the Dashy environment via the Dashy rebuild whilst on local-lvm storage. Worked like a charm. image

Log file of the successful rebuild attached: dashyRebuildOnLocal-Lvm OK.txt

Moving it afterwards to ZFS gives a running Dashy container on ZFS.

Let me know if you do need more ..

Luc

LuxBibi avatar Oct 24 '23 09:10 LuxBibi

One thought ... but to be validated, if this can help on your side to narrow/pinpoint the issue ...

I can export my container after I've regenerated Dashy on ZFS .. so you can import it on your side, and should be able to "see" what exactly the problem is, and maybe get a hint where this might come from .. ?

Just let me know ...

LuxBibi avatar Oct 24 '23 13:10 LuxBibi

Hi all,

Just checking whether this issue is still open, and whether someone needs more information? I am here to help get rid of this .. ;-)

LuxBibi avatar Nov 15 '23 00:11 LuxBibi

Hi @CrazyWolf13

I have moved the Dashy docker to be stored on local-lvm, knowing that this will disable, in my ProxMox cluster, the availability of Dashy in case of failure of the Docker host or the ProxMox host.

I therefore moved my Dashy docker storage back to cluZFS, and tried to apply the modifications I needed to make to the Dashy files.

It seems that this time an error message was displayed at the end of the log file in Docker, which did not appear in my previous faulty rebuilds.

Hope this helps to make dashy cluZFS compatible .. ;-)

Available to help, in this tricky issue.

Luc

docker-dashy-LOG-file_

[Dashy ASCII art banner]


Welcome to Dashy! 🚀 Your new dashboard is now up and running with Docker


Using Dashy V-2.1.1. Update Check Complete ✅ Dashy is Up-to-Date

dashy@2.1.1 build
vue-cli-service build
Error: ENOENT: no such file or directory, stat '/app/dist/index.html'
Error: ENOENT: no such file or directory, stat '/app/dist/index.html'
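A quick way to confirm from the Docker host that the build output really is missing (a sketch; the container name Dashy-it-4-Home is the one used elsewhere in this thread):

# list whatever the Dashy container currently has in its build output directory
docker exec Dashy-it-4-Home ls -la /app/dist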

LuxBibi avatar Nov 22 '23 17:11 LuxBibi

I sadly cannot help you any further with this; the best bet is to hope for Lissy to look into this. However, she is really busy and there are far more important things heavily needing her attention, so I guess right now it's a bit out of scope. But this is only my opinion, and maybe Lissy will look into this :)

CrazyWolf13 avatar Nov 22 '23 17:11 CrazyWolf13

Thanks @CrazyWolf13 for your fast reply.

Ok, I will keep an eye on this issue, so I can maybe help with some actions on my side once @Lissy93 has some time to take a look at this.

All is so far documented in this ticket. I would be happy to help you guys if any information is needed.

Thx so far. Thumbs up for this beautiful dashboard. Really great 🤩

Thanks, Tobias

LuxBibi avatar Nov 22 '23 19:11 LuxBibi

I'm going to be taking a look into this myself, and I'm wondering about a couple questions @LuxBibi:

  • Can you post the output of docker info on the host that initially led to the error (the LXC backed by ZFS)?
  • When you say cluster wide ZFS storage, what do you mean - do you mean a ZFS backed NFS/CIFS/SMB share, or some other out of band method (Multi-server DAS etc)?
  • What image tag exactly are you using? If possible, the full compose file would be nice as well
    • If you post any configuration/log file output here, make sure to use triple backticks (```) to format it nicely; for example, ```yaml yaml: here: true ``` becomes:

yaml:
  here: true

TheRealGramdalf avatar Jan 09 '24 03:01 TheRealGramdalf

Hi all, here is the different information you've asked for. Let me know if I misunderstood anything on my side and if any info is missing. I will do my best to provide everything you need to debug this issue.

Your question: - Can you post the output of docker info on the host that initially led to the error (the LXC backed by ZFS)?

--> The Docker host where Dashy is running is 'Prio2-0-srvc', built from the "debian-11-turnkey-core_17.1-1_amd64.tar.gz" template. (The same issue occurred with the node.js version "debian-11-turnkey-nodejs_17.1-1_amd64.tar.gz".)

Here is the output of 'docker info' on this LXC:

Client: Docker Engine - Community
 Version:    24.0.7
 Context:    default
 Debug Mode: false
 Plugins:
  buildx: Docker Buildx (Docker Inc.)
    Version:  v0.11.2
    Path:     /usr/libexec/docker/cli-plugins/docker-buildx
  compose: Docker Compose (Docker Inc.)
    Version:  v2.21.0
    Path:     /usr/libexec/docker/cli-plugins/docker-compose

Server:
 Containers: 3
  Running: 3
  Paused: 0
  Stopped: 0
 Images: 3
 Server Version: 24.0.7
 Storage Driver: overlay2
  Backing Filesystem: extfs
  Supports d_type: true
  Using metacopy: false
  Native Overlay Diff: true
  userxattr: false
 Logging Driver: json-file
 Cgroup Driver: systemd
 Cgroup Version: 2
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog
 Swarm: inactive
 Runtimes: io.containerd.runc.v2 runc
 Default Runtime: runc
 Init Binary: docker-init
 containerd version: d8f198a4ed8892c764191ef7b3b06d8a2eeb5c7f
 runc version: v1.1.10-0-g18a0cb0
 init version: de40ad0
 Security Options:
  seccomp
   Profile: builtin
  cgroupns
 Kernel Version: 6.2.16-19-pve
 Operating System: Debian GNU/Linux 11 (bullseye)
 OSType: linux
 Architecture: x86_64
 CPUs: 2
 Total Memory: 2GiB
 Name: Prio2-0-srvc
 ID: df970e67-8b11-4c5b-bbd6-c83da7d88155
 Docker Root Dir: /var/lib/docker
 Debug Mode: false
 Experimental: false
 Insecure Registries:
  127.0.0.0/8
 Live Restore Enabled: false

Your question: - When you say cluster wide ZFS storage, what do you mean - do you mean a ZFS backed NFS/CIFS/SMB share, or some other out of band method (Multi-server DAS etc)?

--> My ProxMox cluster consists of 3 Mac Mini nodes, each containing 2 SSDs. SSD/sda and SSD/sdb are the exact same brand/size on every Mac Mini. So the ZFS storage consists of locally attached SSDs (sdb). No CIFS is used for this part.

The /dev/sdb is formatted with ZFS on each ProxMox node from within the ProxMox UI (standard procedure). Later a ZFS pool was created, and LXC replication was put in place. Hereafter the details of sdb on PVE node 'srvProxMox-2':

NAME   FSTYPE         FSVER  LABEL      UUID 
sdb
β”œβ”€sdb1 zfs_member 5000    cluZFS-1  5018820533082953278
└─sdb9
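For context, a rough CLI equivalent of what the UI procedure does (a sketch only; pool and storage name cluZFS-1 taken from the LXC config above, actual options may differ):

# create a single-disk pool on the second SSD
zpool create cluZFS-1 /dev/sdb
# register it as a Proxmox storage usable for container root disks
pvesm add zfspool cluZFS-1 --pool cluZFS-1 --content rootdir,images
# (LXC replication between the nodes is then configured separately, e.g. via the Replication tab in the UI)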

Your question: - What image tag exactly are you using? If possible, the full compose file would be nice as well

--> My docker compose file:

version: "3.8"
services:
  dashy:
    image: lissy93/dashy
    container_name: Dashy-it-4-Home
    network_mode: bridge
    ports:
      - 8085:80
    # Set any environmental variables
    environment:
      - NODE_ENV=production
    # Specify your user ID and group ID. You can find this by running `id -u` and `id -g`
      - UID=1000
      - GID=1000
    # Specify restart policy
    restart: unless-stopped
    # Pass in your config file below, by specifying the path on your host machine
    volumes:
      # ProxMox CLuster:
      # bind details for the Dashy config files
      - /i4hDocker/data/Dashy-it-4-Home/conf.yml:/app/public/conf.yml
      - /i4hDocker/data/Dashy-it-4-Home/it-4-Home.yml:/app/public/it-4-Home.yml
      - /i4hDocker/data/Dashy-it-4-Home/dom-4-Home.yml:/app/public/dom-4-Home.yml
      - /i4hDocker/data/Dashy-it-4-Home/Networking.yml:/app/public/Networking.yml
      - /i4hDocker/data/Dashy-it-4-Home/MusicAndVideos.yml:/app/public/MusicAndVideos.yml
      - /i4hDocker/data/Dashy-it-4-Home/Monitoring.yml:/app/public/Monitoring.yml
      - /i4hDocker/data/Dashy-it-4-Home/Documentation.yml:/app/public/Documentation.yml
      - /i4hDocker/data/Dashy-it-4-Home/localIcons:/app/public/item-icons/

    healthcheck:
      test: ['CMD', 'node', '/app/services/healthcheck']
      interval: 1m30s
      timeout: 10s
      retries: 3
      start_period: 40s

Let me know if more is needed,

Luc

LuxBibi avatar Jan 09 '24 11:01 LuxBibi

I think that would be great. If you can upload that file I can take a look - if possible, try it with the same config, completely regenerating it from scratch, so that the only thing that changes (hopefully) is the backing storage. A docker info from the LXC each time would also be great.
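A sketch of how those artifacts could be captured inside the LXC (container name and output filenames are just examples taken from this thread):

# capture docker info alongside each export
docker info > docker-info-local-lvm.txt
# export the container's filesystem as a tar archive
docker export Dashy-it-4-Home -o 1-Dashy-local-lvm-afterDashyRebuild.tar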

TheRealGramdalf avatar Jan 09 '24 18:01 TheRealGramdalf

Hi,

Just to be aligned .. I'll do:

  • docker info
  • docker export of working Dashy
  • move storage from local to ZFS
  • rebuild with the exact same *.yml files
  • docker info
  • docker export of corrupted Dashy

Thanks in advance, Luc

LuxBibi avatar Jan 09 '24 19:01 LuxBibi

Nearly:

  • Rebuild Container
  • docker info
  • docker export of working Dashy
  • move storage from local to ZFS
  • rebuild with the exact same *.yml files
  • docker info
  • docker export of corrupted Dashy

Where a rebuild is:

  • Remove container (docker rm [containername]/remove via webui (I forget how portainer does it))
  • restart container (docker compose up -d / start stack via webui)

Thanks for clarifying!

TheRealGramdalf avatar Jan 09 '24 20:01 TheRealGramdalf

Hi,

Here, as asked, are the different export files. Hope this helps ;-)

One piece of information which might be important: after removing the Dashy container within Portainer (with the option "Automatically remove non-persistent volumes"), I recreated my | it-4-Home | Dashy container using the docker compose file (Portainer stack) I've sent you previously. Note that the Dashy *.yml config files (bind-mounted in the compose file) are stored for persistence on the docker host "Prio2-0-srvc", so these files have been unchanged for a few days ;-).

The point I wanted to mention is that after recreating the Dashy container from the Portainer stack while on local-lvm storage, Dashy only displays the same Home page for all the different "Pages" (*.yml config files) after this docker compose up! A rebuild fixes this, so that is what I did (rebuild Dashy from the Dashy interface).

To give you the opportunity to see whether this is normal behaviour, I updated the prior sequence with an additional docker export, so you have the whole picture. (Maybe this is by design.)

So you'll find here attached;

  1. Remove existing container while preserving all the *.yml files
  2. Rebuild Container (with docker up or portainer stack)
  3. docker info
Client: Docker Engine - Community
 Version:    24.0.7
 Context:    default
 Debug Mode: false
 Plugins:
  buildx: Docker Buildx (Docker Inc.)
    Version:  v0.11.2
    Path:     /usr/libexec/docker/cli-plugins/docker-buildx
  compose: Docker Compose (Docker Inc.)
    Version:  v2.21.0
    Path:     /usr/libexec/docker/cli-plugins/docker-compose

Server:
 Containers: 3
  Running: 3
  Paused: 0
  Stopped: 0
 Images: 3
 Server Version: 24.0.7
 Storage Driver: overlay2
  Backing Filesystem: extfs
  Supports d_type: true
  Using metacopy: false
  Native Overlay Diff: true
  userxattr: false
 Logging Driver: json-file
 Cgroup Driver: systemd
 Cgroup Version: 2
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog
 Swarm: inactive
 Runtimes: io.containerd.runc.v2 runc
 Default Runtime: runc
 Init Binary: docker-init
 containerd version: d8f198a4ed8892c764191ef7b3b06d8a2eeb5c7f
 runc version: v1.1.10-0-g18a0cb0
 init version: de40ad0
 Security Options:
  seccomp
   Profile: builtin
  cgroupns
 Kernel Version: 6.2.16-19-pve
 Operating System: Debian GNU/Linux 11 (bullseye)
 OSType: linux
 Architecture: x86_64
 CPUs: 2
 Total Memory: 2GiB
 Name: Prio2-0-srvc
 ID: df970e67-8b11-4c5b-bbd6-c83da7d88155
 Docker Root Dir: /var/lib/docker
 Debug Mode: false
 Experimental: false
 Insecure Registries:
  127.0.0.0/8
 Live Restore Enabled: false
  4. docker export of badly behaving Dashy --> attached file = 0-Dashy-local-lvm-afterRebuild.tar
  5. Dashy-Pages-Rebuild (Rebuild within Dashy UI)
  6. docker info
Client: Docker Engine - Community
 Version:    24.0.7
 Context:    default
 Debug Mode: false
 Plugins:
  buildx: Docker Buildx (Docker Inc.)
    Version:  v0.11.2
    Path:     /usr/libexec/docker/cli-plugins/docker-buildx
  compose: Docker Compose (Docker Inc.)
    Version:  v2.21.0
    Path:     /usr/libexec/docker/cli-plugins/docker-compose

Server:
 Containers: 3
  Running: 3
  Paused: 0
  Stopped: 0
 Images: 3
 Server Version: 24.0.7
 Storage Driver: overlay2
  Backing Filesystem: extfs
  Supports d_type: true
  Using metacopy: false
  Native Overlay Diff: true
  userxattr: false
 Logging Driver: json-file
 Cgroup Driver: systemd
 Cgroup Version: 2
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog
 Swarm: inactive
 Runtimes: io.containerd.runc.v2 runc
 Default Runtime: runc
 Init Binary: docker-init
 containerd version: d8f198a4ed8892c764191ef7b3b06d8a2eeb5c7f
 runc version: v1.1.10-0-g18a0cb0
 init version: de40ad0
 Security Options:
  seccomp
   Profile: builtin
  cgroupns
 Kernel Version: 6.2.16-19-pve
 Operating System: Debian GNU/Linux 11 (bullseye)
 OSType: linux
 Architecture: x86_64
 CPUs: 2
 Total Memory: 2GiB
 Name: Prio2-0-srvc
 ID: df970e67-8b11-4c5b-bbd6-c83da7d88155
 Docker Root Dir: /var/lib/docker
 Debug Mode: false
 Experimental: false
 Insecure Registries:
  127.0.0.0/8
 Live Restore Enabled: false
  7. docker export of working Dashy (local-lvm) --> attached file = 1-Dashy-local-lvm-afterDashyRebuild.tar
  8. Stop LXC / move storage from local to ZFS / restart / test Dashy => everything was working
  9. Dashy-Pages-Rebuild (Rebuild within Dashy UI) with the exact same *.yml files --> I confirm that the exact same error "Build operation failed" is displayed, rendering the Dashy container useless
  10. docker info
Client: Docker Engine - Community
 Version:    24.0.7
 Context:    default
 Debug Mode: false
 Plugins:
  buildx: Docker Buildx (Docker Inc.)
    Version:  v0.11.2
    Path:     /usr/libexec/docker/cli-plugins/docker-buildx
  compose: Docker Compose (Docker Inc.)
    Version:  v2.21.0
    Path:     /usr/libexec/docker/cli-plugins/docker-compose

Server:
 Containers: 3
  Running: 3
  Paused: 0
  Stopped: 0
 Images: 3
 Server Version: 24.0.7
 Storage Driver: overlay2
  Backing Filesystem: zfs
  Supports d_type: true
  Using metacopy: false
  Native Overlay Diff: true
  userxattr: false
 Logging Driver: json-file
 Cgroup Driver: systemd
 Cgroup Version: 2
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog
 Swarm: inactive
 Runtimes: io.containerd.runc.v2 runc
 Default Runtime: runc
 Init Binary: docker-init
 containerd version: d8f198a4ed8892c764191ef7b3b06d8a2eeb5c7f
 runc version: v1.1.10-0-g18a0cb0
 init version: de40ad0
 Security Options:
  seccomp
   Profile: builtin
  cgroupns
 Kernel Version: 6.2.16-19-pve
 Operating System: Debian GNU/Linux 11 (bullseye)
 OSType: linux
 Architecture: x86_64
 CPUs: 2
 Total Memory: 2GiB
 Name: Prio2-0-srvc
 ID: df970e67-8b11-4c5b-bbd6-c83da7d88155
 Docker Root Dir: /var/lib/docker
 Debug Mode: false
 Experimental: false
 Insecure Registries:
  127.0.0.0/8
 Live Restore Enabled: false
  11. docker export of corrupted Dashy --> attached file = 2-Dashy-ZFS-afterDashyRebuild.tar

I did not pull a new image from Docker Hub while doing all these container rebuilds. Same Dashy image for all containers!
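One quick way to double-check that the same image was indeed used for every recreation (a sketch):

# the image ID / digest should be identical across all three tests
docker images --digests lissy93/dashy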

Hope this helps.

BTW: I will be on holiday starting this Friday until the 21st of January, so you can organize yourself ;-)

Luc

LuxBibi avatar Jan 09 '24 22:01 LuxBibi

The files are too huge to upload! The smallest file is 491 MB, the largest 509 MB. The max upload on GitHub is 25 MB.

Let me know if we can find another service to transfer these files, like WeTransfer. But then I need an e-mail address where I can send the link. :-(

Luc

LuxBibi avatar Jan 09 '24 23:01 LuxBibi

Hi,

I am contacting you again, just to see if you have an idea of how to send you the files?

I will be available till Friday evening ;-) ... and will be back on the 22nd of January ..

I adapt to your needs ..

Thanks for the support so far.

Luc

LuxBibi avatar Jan 11 '24 17:01 LuxBibi

Apologies, I was busy the past few days. Feel free to upload them to something like pixeldrain, and I'll take a look from there.

Also about the rebuild thing - this is a bug introduced in 2.1.1 that causes the application to not be rebuilt on startup (which it did in 2.1.0). See https://github.com/Lissy93/dashy/issues/1290#issuecomment-1884249018 for a full explanation.

TheRealGramdalf avatar Jan 11 '24 20:01 TheRealGramdalf

Hi ...

No problem at all ;-) .... I am happy that you take the time to look at this problem, especially as I do not know whether it is related to Dashy or even a third-party coincidence .. Hope you'll find it out ;-)

Amazing tool .. Better than WeTransfer, as no e-mail needed ;-) .. Perfect ..

So here we are .. Here is the link: 2024-01-11-dashyDockerExports pointing to all 3 files as described in my previous e-mail ...

Thanks so far for your help ;-)

Luc

LuxBibi avatar Jan 11 '24 20:01 LuxBibi

This may or may not work, but have you tried creating another ZFS instance to test to see if it's occurring there as well? I have had some whacky issues with permissions on my ZFS proxmox cluster before

JPDucky avatar Jan 13 '24 18:01 JPDucky

Hi, thanks for your input.

I have some doubts about this, as I have +/- 19 containers running in my cluster. All are running smoothly! Dashy also runs smoothly on each node of the cluster as I move the LXC (= docker host) around to the other nodes. The only difference with Dashy is that it can "rebuild" some internal config, and it is exactly this process, while being on ZFS storage, that corrupts Dashy. Other containers, like Wiki.js / ntopng / grafana / ..., all work correctly.

Depending on the news when I return from holiday on 22 January, I'll give it a try ;-). I'll let you know.

Thanks for your input. Luc

LuxBibi avatar Jan 14 '24 08:01 LuxBibi

Hi all,

Just wanted to let you know that I am back from holiday .. ;-) Ready to help if needed. No hurry, as I know you surely have other topics to handle as well. I just wanted to let you know that, as of now, I am able to give any further details you may need ..

Your speed will be mine.

Thanks for your help so far.

Luc

LuxBibi avatar Jan 22 '24 17:01 LuxBibi

Alright, so I've finally got a chance to look at this a little bit.

I used Meld to take a look at the differences between the three; here are the main highlights:

  • The first export (0-local-lvm) is missing /app/dist/*.yml files (your dashboard configuration)
  • The second export (as expected) seems to have everything in order
  • The last export (2-zfs) is completely missing /app/dist/* - the directory is completely empty
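For anyone who wants to reproduce this comparison without Meld, the /app/dist contents of two exports can be diffed straight from the tar listings (a sketch, using the export filenames from above):

tar -tf 0-Dashy-local-lvm-afterRebuild.tar | grep 'app/dist' | sort > 0-dist.txt
tar -tf 2-Dashy-ZFS-afterDashyRebuild.tar  | grep 'app/dist' | sort > 2-dist.txt
diff 0-dist.txt 2-dist.txt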

The implications:

0-local-lvm
  • The configuration files are inside the container, and thus should be accessible to the rebuilder. This is connected to the issue regarding 2.1.1/2.1.0 - in 2.1.1, the configuration files aren't copied from /app/public to /app/dist. This is also the case with my own container - /app/dist/conf.yml is missing after I recreate the container.
1-local-lvm
  • Working as expected, mostly used as a reference point
2-zfs
  • There is nothing in /app/dist, which causes lots of issues. I'm uncertain exactly what the cause is, but it is distinctly different than 0-local-lvm - that was simply an issue with configuration files, but in ZFS the directory is empty. My best guess would be that there is either a bind mount or volume covering /app/dist, but that shouldn't be the case according to the compose file.
  • My next best guess is that when a rebuild is triggered, yarn removes everything in /app/dist as the first step (I confirmed this to be the case) - but something is preventing the rebuild, so nothing new is written.
What to do

Unfortunately it's quite hard to diagnose things remotely like this, but my current leading theory is that it's to do with the docker image being stored originally on LVM, but then migrated to ZFS - try deleting the docker image from your LXC completely (I believe you can do it through the portainer webUI) so it gets re-downloaded when you restart the container.
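Roughly, from inside the LXC that would be (a sketch; Portainer's re-pull/redeploy does much the same):

# run from the directory containing the compose file
docker compose down
# delete the cached image so it gets pulled fresh on the next start
docker image rm lissy93/dashy
docker compose up -d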

TheRealGramdalf avatar Jan 24 '24 22:01 TheRealGramdalf

Hi,

Thanks for this detailed feedback. I know that debugging this issue is quite complex if you do not have the necessary infrastructure; sorry for that. Thanks for all your help in finding what is not behaving as expected ;-) as this will help us all.

Following your feedback, I deleted all the Dashy-related items in Docker (via Portainer). This includes:

  • Image
  • Docker container

The docker compose file and the Dashy *.yml files have not been changed for +/- a month. The LXC container has also been running on ZFS storage for +/- a month.

I re-pulled the image from Docker Hub (latest, which is TAG: 2.1.1) via the docker compose file and ended up with the Dashy container running (Healthy status displayed in Portainer), but as usual with the same menu items displayed in all Dashy menus. (This behaviour has existed since the very beginning; no difference between lvm, ZFS or ceph. Not really annoying, and OK for me, as I know I just need to rebuild to have the entries show up.)

I then did a rebuild, to have these different Pages/Menus recreated, while remaining on ZFS.

The same error occurred as at the very beginning.

image

BTW: I created a ceph storage on the same ProxMox cluster, and Dashy behaves exactly the same way on ceph storage as on ZFS storage. :-(

I remain available for any information you may need.

P.S.: The image I used while creating this bug report (TAG: 2.1.1) is the same as the image I just pulled (TAG: 2.1.1). Did I miss some "newer", not yet published image you want me to test?

LuxBibi avatar Jan 25 '24 16:01 LuxBibi

Hi @TheRealGramdalf , Hi all,

For every new release you publish, I test the new version on my ProxMox cluster to see whether the error I encounter has disappeared.

  • Config related to ProxMox is the same, apart from the "normal" system updates and your latest Dashy V-3.0.0 release.

None of the prior releases changed anything about my problem. But this release does react differently.

I hope that this may help you to pinpoint what the problem is. It is also intended to help other users now facing this new situation!

I post my log files hereafter.


Details:

I was running Dashy 2.1.2 on my ZFS filesystem prior to the update. Just as a reminder: Dashy on ProxMox with a ZFS filesystem ends up in an error while regenerating its pages/engine whenever any modification is made to any *.yml (config) file(s).

New behavior (same LXC container as with the prior release):

  • Dashy container is running on ZFS
  • I regenerated the Dashy docker container by pulling the Ver 3.0.0 image (Stack update and repull image)
    • same yml-config files as with prior Ver 2.1.2
    • adapted the volume references in the docker compose file to point to the new directory indicated in your documentation (/app/user-data/conf.yml)
    • doubled the LXC memory from 2048 to 4096 MB

Dashy container does not start anymore. (Constantly rebooting due to Restart Policy: "Unless Stopped")
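For completeness, the restart loop can be watched from the Docker host like this (a sketch; container name from the compose file above):

# STATUS shows "Restarting (1) ..." while the build keeps failing
docker ps -a --filter name=Dashy-it-4-Home
# follow the container log across restarts
docker logs -f Dashy-it-4-Home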

LOG results while being on ZFS [Dashy crashing]

$ node server
Checking config file against schema...
✔️ Config file is valid, no issues found
SSL Not Enabled: Public key not present
[Dashy ASCII art banner]

Welcome to Dashy! 🚀 Your new dashboard is now up and running with Docker

Using Dashy V-3.0.0. Update Check Complete
✅ Dashy is Up-to-Date

• Building for production...
WARN  A new version of sass-loader is available. Please upgrade for best experience.
ERROR  Error: EINVAL: invalid argument, rmdir '/app/dist/css'
Error: EINVAL: invalid argument, rmdir '/app/dist/css'
error Command failed with exit code 1.
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
ERROR: "build" exited with 1.
error Command failed with exit code 1.
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
yarn run v1.22.19
$ NODE_OPTIONS=--openssl-legacy-provider npm-run-all --parallel build start
$ NODE_OPTIONS=--openssl-legacy-provider vue-cli-service build
$ node server
Checking config file against schema...
✔️ Config file is valid, no issues found
SSL Not Enabled: Public key not present
[Dashy ASCII art banner]

Welcome to Dashy! 🚀 Your new dashboard is now up and running with Docker

Using Dashy V-3.0.0. Update Check Complete
✅ Dashy is Up-to-Date

• Building for production...
WARN  A new version of sass-loader is available. Please upgrade for best experience.
ERROR  Error: EINVAL: invalid argument, rmdir '/app/dist/css'
Error: EINVAL: invalid argument, rmdir '/app/dist/css'
error Command failed with exit code 1.
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
ERROR: "build" exited with 1.
error Command failed with exit code 1.
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
yarn run v1.22.19
$ NODE_OPTIONS=--openssl-legacy-provider npm-run-all --parallel build start
$ NODE_OPTIONS=--openssl-legacy-provider vue-cli-service build
$ node server
Checking config file against schema...
✔️ Config file is valid, no issues found
SSL Not Enabled: Public key not present
[Dashy ASCII art banner]

After stopping the LXC container, and moving the storage to local-lvm, Dashy was able to start, and generate all the pages of my configuration, and it is working like a charm.

Log file running LXC on local-lvm storage [ALL OK]

Using Dashy V-3.0.0. Update Check Complete
✅ Dashy is Up-to-Date
yarn run v1.22.19
$ NODE_OPTIONS=--openssl-legacy-provider npm-run-all --parallel build start
$ NODE_OPTIONS=--openssl-legacy-provider vue-cli-service build
$ node server
Checking config file against schema...
✔️ Config file is valid, no issues found
SSL Not Enabled: Public key not present
[Dashy ASCII art banner]

Welcome to Dashy! 🚀 Your new dashboard is now up and running with Docker

Using Dashy V-3.0.0. Update Check Complete
✅ Dashy is Up-to-Date

  • Building for production... WARN A new version of sass-loader is available. Please upgrade for best experience. Error: ENOENT: no such file or directory, stat '/app/dist/index.html' DONE Compiled successfully in 408539ms10:33:05 PM File Size Gzipped dist/js/chunk-vendors.92e65062.js 6358.33 KiB 2293.83 KiB dist/js/dashy.43c6539a.js 768.52 KiB 231.87 KiB dist/js/chunk-4cfc5864.4aba8cfa.js 250.36 KiB 74.98 KiB dist/js/chunk-50f31ec3.96f6cd6b.js 79.45 KiB 19.64 KiB dist/precache-manifest.57f4a61195ab4a7 19.40 KiB 4.47 KiB 836b384a95ae3fa9e.js dist/js/chunk-180be55e.f947613e.js 15.94 KiB 5.30 KiB dist/js/chunk-468d3a74.a5fc4bfa.js 15.58 KiB 4.49 KiB dist/js/chunk-03c5a0ba.399ef71d.js 15.38 KiB 5.04 KiB dist/js/chunk-16e26d5d.ee5b2c63.js 14.94 KiB 4.49 KiB dist/js/chunk-2642eaf9.4eed1878.js 14.39 KiB 4.88 KiB dist/js/chunk-0367deae.f8ce8511.js 13.15 KiB 4.52 KiB dist/js/chunk-7bba3126.e4385130.js 12.26 KiB 4.26 KiB dist/js/chunk-08ca355a.26f829d9.js 11.12 KiB 4.12 KiB dist/js/chunk-38169201.f89dfac1.js 11.10 KiB 4.44 KiB dist/js/chunk-460e6092.c50e1a0e.js 11.02 KiB 4.07 KiB dist/js/chunk-445cc501.4e92c9f7.js 10.89 KiB 4.09 KiB dist/js/chunk-187213fc.f70a23c4.js 10.80 KiB 4.03 KiB dist/js/chunk-edbdb67c.2a7383c8.js 9.28 KiB 3.32 KiB dist/js/chunk-2925d418.d00939dd.js 9.04 KiB 3.08 KiB dist/js/chunk-93c6be8c.e412d699.js 8.25 KiB 3.10 KiB dist/js/chunk-c8bd4cd0.da35f1b3.js 8.07 KiB 2.99 KiB dist/js/chunk-0248a1e9.d8ae36a6.js 8.03 KiB 2.86 KiB dist/js/chunk-7ba8e45c.31261b47.js 7.82 KiB 2.96 KiB dist/js/chunk-49f2d909.cbc33fa0.js 7.73 KiB 2.87 KiB dist/js/chunk-7e15df28.012f53d9.js 7.61 KiB 2.66 KiB dist/js/chunk-070d32ac.458ca0d0.js 7.35 KiB 2.87 KiB dist/js/chunk-0894290e.7f4f8889.js 7.11 KiB 2.69 KiB dist/js/chunk-e77c83e6.48594eb8.js 7.05 KiB 2.65 KiB dist/js/chunk-92c623f0.7b960119.js 7.02 KiB 2.81 KiB dist/js/chunk-fc7a7722.2250ffe1.js 6.96 KiB 2.59 KiB dist/js/chunk-0c7116ec.15f56f4d.js 6.88 KiB 2.76 KiB dist/js/chunk-29548417.1b7f8472.js 6.56 KiB 2.59 KiB dist/js/chunk-26dbf0a4.f1a7a0e0.js 6.25 KiB 2.52 KiB dist/js/chunk-1b35c628.bfc608ce.js 6.20 KiB 2.50 KiB dist/js/chunk-88331f84.e7726ecb.js 6.16 KiB 2.53 KiB dist/js/chunk-8db027b8.4a5649f0.js 6.15 KiB 2.44 KiB dist/js/chunk-15b37c0a.fecf5ad8.js 6.10 KiB 2.45 KiB dist/js/chunk-b7e4a5ce.c1e5d13d.js 6.09 KiB 2.57 KiB dist/js/chunk-f05c978e.1921df2e.js 6.07 KiB 2.40 KiB dist/js/chunk-7abb8001.1fb01462.js 6.07 KiB 2.26 KiB dist/js/chunk-32eb6af1.d683115c.js 6.02 KiB 2.55 KiB dist/js/chunk-04659cb4.924484e8.js 5.99 KiB 2.39 KiB dist/js/chunk-44cb61f1.0013706e.js 5.97 KiB 2.08 KiB dist/js/chunk-4073bae0.51329ecf.js 5.93 KiB 1.91 KiB dist/js/chunk-11e20f6f.68f8e974.js 5.88 KiB 2.19 KiB dist/js/chunk-08fae180.5daf04b7.js 5.85 KiB 2.41 KiB dist/js/chunk-4ab61964.aa761fe6.js 5.83 KiB 2.44 KiB dist/js/chunk-b52460ac.ae1e2d77.js 5.74 KiB 2.37 KiB dist/js/chunk-cd40f4ae.ed73fab0.js 5.69 KiB 2.34 KiB dist/js/chunk-ecec4fc4.dedf003b.js 5.68 KiB 2.44 KiB dist/js/chunk-4ef6dcf5.778ab1cf.js 5.67 KiB 2.36 KiB dist/js/chunk-7c4d77dc.c25c13cb.js 5.40 KiB 2.26 KiB dist/js/chunk-bd9012c4.e4b54229.js 5.38 KiB 2.32 KiB dist/js/chunk-21680640.a656bd80.js 5.35 KiB 2.22 KiB dist/js/chunk-043d9c91.c014b344.js 5.29 KiB 2.17 KiB dist/js/chunk-f539423c.7c46861d.js 5.26 KiB 2.15 KiB dist/js/chunk-6b5de1e1.c41bee96.js 5.24 KiB 2.10 KiB dist/js/chunk-4f2c58c5.f336971c.js 5.03 KiB 2.00 KiB dist/js/chunk-736b2ef0.1b65ea1f.js 5.03 KiB 2.06 KiB dist/js/chunk-f38e0ad2.7a41f73d.js 5.01 KiB 2.13 KiB dist/js/chunk-3a3d0cd8.81d833a5.js 4.99 KiB 2.15 KiB 
dist/js/chunk-674ac328.b3d128f3.js 4.99 KiB 2.22 KiB dist/js/chunk-cee89fa8.0d3bc86e.js 4.92 KiB 2.12 KiB dist/js/chunk-677c8830.d65a072e.js 4.85 KiB 2.06 KiB dist/js/chunk-0633ac20.674a3d69.js 4.81 KiB 2.04 KiB dist/js/chunk-6a170920.666bec94.js 4.69 KiB 2.06 KiB dist/js/chunk-c02e690a.ea23b03d.js 4.51 KiB 1.92 KiB dist/js/chunk-b25c821e.8af52b87.js 4.33 KiB 1.91 KiB dist/js/chunk-781da5fb.be037a62.js 4.32 KiB 1.45 KiB dist/js/chunk-b54d81ae.b4cb65ce.js 4.31 KiB 1.87 KiB dist/js/chunk-aa9cebcc.2f578d67.js 4.10 KiB 1.60 KiB dist/js/chunk-665a1900.a7d21b0f.js 3.69 KiB 1.69 KiB dist/js/chunk-14192a80.056262ad.js 3.64 KiB 1.51 KiB dist/js/chunk-1e169674.7a54b293.js 2.88 KiB 1.36 KiB dist/js/chunk-6ab1f28d.9a3be93f.js 2.83 KiB 1.33 KiB dist/js/chunk-72e3b16c.0f88bca5.js 2.80 KiB 1.30 KiB dist/js/chunk-75cc9f4d.a5021e27.js 2.74 KiB 1.30 KiB dist/js/chunk-0387fd77.c052ffac.js 2.74 KiB 1.27 KiB dist/js/chunk-0044633e.3dc0bad5.js 2.40 KiB 1.11 KiB dist/js/chunk-284f6914.401e1214.js 2.32 KiB 1.10 KiB dist/js/chunk-73f090a0.0e3ec0c9.js 2.31 KiB 1.12 KiB dist/js/chunk-0c51289a.eac23d06.js 2.27 KiB 1.06 KiB dist/js/chunk-7132ce43.f79ba314.js 2.22 KiB 1.08 KiB dist/js/chunk-2d225b78.80adc7b1.js 2.05 KiB 1.09 KiB dist/js/chunk-2ab49ff8.5963cee6.js 1.91 KiB 0.99 KiB dist/js/chunk-c0f28fc6.2dbaa6ba.js 1.91 KiB 0.94 KiB dist/js/chunk-d42744f4.6acd67c8.js 1.90 KiB 0.98 KiB dist/js/chunk-7795c4fe.770bf2c1.js 1.04 KiB 0.57 KiB dist/service-worker.js 1.04 KiB 0.61 KiB dist/js/chunk-3767f013.3b314b6a.js 0.75 KiB 0.45 KiB dist/css/dashy.899627eb.css 268.45 KiB 32.60 KiB dist/css/chunk-fc7a7722.f1790b34.css 11.54 KiB 1.75 KiB dist/css/chunk-03c5a0ba.fdf5ccee.css 9.49 KiB 1.81 KiB dist/css/chunk-0248a1e9.2af758e1.css 7.32 KiB 1.32 KiB dist/css/chunk-4073bae0.262be67e.css 5.86 KiB 1.00 KiB dist/css/chunk-0c7116ec.8d663b8e.css 3.98 KiB 0.96 KiB dist/css/chunk-29548417.1e586604.css 3.78 KiB 0.69 KiB dist/css/chunk-7795c4fe.8e5b7c8e.css 3.78 KiB 0.92 KiB dist/css/chunk-2642eaf9.103376cf.css 3.54 KiB 0.86 KiB dist/css/chunk-93c6be8c.b621be85.css 3.53 KiB 0.88 KiB dist/css/chunk-2925d418.2f4219ad.css 3.48 KiB 0.77 KiB dist/css/chunk-26dbf0a4.3f521e8a.css 3.31 KiB 0.86 KiB dist/css/chunk-c8bd4cd0.25b1ca48.css 3.28 KiB 0.83 KiB dist/css/chunk-vendors.d8067ad8.css 2.74 KiB 0.83 KiB dist/css/chunk-f05c978e.04b75e3f.css 2.67 KiB 0.59 KiB dist/css/chunk-7e15df28.208bbeec.css 2.51 KiB 0.64 KiB dist/css/chunk-0367deae.0f98d711.css 2.48 KiB 0.67 KiB dist/css/chunk-4cfc5864.9357c852.css 2.48 KiB 0.57 KiB dist/css/chunk-14192a80.31a5db2c.css 2.38 KiB 0.58 KiB dist/css/chunk-49f2d909.26592934.css 2.38 KiB 0.57 KiB dist/css/chunk-7ba8e45c.17242d8b.css 2.30 KiB 0.55 KiB dist/css/chunk-7bba3126.b97a92c1.css 2.17 KiB 0.56 KiB dist/css/chunk-7c4d77dc.8c1925ff.css 2.06 KiB 0.49 KiB dist/css/chunk-781da5fb.38b3bad4.css 2.03 KiB 0.56 KiB dist/css/chunk-8db027b8.377fb75a.css 2.01 KiB 0.57 KiB dist/css/chunk-edbdb67c.0de3bd5e.css 1.94 KiB 0.56 KiB dist/css/chunk-e77c83e6.729d6dc8.css 1.93 KiB 0.55 KiB dist/loading-screen.css 1.93 KiB 0.65 KiB dist/css/chunk-04659cb4.f809b0eb.css 1.86 KiB 0.50 KiB dist/css/chunk-08ca355a.0e2f8538.css 1.85 KiB 0.55 KiB dist/css/chunk-460e6092.0bcf49d9.css 1.85 KiB 0.56 KiB dist/css/chunk-88331f84.b825db4a.css 1.81 KiB 0.55 KiB dist/css/chunk-070d32ac.3ca152a5.css 1.80 KiB 0.51 KiB dist/css/chunk-445cc501.d9af4531.css 1.79 KiB 0.53 KiB dist/css/chunk-187213fc.851bbb61.css 1.78 KiB 0.52 KiB dist/css/chunk-180be55e.2679cb7e.css 1.77 KiB 0.53 KiB dist/css/chunk-6ab1f28d.dcd44809.css 1.65 KiB 0.42 KiB 
dist/css/chunk-21680640.f72d1c0d.css 1.64 KiB 0.51 KiB dist/css/chunk-32eb6af1.b73f2acc.css 1.59 KiB 0.48 KiB dist/css/chunk-4ab61964.950bd772.css 1.57 KiB 0.49 KiB dist/css/chunk-92c623f0.7601575f.css 1.55 KiB 0.47 KiB dist/css/chunk-b52460ac.d91d8d0b.css 1.49 KiB 0.47 KiB dist/css/chunk-4f2c58c5.e91567b0.css 1.38 KiB 0.43 KiB dist/css/chunk-aa9cebcc.43dd3768.css 1.36 KiB 0.41 KiB dist/css/chunk-7abb8001.d5057fa6.css 1.30 KiB 0.44 KiB dist/css/chunk-468d3a74.e7e4907a.css 1.26 KiB 0.42 KiB dist/css/chunk-38169201.87f602e2.css 1.16 KiB 0.46 KiB dist/css/chunk-15b37c0a.ebae7724.css 1.15 KiB 0.36 KiB dist/css/chunk-16e26d5d.97cc876a.css 1.14 KiB 0.41 KiB dist/css/chunk-ecec4fc4.7db7f641.css 1.12 KiB 0.36 KiB dist/css/chunk-0633ac20.857ad57c.css 1.04 KiB 0.34 KiB dist/css/chunk-bd9012c4.bbf2305d.css 1.04 KiB 0.34 KiB dist/css/chunk-043d9c91.9438acdb.css 0.93 KiB 0.37 KiB dist/css/chunk-1b35c628.f7e5ac71.css 0.90 KiB 0.31 KiB dist/css/chunk-4ef6dcf5.f9dd4bd8.css 0.88 KiB 0.30 KiB dist/css/chunk-f539423c.4b2b2c2a.css 0.88 KiB 0.33 KiB dist/css/chunk-3a3d0cd8.5aaf7cba.css 0.88 KiB 0.33 KiB dist/css/chunk-11e20f6f.070a8cfa.css 0.87 KiB 0.35 KiB dist/css/chunk-b7e4a5ce.df4ad987.css 0.86 KiB 0.32 KiB dist/css/chunk-f38e0ad2.1ea48a31.css 0.84 KiB 0.28 KiB dist/css/chunk-0894290e.edb63a9d.css 0.79 KiB 0.32 KiB dist/css/chunk-0387fd77.7aa83618.css 0.75 KiB 0.28 KiB dist/css/chunk-6b5de1e1.9eb66c9f.css 0.71 KiB 0.31 KiB dist/css/chunk-44cb61f1.025edb8a.css 0.69 KiB 0.31 KiB dist/css/chunk-736b2ef0.98820bcd.css 0.60 KiB 0.33 KiB dist/css/chunk-677c8830.df6a5b00.css 0.59 KiB 0.23 KiB dist/css/chunk-284f6914.58ade778.css 0.46 KiB 0.24 KiB dist/css/chunk-08fae180.9b2da476.css 0.46 KiB 0.22 KiB dist/css/chunk-0c51289a.d6684378.css 0.38 KiB 0.17 KiB dist/css/chunk-1e169674.98a4aa99.css 0.36 KiB 0.16 KiB dist/css/chunk-75cc9f4d.98a4aa99.css 0.36 KiB 0.16 KiB dist/css/chunk-d42744f4.f1c873fc.css 0.36 KiB 0.16 KiB dist/css/chunk-674ac328.d604576c.css 0.36 KiB 0.19 KiB dist/css/chunk-2ab49ff8.2ca1d591.css 0.36 KiB 0.16 KiB dist/css/chunk-6a170920.3839d02e.css 0.31 KiB 0.20 KiB dist/css/chunk-c0f28fc6.b67ed63a.css 0.22 KiB 0.16 KiB dist/css/chunk-3767f013.c9ab3ab3.css 0.11 KiB 0.10 KiB dist/css/chunk-cee89fa8.0918bc41.css 0.08 KiB 0.10 KiB dist/css/chunk-b54d81ae.61a081a9.css 0.08 KiB 0.10 KiB dist/css/chunk-665a1900.eeb31e13.css 0.07 KiB 0.09 KiB dist/css/chunk-b25c821e.f58ec558.css 0.07 KiB 0.09 KiB dist/css/chunk-c02e690a.ccf83212.css 0.06 KiB 0.08 KiB dist/css/chunk-cd40f4ae.90cf07cd.css 0.06 KiB 0.07 KiB dist/css/chunk-0044633e.0e433876.css 0.00 KiB 0.02 KiB dist/css/chunk-7132ce43.0e433876.css 0.00 KiB 0.02 KiB dist/css/chunk-72e3b16c.0e433876.css 0.00 KiB 0.02 KiB dist/css/chunk-73f090a0.0e433876.css 0.00 KiB 0.02 KiB Images and other types of assets omitted. DONE Build complete. The dist directory is ready to be deployed. INFO Check out deployment instructions at https://cli.vuejs.org/guide/deployment.html `

I now wanted to know whether switching back to ZFS would give a working Ver 3.0.0 Dashy. BUT after stopping the LXC container and moving the storage back to ZFS, this time Dashy behaves exactly as mentioned above (restarting continuously and producing the exact same log-file entries).

I hope this long comment will help you understand why your dashy-docker does "not like" the ZFS filesystem. I was not able to test CEPH, as I will be travelling during the next weeks ...

If you do need more details, just ping me ... I will answer as soon as I am back .. (+/- last week of May).

Thanks for your fabulous Dashy ;-) .... It is still the only solution matching my needs ;-)

Luc

LuxBibi avatar Apr 25 '24 23:04 LuxBibi