Gaoyang Zhang

Results: 26 comments of Gaoyang Zhang

Hi @raphCode, thanks for the reply.

> As far as I understand, you have a shell with, say, `NEW_ENV="hello"` set. You then attach to a pre-existing zellij session and expect...

Hi @a-kenji,

> How is your shell configured in tmux @blurgyy?

If you are asking about the shell prompt, it is managed by starship, which is configured to show my...

No, I had not configured my shell in tmux in the first screenshot. It was taken after I deleted my `~/.config/tmux/tmux.conf`, and I just checked that there is no...

Hi @a-kenji, thank you for your reply!

> Do you see the same behaviour when setting the shell as an interactive shell in tmux?

Yes, I think so:

![image](https://user-images.githubusercontent.com/44701880/177136676-d6890074-9bca-4cb4-aeb2-fbdc10d5ca41.png)

I...
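For reference, "setting the shell as an interactive shell in tmux" can be sketched in `~/.config/tmux/tmux.conf` roughly like this (the shell path is an assumption, not taken from the thread):

```
# Start new panes/windows with an interactive (but non-login) shell.
# `default-command` overrides tmux's default of spawning a login shell.
set -g default-command "/usr/bin/zsh -i"
```

Reload with `tmux source-file ~/.config/tmux/tmux.conf` or restart the server for the change to take effect.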

The problem hit me today, and I managed to work around it by adding a `GC_DONT_GC` environment variable to `hydra-evaluator.service` (something like https://gitlab.com/highsunz/flames/-/commit/9cd2a0a3f48abb0c5c57d3ee049f72e31cf1ec2e). This workaround comes from https://github.com/NixOS/nix/issues/4178#issuecomment-738886808.
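Outside of a NixOS module, the same workaround can be sketched as a plain systemd drop-in (the file name and the value `1` are assumptions; only the variable name comes from the comment above):

```ini
# /etc/systemd/system/hydra-evaluator.service.d/dont-gc.conf
# Sets GC_DONT_GC in the evaluator's environment to disable the
# garbage collector that triggers the crash discussed in the linked issue.
[Service]
Environment="GC_DONT_GC=1"
```

After creating the drop-in, run `systemctl daemon-reload` and restart `hydra-evaluator.service`.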

Hi @johguenther, thank you for your reply. I don't think I have meddled with `CMAKE_MODULE_PATH` before, and I just checked that it is neither in my shell environment nor in my...

Well, in my case, everything is found under `/lib64/...` by default, and neither setting `CMAKE_MODULE_PATH` to `/usr/lib/cmake` nor setting it to `/usr` made any difference. There is a workaround as...

For anybody observing empty output while using LLaVA 1.6 34b who has replaced the mmproj model (`sha256-83720bd8438ccdc910deba5efbdc3340820b29258d94a7a60d1addc9a1b5f095` in my `~/.ollama/models/blobs`) with , the reason is the default context window size...
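If the empty output is indeed caused by the default context window being too small, one fix is to raise it in an Ollama `Modelfile`. A minimal sketch, where the base model tag and the `num_ctx` value are assumptions:

```
# Hypothetical Modelfile: enlarge the context window so the image's
# projected tokens plus the prompt fit without being truncated.
FROM llava:34b-v1.6
PARAMETER num_ctx 8192
```

Build it with `ollama create my-llava -f Modelfile` and run the new model instead of the original tag.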

I've put together a LLaVA 1.6 34b model with Q8_0 quantization according to my previous comment at ; its projection model projects an input image into 2880 tokens.