Athma Lakshmi narayanan
@wzhiyuan2016 Have you tried using the vitg14 (big model) with 1536 output features?
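For reference, a minimal sketch of what I mean, assuming the DINOv2 torch.hub entry point; the dummy input and its size are just for illustration:

```
import torch

# Load the ViT-g/14 backbone from the DINOv2 hub repo (assumed entry point).
model = torch.hub.load("facebookresearch/dinov2", "dinov2_vitg14")
model.eval()

# Dummy image batch; real inputs should be resized/normalized the way DINOv2 expects.
dummy = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    features = model(dummy)

print(features.shape)  # expected: torch.Size([1, 1536])
```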
I could do something like this. Here checkpoint-9000 is a folder where the checkpoint is stored during training:
```
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("bigcode/santacoder")  # checkpoint-9000
```
Special...
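To spell that out, a minimal sketch of loading the fine-tuned weights back from that folder; the local path and the generation call are assumptions for illustration:

```
from transformers import AutoTokenizer, AutoModelForCausalLM

# Tokenizer from the base model; weights from the local training checkpoint folder.
# "./checkpoint-9000" is the folder written by the Trainer during fine-tuning (assumed path).
tokenizer = AutoTokenizer.from_pretrained("bigcode/santacoder")
model = AutoModelForCausalLM.from_pretrained(
    "./checkpoint-9000",
    trust_remote_code=True,  # may be needed depending on the transformers version
)

inputs = tokenizer("def fibonacci(n):", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0]))
```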
Hi, what is this option in Optionsconfig in chat: logits_all?
Same issue in the Python debugger. What's the fix?
Is there any resolution?
@georgeliu1998 Sorry, I tried that and it doesn't work. Still get a similar error:
> 2024-10-02 10:06:07.652 [info] Setting indexing intent to should-index
> 2024-10-02 10:06:07.655 [info] Creating merkle client....
@georgeliu1998 This is making Cursor unusable.
> (probably same as [#16546](https://github.com/vllm-project/vllm/issues/16546))

Logprobs and logits are actually different: logits are the unnormalized last-layer outputs, of shape tokens × vocabulary size. They are very important for uncertainty estimation.
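To make the distinction concrete, a small sketch; the tensor shapes and values are placeholders for illustration:

```
import torch
import torch.nn.functional as F

# Logits: unnormalized last-layer outputs, shape (num_tokens, vocab_size).
logits = torch.randn(5, 32000)  # placeholder values

# Logprobs are the log-softmax of the logits (normalized over the vocabulary).
logprobs = F.log_softmax(logits, dim=-1)

# One way logits feed uncertainty estimation: per-token predictive entropy.
probs = logprobs.exp()
entropy = -(probs * logprobs).sum(dim=-1)  # shape (num_tokens,)
print(entropy)
```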
Issue persists:

2024-10-02 10:06:07.652 [info] Setting indexing intent to should-index
2024-10-02 10:06:07.655 [info] Creating merkle client.
2024-10-02 10:06:07.655 [info] Done creating merkle client.
2024-10-02 10:06:07.656 [info] Doing a startup handshake....