calycekr

Results: 9 issues by calycekr

### Bug Report PR: Adding yarn support https://github.com/huggingface/text-generation-inference/pull/1099 adds 'yarn' as a rope_scaling type: `elif rope_scaling["type"] == "yarn":` However, 'yarn' is not listed among the possible values in the TGI...

I have no problem accessing environment variables from .env.local with **npm run dev** or **npm run preview**, but when I run with **node build** after **npm run build**, I can't...
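A likely explanation, assuming chat-ui is built with SvelteKit's adapter-node (which matches **npm run build** followed by **node build**): Vite only loads `.env.local` for `dev` and `preview`, so the built server sees nothing from that file and only reads whatever is already in `process.env`. The variables have to be exported before starting, or preloaded, e.g. `node -r dotenv/config build`. A minimal sketch of reading such a variable at runtime (the helper is made up; `MODELS` is used only as an example name):

```ts
// Sketch only, not chat-ui's actual code. With adapter-node, values from
// $env/dynamic/private are resolved from process.env when `node build` starts,
// so .env.local is ignored unless it was loaded into the environment first.
import { env } from '$env/dynamic/private';

export function requireEnv(name: string): string {
  const value = env[name];
  if (!value) {
    // Fail loudly instead of silently running without configuration.
    throw new Error(`${name} is not set; export it before running \`node build\`.`);
  }
  return value;
}

// Example usage (variable name is illustrative):
// const models = requireEnv('MODELS');
```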

I'd like to use it directly from the Node.js side; I'm wondering whether there are any plans for this.

https://github.com/huggingface/chat-ui/blob/6de97af071c69aa16e8f893adebb46f86bdeeaff/src/lib/components/chat/ChatMessage.svelte#L378-L384 Compared to the other components, `classNames` is the only difference here, and when rendered the icon appears faint in the browser. Is there a reason for this, or is it...

bug
good first issue
front

While the initial part of streaming from the LLM server is fine, the on-screen rendering speed slows down as time progresses, particularly towards the end of the streaming process. However,...

enhancement
front
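Regarding the streaming slowdown above, one common mitigation (not necessarily the root cause in chat-ui) is to buffer incoming tokens and update the displayed message at a bounded rate instead of once per token, so rendering work stops being triggered for every chunk. A hedged sketch; the function and store names are made up:

```ts
// Illustrative only: accumulate streamed tokens and flush them to the UI at
// most once per animation frame, rather than re-rendering on every token.
export function createTokenBatcher(render: (text: string) => void) {
  let buffer = '';
  let scheduled = false;

  return function push(token: string) {
    buffer += token;
    if (!scheduled) {
      scheduled = true;
      requestAnimationFrame(() => {
        scheduled = false;
        render(buffer); // one update with the accumulated text
      });
    }
  };
}

// Usage sketch:
// const push = createTokenBatcher((text) => (message.content = text));
// for await (const token of stream) push(token);
```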

### Consistency and Handling Issues with Environment Variable Values 1. Some values are 0 or 1, while others are false or true. 2. The values of environment variables are treated as...

enhancement
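On the consistency issue above, a small helper (hypothetical, not chat-ui's actual code) shows one way to accept both the 0/1 and false/true spellings while rejecting anything else:

```ts
// Hypothetical helper illustrating consistent parsing of boolean-like
// environment variable values such as "0", "1", "false", and "true".
export function envBool(value: string | undefined, fallback = false): boolean {
  if (value === undefined || value === '') return fallback;
  const normalized = value.trim().toLowerCase();
  if (normalized === '1' || normalized === 'true') return true;
  if (normalized === '0' || normalized === 'false') return false;
  throw new Error(`Unexpected boolean-like env value: "${value}"`);
}

// Usage: envBool(process.env.ENABLE_FOO) — ENABLE_FOO is a made-up name.
```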

Wouldn't it be better to prevent users from going to other conversations or the main screen while the answer is being generated? If you go to another screen while an...

enhancement
front
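For the navigation question above, SvelteKit exposes `beforeNavigate`, which can cancel a pending navigation. A minimal sketch; the `generating` store and the guard function are assumptions for illustration, not chat-ui's actual state:

```ts
// Sketch only: block navigation away from the conversation while a response
// is still being generated. `generating` is a made-up store for illustration.
import { beforeNavigate } from '$app/navigation';
import { get, writable } from 'svelte/store';

export const generating = writable(false);

export function guardNavigationWhileGenerating() {
  // Must be called during component initialisation (e.g. in a layout script).
  beforeNavigate((navigation) => {
    if (get(generating)) {
      navigation.cancel(); // keep the user on the current conversation
    }
  });
}
```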

## Bug description The preprompt is missing in the rendered chat prompt when a user inputs a message. The expected behavior is for the preprompt to appear before the user's...

bug
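For the missing-preprompt bug above, the expected behaviour can be illustrated with a simplified prompt builder (hypothetical; chat-ui's real prompt templates differ per model):

```ts
// Hypothetical sketch of the expected behaviour: the preprompt should always
// be rendered before the user's messages in the final chat prompt.
interface Message {
  from: 'user' | 'assistant';
  content: string;
}

export function buildPrompt(preprompt: string, messages: Message[]): string {
  const history = messages
    .map((m) => (m.from === 'user' ? `User: ${m.content}` : `Assistant: ${m.content}`))
    .join('\n');
  // Prepend the preprompt so it is part of every rendered prompt.
  return `${preprompt}\n${history}\nAssistant:`;
}
```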

## Bug description The error occurs when the LLM Server suddenly stops while chat-ui continues to send queries to it, eventually causing chat-ui to crash as well....

bug
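On the crash above: in Node, an unhandled rejection from a failed request terminates the process by default, so one defensive pattern is to catch fetch failures and surface them as an error message instead of letting them propagate. A sketch only, not chat-ui's actual code; the endpoint URL and request body follow TGI's `/generate` shape but are used here purely for illustration:

```ts
// Sketch only: fail gracefully when the LLM server is unreachable instead of
// letting the rejection crash the chat-ui process. The URL is illustrative.
export async function queryLlm(prompt: string): Promise<string> {
  try {
    const response = await fetch('http://localhost:8080/generate', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ inputs: prompt }),
    });
    if (!response.ok) {
      throw new Error(`LLM server returned ${response.status}`);
    }
    return await response.text();
  } catch (err) {
    // Report to the user instead of crashing the server process.
    console.error('LLM server unavailable:', err);
    return 'The model server is currently unreachable. Please try again later.';
  }
}
```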