Pao Sheng
hey, I see the error message contains `git-lfs filter-process: git-lfs: command not found`. You could try installing that command first. I recently ran into the same error message on another project, and after installing git-lfs and re-running git clone it worked.
Hi @lsky-walt, from the log and context, I don’t know why it happened. If possible, can you integrate with https://langfuse.com/ (cloud or self-hosted are both okay)? Just put the Langfuse...
Hey @lsky-walt, the local log is tricky to follow; the error message is:
```
    reformated_json[table["table_name"]] = table.get("table_contents", None)
                    ~~~~~^^^^^^^^^^^^^^
KeyError: 'table_name'
```
If it is a KeyError for table_name,...
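For context, a defensive version of the line in the traceback would skip tables that lack a `table_name` key instead of raising. This is a minimal sketch, not the project's actual code; `reformated_json` and `table` follow the names in the traceback, and the sample data is hypothetical:

```python
# Hypothetical sample data: one well-formed table, one missing "table_name".
tables = [
    {"table_name": "orders", "table_contents": ["id", "total"]},
    {"table_contents": ["orphaned"]},  # malformed entry, no "table_name"
]

reformated_json = {}
for table in tables:
    # Use .get() for the key as well, mirroring the .get() already used
    # for "table_contents", so a malformed entry is skipped, not fatal.
    name = table.get("table_name")
    if name is None:
        continue
    reformated_json[name] = table.get("table_contents", None)

print(reformated_json)  # only the well-formed table survives
```

Whether skipping is the right behavior depends on why `table_name` is missing in the first place, which is what the log should tell us.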
Hi @Ahmed-ao, here is my config.yaml for the Ollama model and embedder. I think you can base yours on it and modify it for your setup.
```
models:
  - api_base: http://host.docker.internal:11434/...
```
Hey @ravenizzed, @realcarlos, could you guys share your thoughts with us? What kind of option would work better for you: setting up an independent page on the UI, a more dedicated...
Hi @ravenizzed and @realcarlos, thanks so much for your feedback! We’ll chat in the team about how to make the config step even easier. If you have any more thoughts...
hey @Killer2OP, issue #735 has been done in #747. I think we can start working on this issue if that works for you! Then if you need any help, don't...
@Spirizeon Good to hear the update! Thanks for the work! When the PR is ready for review, you can ping me in the reviewers.
> Thanks for the support, I've implemented all the necessary changes but I want to test the endpoints before marking ready for review. Any documentation I can refer to for...
Hi @yuzhi-jiang, I noticed the error is due to the missing kwargs for the llm. If you don't want to specify kwargs for the llm, at least leave `kwargs: {}` in the model...
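As a sketch of what that looks like in config.yaml (the model name and surrounding fields here are illustrative, not the exact schema; only the `kwargs: {}` placement is the point):

```
models:
  - api_base: http://host.docker.internal:11434
    model: llama3:8b   # illustrative model name
    kwargs: {}         # keep an empty mapping rather than omitting the key
```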