WuQic
### Issue type
Other

### Base model
Llama-3-Chinese-Instruct-8B (base model)

Can this model support function calling, or is there a similar dataset that could be used to fine-tune it to add support?
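(Not an official answer, but a common workaround while native support is missing is to prompt the model to emit a JSON tool call and parse the reply yourself. A minimal sketch, assuming the model is served behind an OpenAI-compatible endpoint; the URL, model name, and `get_weather` schema are illustrative and not part of the original question.)

```py
import json
from openai import OpenAI

# Assumed local OpenAI-compatible server; adjust base_url and model name to your setup.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

tool_schema = {
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {"city": "string"},
}

prompt = (
    "You may call the tool below by replying with JSON only, in the form "
    '{"name": ..., "arguments": {...}}.\n'
    f"Tool: {json.dumps(tool_schema)}\n\n"
    "User question: What is the weather in Beijing?"
)

resp = client.chat.completions.create(
    model="llama-3-chinese-instruct-8b",  # illustrative served-model name
    messages=[{"role": "user", "content": prompt}],
)

# The reply is expected (not guaranteed) to be a JSON tool call; parse it defensively.
print(json.loads(resp.choices[0].message.content))
```

Prompt-only tool calling depends entirely on the model following the JSON instruction; for reliable behavior a function-calling fine-tuning dataset would still be needed.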
Where is LLMTask? I can't import it.
```py
from phi.llm.openai import OpenAIChat
from phi.task.llm import LLMTask
from phi.assistant import Assistant
from pydantic import BaseModel, Field


class StoryTheme(BaseModel):
    setting: str
    ...
```
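(A hedged sketch of a possible workaround: some phi/phidata releases dropped `LLMTask`, and structured output can go through `Assistant` with `output_model` instead. Whether your installed version exposes these parameters needs checking; the field definitions and model name below are illustrative.)

```py
from phi.llm.openai import OpenAIChat
from phi.assistant import Assistant
from pydantic import BaseModel, Field


class StoryTheme(BaseModel):
    setting: str = Field(..., description="Setting of the story")  # illustrative fields
    genre: str = Field(..., description="Genre of the story")


story_assistant = Assistant(
    llm=OpenAIChat(model="gpt-4o"),
    description="You generate story themes.",
    output_model=StoryTheme,  # assumed: structured output via Assistant, replacing LLMTask
)

# If output_model is honored, run() should return a parsed StoryTheme instance.
theme = story_assistant.run("A detective story set in a rainy city")
print(theme)
```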
When trying to use GPU acceleration locally, I encountered an error. Here are the steps I took. Could you please help identify where I might have gone wrong in my...
Using this config, the model can't answer the question:
```yaml
name: llama3-70b-chatQA
mmap: true
context_size: 8192
#threads: 11
#gpu_layers: 90
f16: true
parameters:
  model: Llama3-ChatQA-1.5-70B-Q4_K_M.gguf
function:
  # set to true to allow...
```
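(A quick sanity check, independent of the question content: hit LocalAI's OpenAI-compatible endpoint directly to confirm the model declared in the `name:` field loads and replies at all. The port and prompt below are assumptions about the local setup.)

```py
from openai import OpenAI

# LocalAI exposes an OpenAI-compatible API; the port is the assumed default here.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="llama3-70b-chatQA",  # must match the `name:` field in the YAML config
    messages=[{"role": "user", "content": "Reply with the single word: pong"}],
)
print(resp.choices[0].message.content)
```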