chairuilin

Results 5 comments of chairuilin

llama3 configuration:

```python
LLM_MODEL = "ollama-llama3(max_token=4096)"
AVAIL_LLM_MODELS = ["one-api-claude-3-sonnet-20240229(max_token=100000)", "ollama-llama3(max_token=4096)"]
# If your model is llama2, put "llama2" here. Note: make sure you do not get this wrong.
API_URL_REDIRECT = {"http://localhost:11434/api/chat": "http://:11434/api/chat"}  # your address
```

Below is the reason; read it if you are interested. It comes down to the essence of how the model is called: the `requests` call must carry the exact model name for the request to be served correctly; "ollama" is the manager, not a model name.

```python
import requests
url = 'http://*******:11434/api/chat'
data = {
    "model": "llama3",
    "messages": [...
```
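To illustrate the point above, here is a minimal sketch of what a complete `/api/chat` request body could look like. The helper name `build_chat_payload`, the example message, and the `stream` flag are my assumptions for illustration, not part of the original comment:

```python
import json


def build_chat_payload(model, user_content):
    # Hypothetical helper: the "model" field must be the exact model name
    # known to the ollama server (e.g. "llama3"), not the string "ollama".
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_content}],
        "stream": False,  # request a single JSON response instead of a stream
    }


payload = build_chat_payload("llama3", "Hello!")
print(json.dumps(payload, indent=2))

# Sending it would look like this (needs a running ollama server,
# so it is not executed here):
#   import requests
#   resp = requests.post("http://localhost:11434/api/chat", json=payload)
```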

I read his code and have some questions (maybe my coding skills are just poor), and I think his code does not completely match his paper. Bro, can you discuss it with me?

It feels a bit hard to use.

A mindmap is essentially just `#` and `##`, so a small tag indicating whether a node is a single `#` or a double `##` would be enough. (Supporting drag-and-drop would be a lot of work, but grouping each node under the nearest preceding heading is also a simpler approach.) (In other words, `##` and `##` are on the same level, and three `#` becomes a child of the nearest preceding node.)
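The nearest-heading grouping idea above can be sketched as follows; the function name and the sample headings are illustrative assumptions, not part of the original comment:

```python
def headings_to_outline(lines):
    # Turn markdown headings into an indented outline: the number of
    # leading '#' characters gives the depth, and a deeper heading
    # nests under the nearest preceding shallower one.
    outline = []
    for line in lines:
        stripped = line.lstrip("#")
        depth = len(line) - len(stripped)
        if depth > 0:
            outline.append("  " * (depth - 1) + stripped.strip())
    return outline


sample = ["# Root", "## Child A", "### Grandchild", "## Child B"]
for entry in headings_to_outline(sample):
    print(entry)
# prints:
# Root
#   Child A
#     Grandchild
#   Child B
```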