z-zeechung
The second point is incorrect: the Qianfan platform's limit is not 8000 tokens but 8000 characters. This causes some plugins' requests to be too long and fail. I'd like validation added in the generate_from_baidu_qianfan function to compress or truncate the request when it is too long.
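A minimal sketch of the kind of check being requested, assuming an 8000-character platform limit; the helper name and the tail-keeping strategy are illustrative assumptions, not the project's actual code:

```python
# Hypothetical sketch: clamp a prompt so a request to Baidu Qianfan stays
# under the platform's 8000-*character* (not token) limit. The function name
# and truncation strategy are assumptions for illustration only.

QIANFAN_CHAR_LIMIT = 8000

def truncate_for_qianfan(prompt: str, limit: int = QIANFAN_CHAR_LIMIT) -> str:
    """Return prompt unchanged if it fits, else keep the most recent text."""
    if len(prompt) <= limit:
        return prompt
    # Keep the tail: the most recent context is usually the most relevant,
    # so earlier text is dropped first.
    return prompt[-limit:]
```

generate_from_baidu_qianfan could call a check like this before sending the request, instead of letting the API reject an over-long payload.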
Nice work. I'm going to implement a multi-LLM interface for LangChain.
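One way such a multi-LLM interface could be shaped is a common base class that different backends implement; this is a hypothetical dependency-free sketch, and the class and method names are illustrative, not LangChain's actual API:

```python
# Hypothetical sketch of a multi-LLM interface: callers code against one
# abstract base class, and each provider (e.g. Qianfan, OpenAI) supplies
# its own implementation. All names here are assumptions for illustration.
from abc import ABC, abstractmethod

class BaseLLM(ABC):
    @abstractmethod
    def generate(self, prompt: str) -> str:
        """Return the model's completion for the given prompt."""

class EchoLLM(BaseLLM):
    """Toy backend used as a stand-in for a real provider."""
    def generate(self, prompt: str) -> str:
        return f"echo: {prompt}"

def run(llm: BaseLLM, prompt: str) -> str:
    # Callers depend only on the interface, so backends are swappable.
    return llm.generate(prompt)
```

With this shape, swapping providers means constructing a different BaseLLM subclass; the calling code does not change.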
Let's build some infrastructure like an LLM API, a knowledge base, a browser, a shell environment, and so on. It can't hurt, at least.
AutoGPT has a knowledge base, so I guess we'll do the same.
Count me in as well, thanks.