lagent
A lightweight framework for building LLM-based agents
I really like your framework and am currently trying it out locally for evaluation. Does the Lagent framework support Ollama? If not, is it possible to work around this and use my local Ollama...
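As of the version this question targets, Lagent ships no dedicated Ollama backend, but Ollama exposes an OpenAI-compatible endpoint, so one workaround is to treat it as an OpenAI-style server. A minimal config sketch, where the endpoint follows Ollama's default local install and the model name and key field are assumptions:

```python
# Hypothetical config for pointing an OpenAI-style client at a local Ollama.
# Ollama's OpenAI-compatible API listens on port 11434 under /v1 by default.
OLLAMA_CONFIG = {
    "api_base": "http://localhost:11434/v1/chat/completions",
    "model_name": "llama3",  # any model previously pulled with `ollama pull llama3`
    "key": "ollama",         # Ollama ignores the key; some clients require a non-empty value
}
```

Any client that accepts a custom base URL and model name can then be aimed at this endpoint.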
I have taken a look inside lagent_example.py and react.py. It seems that the ReAct prompt does not incorporate the previous output as new input. Is this a systematic mistake,...
# Add OpenAI-style server support to llm (custom proxy address and model name) While using MindSearch, I found that OpenAI-style servers with a custom proxy address and model name could not be used. Since GPTAPI performs model validation (among other reasons), and following the add-only principle, I extended GPTAPI into GPTStyleAPI: ## Supported OpenAI-style servers: Tested and passing: - xinference 1.2.0 - ollama 0.5.5 - one-api v0.6.10-alpha.6 - baichuan (direct connection) - lmdeploy 0.7.0 ## Usage: Set api_base, model_name, and key (the key can be omitted if the server does not require it) ``` api_base = 'http://192.168.26.213:13000/v1/chat/completions'...
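The PR above configures an OpenAI-style backend from three values: api_base, model_name, and an optional key. A minimal sketch of how such a request might be assembled — `build_chat_request` is a hypothetical helper for illustration, not part of Lagent or the PR:

```python
from typing import Optional

def build_chat_request(api_base: str, model_name: str, key: Optional[str] = None):
    """Return (url, headers, payload) for an OpenAI-style /v1/chat/completions call."""
    headers = {"Content-Type": "application/json"}
    if key:  # the key is omitted when the server does not require one
        headers["Authorization"] = f"Bearer {key}"
    payload = {
        "model": model_name,
        "messages": [{"role": "user", "content": "hello"}],
    }
    return api_base, headers, payload
```

The resulting triple could then be sent with, e.g., `requests.post(url, headers=headers, json=payload)`; servers such as xinference, ollama, and one-api accept this request shape because they all implement the same OpenAI-style endpoint.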
The API is valid but I got an invalid response, and I am not sure where I went wrong. ``` import os import logging import warnings as wa wa.warn_explicit = wa.warn = lambda...
# Version used Commit #285, main branch. # Problem overview When trying to get an agent reply with the following code: ```python from lagent.agents import Agent from lagent.schema import AgentMessage from lagent.llms import GPTAPI # qwen agent config taken from the MindSearch project url = "https://dashscope.aliyuncs.com/api/v1/services/aigc/text-generation/generation" qwen =...
1. Read OPENAI_API_BASE from the local environment variable. 2. Support a local OpenAI-compatible LLM service.
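The two points above can be sketched together: read the base URL from the environment and fall back to the official endpoint, so a local OpenAI-compatible service can be targeted without code changes. The helper name and default constant are assumptions, not Lagent's actual implementation:

```python
import os

# Fallback when OPENAI_API_BASE is not set (official OpenAI endpoint).
DEFAULT_API_BASE = "https://api.openai.com/v1/chat/completions"

def resolve_api_base() -> str:
    """Prefer the OPENAI_API_BASE environment variable over the default,
    so a local OpenAI-compatible service can be used transparently."""
    return os.environ.get("OPENAI_API_BASE", DEFAULT_API_BASE)
```

For example, `export OPENAI_API_BASE=http://localhost:8000/v1/chat/completions` would redirect all requests to a local server.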
Based on the new version of Lagent, this project calls the InternLM (浦语) API through the GPTAPI class, launches a web service with Streamlit, and integrates the Arxiv paper-search tool. Updated 11/22: all sidebar features are now usable, and a usage example for the SiliconFlow (硅基流动) API is provided.