The agent does not use the tool when generating its response?
I referred to https://github.com/langchain-ai/langserve/blob/main/examples/agent/server.py to build my own RAG agent. My test code is as follows:
from fastapi import FastAPI
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder
from langserve import add_routes
from langchain_openai import ChatOpenAI
from langchain_huggingface import HuggingFaceEmbeddings
from langchain_milvus import Milvus
from pydantic.v1 import BaseModel
from typing import Any
from langchain.agents import AgentExecutor
from langchain.agents.format_scratchpad import format_to_openai_functions
from langchain_core.utils.function_calling import format_tool_to_openai_function
from langchain.agents.output_parsers import OpenAIFunctionsAgentOutputParser
from langchain_core.tools import tool
llm_end_point_url = "http://***:***/v1/"
model = ChatOpenAI(model="glm4v-9b", base_url=llm_end_point_url, api_key="api_key")
### embedding ###
embedding_model = HuggingFaceEmbeddings(model_name='/root/ljm/bge/bge-large-zh-v1.5')
### milvus ###
milvus_host = "***"
milvus_port = ***
collection_name = "langchain_lichi_txt"
vector_store = Milvus(
    embedding_function=embedding_model,
    collection_name=collection_name,
    connection_args={"host": milvus_host, "port": milvus_port, "db_name": "glm3"},  # "db_name" selects the Milvus database
)
retriever = vector_store.as_retriever(search_type="similarity", search_kwargs={"k": 3})
@tool
def litchi_rag(query: str) -> list:
    """该工具能够对关于荔枝的专业知识进行总结和介绍,并回答问题。"""  # "Summarizes and explains specialist knowledge about litchi and answers questions."
    return retriever.get_relevant_documents(query)
tools = [litchi_rag]
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "你是一位农业知识助手。"),  # "You are an agricultural knowledge assistant."
        ("user", "{input}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)
llm_with_tools = model.bind(functions=[format_tool_to_openai_function(t) for t in tools])
agent = (
    {
        "input": lambda x: x["input"],
        "agent_scratchpad": lambda x: format_to_openai_functions(
            x["intermediate_steps"],
        ),
    }
    | prompt
    | llm_with_tools
    | OpenAIFunctionsAgentOutputParser()
)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
class Input(BaseModel):
    input: str

class Output(BaseModel):
    output: Any
app = FastAPI(
    title="GLM4 LangChain Server",
    version="1.0",
    description="A simple api server using LangChain's Runnable interfaces",
)
add_routes(
    app,
    agent_executor.with_types(input_type=Input, output_type=Output).with_config(
        {"run_name": "agent"}
    ),
    path="/Litchi_RAG",
)
if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="****", port=8010)
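A minimal local check like the following (a sketch, not part of the server above; the query string is just an example) would show whether the tool is ever requested before langserve is involved:
# Hypothetical local smoke test: build a second executor with return_intermediate_steps=True
# so the output records whether litchi_rag was ever invoked.
debug_executor = AgentExecutor(
    agent=agent, tools=tools, verbose=True, return_intermediate_steps=True
)
result = debug_executor.invoke({"input": "深圳的荔枝有哪些品种?"})
print(result["intermediate_steps"])  # an empty list means the model never asked for the tool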
Then I query it from the client:
from langserve import RemoteRunnable

remote_runnable = RemoteRunnable("http://**:***/Litchi_RAG")
query = "深圳的荔枝有哪些品种?"  # "Which litchi varieties does Shenzhen have?"
response = remote_runnable.invoke({"input": query})
print(response)
The AgentExecutor's verbose output is as follows:
> Entering new AgentExecutor chain...
Shenzhen is located in Guangdong Province, China, and its warm climate is well suited to growing litchi. Varieties widely planted in the Shenzhen area include, but are not limited to:
1. **Feizixiao (妃子笑)**: one of the most common litchi varieties in Shenzhen, popular for its sweet taste and early ripening.
2. **Guiwei (桂味)**: also common in Shenzhen, known for its crisp flesh and sweet flavor.
3. **Baitangying (白糖罂)**: plump flesh, bright color, and a clean, sweet taste that consumers love.
4. **Heiye (黑叶)**: larger fruit with firm, sweet flesh; an excellent litchi variety.
Besides the varieties above, Shenzhen may also grow other locally distinctive litchi varieties, and the specific mix can change over time with market demand. For the latest variety information, consult a local farmers' cooperative or the agricultural technology extension department.
> Finished chain.
I notice that the agent did not call the tool to do RAG; it just responded directly. Did I miss anything? I am stuck here. Any help would be greatly appreciated.
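One thing I have not yet ruled out is whether the OpenAI-compatible endpoint serving glm4v-9b supports OpenAI function calling at all; if it silently ignores the functions parameter, OpenAIFunctionsAgentOutputParser only ever sees plain text and returns a final answer immediately. A minimal check would look something like this (a sketch, reusing llm_with_tools from the server code; the query is just an example):
# Hypothetical server-side check: does the backend ever emit a function_call?
msg = llm_with_tools.invoke("深圳的荔枝有哪些品种?")
print(msg.additional_kwargs)
# If function calling works and the model decides to use the tool, this dict should
# contain a "function_call" entry; if it is always empty, the backend is ignoring
# the bound functions and the agent can never route to litchi_rag.
Another option would be calling remote_runnable.astream_log(...) from the client (add_routes exposes /stream_log by default) and checking whether a litchi_rag run ever appears in the log.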