
[<Agent component: framework|tool|llm|etc...>]

Open · kbzh2558 opened this issue 1 year ago • 1 comment

Description

How can I run a RAG agent so that it autonomously carries out multi-turn conversations? Thanks.

例如:PROMPT_TEMPLATE="""

我的问题或指令:

{question}

请根据下面的参考信息回答我的问题或回复我的指令,并遵循以下改进后的指南: . . .

您的回复应该遵循以下改进后的结构: . . .


参考信息:

{context} """

The template I am currently following is llamaindex_rag.ipynb from the rag examples.

```python
import logging
import os
import sys

logging.basicConfig(stream=sys.stdout, level=logging.INFO)
logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))

from llama_index.core import (
    SimpleDirectoryReader,
    VectorStoreIndex,
    Settings
)
from modelscope import snapshot_download

# Specify multiple document paths
document_paths = [
    r'D:\PPT&WORD\folder\knowledge'  # raw string so backslashes are not treated as escapes; add more paths as needed
]

# Load data from multiple documents
documents = []
for path in document_paths:
    documents.extend(SimpleDirectoryReader(path).load_data())

# Download the Chinese GTE sentence-embedding model from ModelScope and use it locally
embedding_name = 'damo/nlp_gte_sentence-embedding_chinese-base'
local_embedding = snapshot_download(embedding_name)
Settings.embed_model = "local:" + local_embedding

index = VectorStoreIndex.from_documents(documents)

os.environ['ZHIPU_API_KEY'] = 'apikey'

from modelscope_agent.agents import RolePlay

# Role: "knowledge-base query assistant that answers user questions by consulting the local knowledge base first"
role_template = '知识库查询小助手,可以优先通过查询本地知识库来回答用户的问题'
llm_config = {
    'model': 'GLM-4',
    'model_server': 'zhipu'
}
function_list = []

bot = RolePlay(function_list=function_list, llm=llm_config, instruction=role_template)

# Retrieve the top-3 most similar chunks for the query
index_ret = index.as_retriever(similarity_top_k=3)
query = "西安交通大学图书馆有几部分组成?"  # "How many parts does the XJTU library consist of?"
result = index_ret.retrieve(query)
print(result)

# Pass the retrieved text to the agent as reference documents and stream the answer
ref_doc = ' '.join([doc.text for doc in result])
response = bot.run(query, remote=False, print_info=True, ref_doc=ref_doc)
text = ''
for chunk in response:
    text += chunk
print(text)
```

Link

No response

kbzh2558 · Jun 07 '24 05:06

You can refer to the following code:

```python
from modelscope_agent.memory import MemoryWithRag
from modelscope_agent.agents import RolePlay

# Role: "knowledge-base query assistant that answers user questions by consulting the local knowledge base first"
role_template = '知识库查询小助手,可以优先通过查询本地知识库来回答用户的问题'
llm_config = {
    'model': 'GLM-4',
    'model_server': 'zhipu'
}
function_list = []
file_paths = ['./tests/samples/常见QA.pdf']

bot = RolePlay(function_list=function_list, llm=llm_config, instruction=role_template)
memory = MemoryWithRag(urls=file_paths, use_knowledge_cache=False)
use_llm = True if len(function_list) else False

query = "高德天气API在哪申请"  # "Where do I apply for the AMap weather API?"
ref_doc = memory.run(query, use_llm=use_llm)

response = bot.run(query, remote=False, print_info=True, ref_doc=ref_doc)
text = ''
for chunk in response:
    text += chunk
print(text)
```
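For the multi-turn behaviour asked about in the issue, one option is to repeat the retrieve-then-generate step once per turn. Below is a minimal sketch built on the snippet above; folding earlier turns into the query string is my own assumption (modelscope-agent may provide a dedicated history mechanism), not the library's documented multi-turn API:

```python
# Minimal multi-turn loop built on the snippet above (sketch, not an official API).
history = []  # (user_query, assistant_answer) pairs accumulated across turns

def chat_once(user_query: str) -> str:
    # Retrieve reference text for the current question from the knowledge base.
    ref_doc = memory.run(user_query, use_llm=use_llm)
    # Fold earlier turns into the query as plain text (assumption, see above).
    past = '\n'.join(f'User: {q}\nAssistant: {a}' for q, a in history)
    full_query = f'{past}\nUser: {user_query}' if past else user_query
    response = bot.run(full_query, remote=False, print_info=True, ref_doc=ref_doc)
    answer = ''.join(response)  # the agent streams chunks; join them into one string
    history.append((user_query, answer))
    return answer

print(chat_once('高德天气API在哪申请'))
print(chat_once('申请的时候需要提供什么信息?'))  # follow-up that relies on the previous turn
```

Note that retrieval here still uses only the latest question; rewriting follow-up questions into standalone queries before calling memory.run is left out of this sketch.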

suluyana · Jun 13 '24 06:06