ChatKBQA
[ACL 2024] Official resources of "ChatKBQA: A Generate-then-Retrieve Framework for Knowledge Base Question Answering with Fine-tuned Large Language Models".
Traceback (most recent call last):
  File "LLMs/LLaMA/src/train_bash.py", line 16, in <module>
    main()
  File "LLMs/LLaMA/src/train_bash.py", line 7, in main
    run_exp()
  File "C:\Code\ChatKBQA-main\LLMs\LLaMA\src\llmtuner\tuner\tune.py", line 26, in run_exp
    run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)...
Hello, I am attempting to reproduce the results of ChatKBQA on the WebQSP dataset, and I have some confusion regarding the metrics used. Specifically, I am trying to determine which...
Your work is really good and has given me a lot of inspiration, but when I run the following command, this error occurs: CUDA_VISIBLE_DEVICES=1 nohup python -u eval_final.py --dataset WebQSP...
Generalization ability
Can a trained model be used directly on any new knowledge graph? Or must this method be fine-tuned on the new graph before it can be used?
Hi, after running the python parse_sparql_webqsp.py script, S-expressions are generated, but some entries are null in the reproduced result. Is this reasonable? Thank you for your assistance. Best regards. ...
Hi, what are the values of the hyper-parameters rank and alpha?
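Rank and alpha here are the standard LoRA hyper-parameters: rank sets the dimension of the low-rank update, and alpha scales it (effective scale = alpha / rank). A minimal NumPy sketch of that interaction, with illustrative values rather than ChatKBQA's actual settings:

```python
import numpy as np

# Illustrative LoRA hyper-parameters (not the repo's configured values).
r, alpha = 8, 16              # low-rank dimension and scaling numerator
d_in, d_out = 64, 64          # layer dimensions (illustrative)

rng = np.random.default_rng(0)
W = rng.normal(size=(d_out, d_in))   # frozen pretrained weight
A = rng.normal(size=(r, d_in))       # trainable low-rank factor
B = np.zeros((d_out, r))             # initialized to zero, so the update starts at 0

scaling = alpha / r                  # effective update scale
W_adapted = W + scaling * (B @ A)    # merged weight at inference time

print(np.allclose(W_adapted, W))     # True: B is zero, so no change yet
```

Because B starts at zero, the adapted weight equals the pretrained weight until training moves B; alpha / rank then controls how strongly the learned update is applied.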
Hello, my friend. During training of LLaMA2-13B on an A30 GPU with 24 GB of memory, I am facing an error concerning GPU memory allocation. Are there any...
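A back-of-envelope estimate (my own arithmetic, not tooling from the repo) shows why this fails: the fp16 weights of a 13B-parameter model alone roughly fill 24 GB, before activations, gradients, or optimizer state, which is why quantized loading is usually needed on such a card.

```python
# Rough weight-memory estimate for a 13B-parameter model (an assumption
# for illustration; actual usage also includes activations and optimizer state).
PARAMS = 13e9
GIB = 1024 ** 3

fp16_gib = PARAMS * 2 / GIB     # 2 bytes per parameter -> ~24.2 GiB
int4_gib = PARAMS * 0.5 / GIB   # 0.5 bytes per parameter -> ~6.1 GiB

print(f"fp16 weights: {fp16_gib:.1f} GiB")
print(f"4-bit weights: {int4_gib:.1f} GiB")
```

So even loading the weights in fp16 exceeds the A30's 24 GB once any working memory is added, while 4-bit quantization leaves ample headroom.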
Hello! Can this project be used on a new knowledge graph? How can the S-expressions generated by the fine-tuned LLM be converted into SPARQL for other graphs?
While running the code, the loss starts at around 300, then suddenly drops to zero and stays there. What could be the problem? The dataset was downloaded from the Baidu Netdisk link and placed as instructed.
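Converting an S-expression to another graph's SPARQL first requires parsing it into a tree; the relation/entity mapping is then graph-specific. A hypothetical sketch of that parsing step (the function name and example expression are illustrative, not taken from ChatKBQA's code):

```python
# Hypothetical sketch: parse an S-expression string into nested Python lists,
# the structure one would walk to emit SPARQL for a target knowledge graph.
def parse_sexpr(text):
    tokens = text.replace("(", " ( ").replace(")", " ) ").split()
    stack, current = [], []
    for tok in tokens:
        if tok == "(":
            stack.append(current)   # remember the enclosing expression
            current = []
        elif tok == ")":
            inner, current = current, stack.pop()
            current.append(inner)   # attach the finished sub-expression
        else:
            current.append(tok)     # plain symbol (operator, relation, entity)
    return current[0]

print(parse_sexpr("(JOIN (R people.person.place_of_birth) m.0d3k14)"))
# -> ['JOIN', ['R', 'people.person.place_of_birth'], 'm.0d3k14']
```

With the tree in hand, porting to a new graph means replacing relation and entity identifiers with that graph's vocabulary before serializing to SPARQL.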
Hello! First, thank you for sharing your work; I greatly appreciate it. While running your code, I encountered some issues during the inference stage. Specifically, when I ran the following...