ChatGLM-6B

I am using this for Question Answering, where I provide it the context.

Open allthingssecurity opened this issue 1 year ago • 1 comment

Is there an existing issue for this?

  • [X] I have searched the existing issues

Current Behavior

Sometimes it answers outside of the given context/text, but I want it to answer only from the text. It also gives wrong answers in some cases: for example, if the text says a person can avail a certain policy, it pulls that text but says they can't. I'm not sure how to prompt it better.
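
One thing that sometimes helps is to fold the instruction and the context into a single prompt and explicitly tell the model to answer only from that text. A rough sketch, assuming `model` and `tokenizer` are already loaded as in the README; the instruction wording and the placeholder context/question are just illustrations, not an official recommendation:

```python
# Prompt template that asks the model to stay within the given context.
# The exact instruction wording is an assumption; adjust it to your use case.
context = """<paste the policy text / document here>"""
question = "Can this person avail the policy?"

prompt = (
    "Answer the question using ONLY the context below. "
    "If the answer is not contained in the context, reply exactly: I don't know.\n\n"
    f"Context:\n{context}\n\n"
    f"Question: {question}\nAnswer:"
)

# model.chat is the API shown in the ChatGLM-6B README.
response, history = model.chat(tokenizer, prompt, history=[])
print(response)
```

Lowering `temperature` (and/or `top_p`), which `model.chat` exposes as keyword arguments, may also reduce how often the answer drifts away from the provided text.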

Expected Behavior

It should give more factual answers, I feel.

Steps To Reproduce

Give it a large context containing certain facts and ask it questions about those facts. A minimal reproduction sketch is included below.
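
For reference, a minimal sketch of the reproduction on a Colab GPU, following the loading code from the README; the context and question here are placeholders, not the actual document:

```python
from transformers import AutoTokenizer, AutoModel

# Load ChatGLM-6B as shown in the README (FP16 on GPU).
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
model = model.eval()

# A long context containing a specific fact, followed by a question about it.
context = "<several paragraphs of policy text stating that the person CAN avail the policy>"
prompt = f"{context}\n\nBased only on the text above, can the person avail the policy?"

response, _ = model.chat(tokenizer, prompt, history=[])
print(response)  # Sometimes contradicts the context, e.g. says the person cannot.
```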

Environment

- OS: Linux
- Python: 3.10
- Transformers:
- PyTorch:
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) :

Using colab environment

Anything else?

Nothing

allthingssecurity avatar Mar 19 '23 02:03 allthingssecurity

In the Chinese version of the README, there is a "Limitations" section.

IMHO, these are common limitations of every LLM that can be deployed on a consumer graphics card.

evshiron avatar Mar 19 '23 05:03 evshiron