Alpaca 13B repeats answers

After giving an initial answer, it keeps repeating the end of the answer continuously. What should I do to fix this behaviour? Thanks!
Similar with Alpaca 7B
Same issue here.
Couldn't figure out the issue, so I just went back to Llama 7B. You can adjust the parameters as follows for it:
prompt: (required) The prompt string
model: llama.7B (a larger model)
url: (optional) if you want to connect to a remote server, otherwise it will use the node.js API to run locally.
threads: 8 (The default number of threads should work well for a larger model)
n_predict: 128 (The default number of tokens to return)
seed: (optional) A specific seed, if you want reproducible results
top_k: 100 (Expanding the token selection to the top 100 for a larger model)
top_p: 0.95 (Allow for more diversity in the output)
repeat_last_n: 15 (Prevent repetition of the last 15 tokens)
repeat_penalty: 1.1 (Apply a moderate penalty to discourage repetition)
temp: 1.0 (Use a higher temperature for more diverse and creative output)
batch_size: 2 (For a larger model, a slightly larger batch size can be used)
skip_end: (optional) Set to true if you do not want the response to end with \n\n
I've also switched to the dalai npm package for Node.js; it automatically picks up any models you install in the Dalai folder.
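For reference, here is roughly how those parameters plug into the dalai npm package from Node.js. This is only a minimal sketch based on the request() API shown in the dalai README; the prompt is just a placeholder and the values mirror the list above:

const Dalai = require('dalai')

new Dalai().request({
  model: 'llama.7B',
  prompt: 'Write a short poem about the sea.',   // placeholder prompt
  threads: 8,
  n_predict: 128,
  top_k: 100,
  top_p: 0.95,
  repeat_last_n: 15,
  repeat_penalty: 1.1,
  temp: 1.0,
  batch_size: 2,
}, (token) => {
  process.stdout.write(token)   // tokens stream in as they are generated
})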

Got appropriate results by modifying the prompt as suggested in https://github.com/cocktailpeanut/dalai/issues/166#issuecomment-1477163971:
sampling parameters: temp = 0.100000, top_k = 100, top_p = 0.900000, repeat_last_n = 64, repeat_penalty = 1.100000
Below is an instruction that describes a task. Write a response that appropriately completes the request.
Question: How to cook fried eggs?
Answer: First, crack two eggs into a bowl and whisk them together with salt and pepper. Next, heat up a frying pan over medium-high heat and add some olive oil or butter. Then, pour in the egg mixture and let it cook for about three minutes, until the edges start to set. Finally, use a spatula to gently lift one side of the eggs and fold it over the other side.
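If you want to apply that format programmatically, wrapping the user's question in the instruction template before sending it to the model looks roughly like this (a sketch only; buildPrompt is a made-up helper, and the sampling values mirror the ones quoted above):

const Dalai = require('dalai')

// hypothetical helper: wrap a raw question in the instruction-style template
function buildPrompt(question) {
  return 'Below is an instruction that describes a task. ' +
    'Write a response that appropriately completes the request.\n\n' +
    'Question: ' + question + '\nAnswer:'
}

new Dalai().request({
  model: 'alpaca.7B',
  prompt: buildPrompt('How to cook fried eggs?'),
  temp: 0.1,
  top_k: 100,
  top_p: 0.9,
  repeat_last_n: 64,
  repeat_penalty: 1.1,
}, (token) => process.stdout.write(token))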
This is a known issue across many models (https://arxiv.org/pdf/2012.14660.pdf). I wonder how ChatGPT makes sure this does not happen. Maybe it really is just prompt design?
I just wanted to leave a comment here so I could participate in this thread. I am also experiencing repetitive behavior. It doesn't happen all the time, but when it does it will repeat the last phrase over and over.
I read an article somewhere last year saying that OpenAI saw similar repetitive behaviour in early builds of Davinci.
I am using Alpaca 7B.
I wonder if GPT-3 reads over what it has written and stops if it begins to repeat itself.
A regex could detect this, but really we need a way to stop the bot from responding without having to kill node.
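For what it's worth, a stopgap on the bot side is to watch the streamed tokens and stop relaying them once the tail of the output starts looping. This is only a sketch; REPEAT_RE, onToken and the 16-character threshold are my own assumptions, and it only stops the bot from forwarding more text rather than aborting the underlying llama process:

// detect when the tail of the generated text is just the same phrase looping
const REPEAT_RE = /([\s\S]{16,}?)\1+\s*$/   // a chunk of 16+ chars repeated back-to-back at the end

let output = ''
let looping = false

function onToken(token) {
  if (looping) return                        // drop everything after the loop is detected
  output += token
  if (REPEAT_RE.test(output)) {
    looping = true
    output = output.replace(REPEAT_RE, '$1') // keep a single copy of the repeated phrase
  }
}

// pass onToken as the token callback to dalai's request(), then reply with `output`
// once generation finishes (or as soon as `looping` flips to true)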
I configured mine to give shorter responses, maybe this will help:
model: "llama.13B",
prompt: removeSubstrings(message.content, false),
n_predict: 40,
top_k: 40,
top_p: 0.85,
temp: 0.6,
repeat_last_n: 64,
repeat_penalty: 1.0,
batch_size: 1,
skip_end: true,
},