llama2.c
How do we go from your stories model to a more practical Q&A chat model?
If we could create models targeted at a custom dataset rather than just "stories", this would become much more useful.
For instance, how would we make it work for a Bible/scriptures dataset? And secondly, would it be reliable enough to produce correct results for a given prompt? Samples that can respond accurately to conversational prompts would be far more practical.
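To make the question concrete, here is a rough sketch of what I imagine the data-preparation step would look like, loosely modeled on the pretokenization in tinystories.py. The file names (`bible.txt`, `bible_tokens.bin`), the `tokenizer.model` path, and the assumption that training consumes a flat uint16 token stream with a BOS token before each example are all mine; please correct anything that doesn't match how the repo actually works.

```python
# Sketch only: pretokenize a custom plain-text corpus for llama2.c-style training.
# Assumes a SentencePiece tokenizer.model (e.g. the Llama 2 tokenizer) and that
# the training loader reads a flat uint16 token stream from a .bin file.
import numpy as np
import sentencepiece as spm

sp = spm.SentencePieceProcessor(model_file="tokenizer.model")

all_tokens = []
with open("bible.txt", "r", encoding="utf-8") as f:
    for line in f:
        text = line.strip()
        if not text:
            continue
        # prepend BOS so each verse/paragraph starts a fresh example
        all_tokens.append(sp.bos_id())
        all_tokens.extend(sp.encode(text))

# write the token stream as uint16, the format I assume the data loader expects
np.array(all_tokens, dtype=np.uint16).tofile("bible_tokens.bin")
print(f"wrote {len(all_tokens)} tokens")
```

If something like this is the right direction, the remaining questions would be how to structure the data for conversational Q&A (prompt/response pairs rather than raw text) and how reliable the answers would be at these small model sizes.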
Thanks.
#462