Leandro Benedet Garcia
> Start by using a model such as llama, by launching text-generation-webui with `--load-in-8bit` with a 7B or 13B model.

Does it have to be a 7B or 13B model...
Hmm, I guess we should check all their repos to see if it is a repeat offense.
@henryiii I added you to the fork; I tried with just the pull request, but the CI showed the same error.
Alrighty, since it seems to be working now I will put it up for review. Thank you, @henryiii.
> What's the status of this PR? Anything left to do, should I review?

It is ready for review.
> Any progress on restructuring the PR? Should I close it for now?

I will try my hand at it this weekend.
> FYI: https://x.com/kylelukaszek/status/1832979451203686676

Sorry, could you post a screenshot? Twitter is banned in my country.
@ashtonmeuser I rebased onto the extensions branch, but GitHub seems to have gotten confused by it; I might need to make a new PR.
> Out of curiosity, is it ClangFormat that's causing the massive diff? If so, I'm not opposed to a formatting of the entire repo (although I'll do this separately from...
Hello, just wanted to know if there is anything preventing this from getting merged.