Nicolas Patry
> The actual prompt comes last, which could be truncated away

It's always left-truncated for generative models, so only the initial part of a prompt is lost. If you're using...
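For context, left truncation keeps the most recent tokens and drops the beginning of the sequence. A minimal sketch (the function name is illustrative, not an actual library API):

```python
# Hypothetical sketch of left truncation: keep the last `max_length`
# tokens so only the initial part of the prompt is lost.
def left_truncate(token_ids, max_length):
    """Drop tokens from the left when the sequence is too long."""
    if len(token_ids) <= max_length:
        return list(token_ids)
    return list(token_ids[-max_length:])

print(left_truncate([1, 2, 3, 4, 5], 3))  # -> [3, 4, 5]
```

This matters for generative models because the tokens nearest the end of the prompt are the ones the model conditions on most directly.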
Knowing the length in tokens is not really useful if you don't know how to modify the original prompt in order to change those tokens, right?

> Pick a...
Can you try again? It seems better now.
> What do you think would make more sense @Narsil @osanseviero ?

Both are fine with me. I don't necessarily know the scoping of this vs. optimum and such.
Also, ignore the failing tests if they pass locally for you. Somehow the CI has issues with Docker signals.
If you use a diffusers LoRA, I think this is already what will happen: the LoRA will download the base model and use it. @pcuenca, that's correct, right?
Hi @lalalune, that makes perfect sense. Would `FBX` be general enough for most use cases? It seems like a good target (single file, pretty standard). It's definitely something we could add, but...
Tagging @mishig25 for his view on the front-end part of such widgets.
@bayartsogt-ya, the test is run; it's the red one. The current problem is that the Docker container takes too long to spawn for the test to pass (it's because the...
I have no idea. Maybe `git blame` can help? It's possibly an old relic from when fairseq hub support lived on a specific branch.