Jesse Luoto
I researched Ollama a bit. If I'm correct, you could run Ollama locally and Humanify could connect to its API to use any model that Ollama serves. There seems...
Thank you for looking into this. I just pushed an `ollama-support` branch that should start working once they support the grammar flag
☝️ added llama3.1 8b model support
@dangelo352 unfortunately there's no Ollama support yet. You can run the model locally using `humanify local`
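For context, local mode is invoked roughly like this (a sketch; the exact command and flags may differ between versions, so check `humanify --help`):

```shell
# Run Humanify with a locally downloaded model instead of a remote API.
# "obfuscated.js" is a placeholder input file name.
npx humanifyjs local obfuscated.js
```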
Since v2.2.0 there's a configurable `--baseURL` parameter in the OpenAI mode. Unfortunately, Ollama does not yet support structured outputs, although I'm sure it's on their roadmap, as the official...
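To illustrate how `--baseURL` makes this possible, here is a minimal sketch of redirecting an OpenAI-style chat request at a local Ollama server. The base URL follows Ollama's conventional OpenAI-compatible endpoint (`http://localhost:11434/v1`); the model name and prompt are placeholders:

```javascript
// Assumed default address of Ollama's OpenAI-compatible API.
const OLLAMA_BASE_URL = "http://localhost:11434/v1";

// Build the request that a custom --baseURL would send to Ollama
// instead of api.openai.com.
function buildChatRequest(baseURL, model, userPrompt) {
  return {
    url: `${baseURL}/chat/completions`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Ollama ignores the key, but OpenAI clients require one to be set.
        Authorization: "Bearer ollama",
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: userPrompt }],
      }),
    },
  };
}

const req = buildChatRequest(OLLAMA_BASE_URL, "llama3.1:8b", "Rename this variable");
// To actually send it (requires a running Ollama server):
// const res = await fetch(req.url, req.options);
```

The missing piece is structured outputs: without them, the server cannot be forced to answer with the exact JSON shape the tool expects.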
> Most notably the keyword `crypto` was generated by your tool and will NOT work as it is a reserved HTML keyword.

Hmm, this is an interesting view; e.g. `crypto`...
I think I'd need to grab a list of all global browser variables to fix this reliably. I didn't find anything sufficient with a quick Google search, so I'm thinking if I...
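The fix idea above could be sketched as rejecting rename suggestions that collide with known browser globals. The list here is a small hand-picked subset for illustration, not the exhaustive list the comment says would be needed:

```javascript
// Hand-picked subset of well-known browser globals; a real fix would need
// a complete list (e.g. extracted from browser type definitions).
const BROWSER_GLOBALS = new Set([
  "window", "document", "crypto", "navigator", "location",
  "history", "localStorage", "fetch", "alert", "name", "top",
]);

// A suggested name is safe if it doesn't shadow a global.
function isSafeRename(candidate) {
  return !BROWSER_GLOBALS.has(candidate);
}

// A colliding suggestion could fall back to a suffixed variant.
function deconflict(candidate) {
  return isSafeRename(candidate) ? candidate : `${candidate}_`;
}
```

This way a model suggestion like `crypto` would become `crypto_` instead of shadowing the global.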
Yes, this warning is a known issue for now. The model should work fine, at least in my testing. I'll check later if I can find a fix for this.
I'll keep this issue open for now so I don't forget to fix it
A great idea! Do you know anything about creating sourcemaps with Babel or similar tools? I think Babel usually creates sourcemaps that start from the original file, and you end...
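For reference, the object Babel produces (via its `sourceMaps: true` option) follows the Source Map v3 format. A minimal sketch of that shape, with placeholder file names and an empty `mappings` string:

```javascript
// Minimal Source Map v3 object. File names are placeholders; in a real map,
// `mappings` holds base64-VLQ-encoded position segments linking generated
// code back to the original.
function makeEmptySourceMap(originalFile, generatedFile) {
  return {
    version: 3,               // source map spec revision
    file: generatedFile,      // name of the generated (renamed) output
    sources: [originalFile],  // files that positions map back to
    names: [],                // identifier names referenced by mappings
    mappings: "",             // empty: no positions recorded yet
  };
}

const map = makeEmptySourceMap("minified.js", "humanified.js");
```

The tricky part hinted at above is that such a map describes original → generated positions, so chaining it with an existing minifier's map would require composing the two.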