Hi @fredrikekre, hope all is well! Is there any chance we could reach a conclusion on this PR? I responded to your comments in August but haven't heard back.
> I was waiting for [#200 (comment)](https://github.com/fredrikekre/Literate.jl/pull/200#discussion_r1307628784)

Thank you, I hadn't realized that my explanation wasn't sufficient. I'll follow your suggestion and update it within the next few days.
There is no official support yet, but you can easily build it yourself with the following guide: https://svilupp.github.io/PromptingTools.jl/dev/how_it_works#Walkthrough-Example-for-aiextract It works well with Mixtral and similar models.
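The core idea of the linked walkthrough can be sketched in a language-agnostic way: prompt the model to reply with JSON in a target shape, then parse and validate the reply locally. Everything below is illustrative; `fake_llm` is a stand-in for a real call to a local model such as Mixtral, not part of any library.

```python
import json

def fake_llm(prompt: str) -> str:
    # Stand-in for a real request to a local model server;
    # it returns a canned JSON reply for illustration.
    return '{"name": "Julia", "year": 2012}'

# Ask for JSON in an explicit shape so the reply is machine-parseable.
prompt = (
    "Extract the language name and release year. "
    'Reply ONLY with JSON like {"name": string, "year": number}.'
)

reply = fake_llm(prompt)
data = json.loads(reply)

# Validate the structure before using it downstream.
assert {"name", "year"} <= data.keys()
print(data)
```

In practice the validation step matters most with open-source models, since they occasionally wrap the JSON in prose and the caller needs to detect and retry those cases.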
Great! There are different API endpoints in Ollama:

- `generate`: basically text-completion-like; you provide two fields, `system` and `prompt`; no multi-turn conversation (one reply only)
- `api/chat`: ...
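The difference between the two endpoints shows up in the request payload. The sketch below constructs both shapes; field names follow the public Ollama REST API, while the model name and prompt text are placeholders.

```python
import json

# /api/generate: single-turn completion with separate system and prompt fields.
generate_payload = {
    "model": "mixtral",
    "system": "You are a helpful assistant.",
    "prompt": "Summarize the Julia language in one sentence.",
    "stream": False,
}

# /api/chat: multi-turn conversation expressed as role-tagged messages.
chat_payload = {
    "model": "mixtral",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the Julia language in one sentence."},
    ],
    "stream": False,
}

print(json.dumps(chat_payload, indent=2))
```

With `api/chat`, continuing the conversation is just a matter of appending the assistant's reply and the next user message to `messages`, which is not possible with `generate`.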
Just flagging that it might be easier to tackle this after we can provide "JSON type" representation to the open-source models (not JSON schema) - [reference](https://www.boundaryml.com/blog/type-definition-prompting-baml). See https://github.com/svilupp/PromptingTools.jl/issues/143
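To make the referenced idea concrete: instead of sending a verbose JSON Schema, "type definition prompting" renders a compact TypeScript-like type for the model to fill in. The sketch below is a toy illustration of that rendering step; the function name and field format are made up for this example, not from any library.

```python
def render_type_prompt(name: str, fields: dict[str, str]) -> str:
    """Render a compact TypeScript-like interface from field -> type pairs."""
    lines = [f"interface {name} {{"]
    for field, ftype in fields.items():
        lines.append(f"  {field}: {ftype}")
    lines.append("}")
    return "\n".join(lines)

# The resulting string is pasted into the prompt in place of a JSON Schema.
prompt_type = render_type_prompt(
    "Recipe",
    {"title": "string", "minutes": "number", "vegetarian": "boolean"},
)
print(prompt_type)
```

The appeal for open-source models is token economy: the interface above is a fraction of the length of the equivalent JSON Schema, yet still tells the model exactly which keys and value types to emit.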
Hi @totpero, would you mind running awesome-lint and checking the errors it reports? When I ran it, I got:

> ✖ 1:1 Missing or invalid Table of Contents...
I've read through the guidelines several times and I believe I satisfy all the criteria. The only thing I skipped was the silly unicorn message, as I thought it was a joke.
unicorn
Can I please bump this request?
> The list title is a bit long. I think it should be `Generative AI in Julia`.

Thank you. That doesn't capture it well, because we want to add...