dustin fletcher

11 comments

I just had this thought, too. I use christmas-community for managing my own list now, but for family members who have Amazon wishlists, it would be great to be able...

There's also:

```
mkp224o -Y file.yaml hostnameldkfjalgbnghweognvksjdbweoiugbewoiewglkeniwehiok.onion
```

(syntax found via `mkp224o --help`)

It seems that the provided example in `README.md` is incorrect: `topic:` should be under another `data:` block. This works for me:

```yaml
service: notify.ntfy
data:
  title: Battery Alert
  message: phone...
```
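For reference, here is a sketch of what the full corrected call might look like, assuming the nested `data:` block the comment describes; the message text and the topic name `alerts` are placeholders, since the original comment is truncated:

```yaml
service: notify.ntfy
data:
  title: Battery Alert
  message: Phone battery is low  # placeholder message
  data:
    topic: alerts  # placeholder; the point is that topic: nests under a second data: block
```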

I've also recently encountered this issue of style options (`--style=header-filename` in my case) not being applied when piping to another program. While asking an LLM for workarounds, it informed me...
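The comment is truncated, but one workaround I'm aware of (not necessarily the one the LLM suggested) is to force bat's decorations and colors back on, since bat drops them by default when stdout is not a terminal; the filename here is a placeholder:

```sh
# bat detects the pipe and falls back to plain output; override that detection
bat --style=header-filename --decorations=always --color=always src/main.rs | less -R
```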

What's the relevance to LLM *interpretability*? LLM interpretability is about trying to analyze and understand what's going on inside a model, so it's not such an inscrutable 'black box'.

Using cloud models in ollama works practically the same as using local models in ollama; the only difference is that for cloud models the ollama server is relaying...
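For illustration, a minimal sketch of how similar the two look from the client side; both model names are hypothetical examples:

```sh
# Local model: inference runs on your own hardware
ollama run llama3.2 "why is the sky blue?"

# Cloud model: same command and same API; the local ollama server
# relays the request to ollama.com's hosted hardware
ollama run gpt-oss:120b-cloud "why is the sky blue?"
```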

You may need to first use `ollama run ...` or `ollama pull ...` with your desired cloud model + tag so that it shows up in your list of ollama...
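A sketch of that step, using a hypothetical cloud model tag:

```sh
ollama pull gpt-oss:120b-cloud   # register the cloud model locally (hypothetical tag)
ollama list                      # the cloud model should now appear here
```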

ollama.com also now (somewhat confusingly) offers a cloud API, where you can use `https://ollama.com` as the ollama server URL with an API key from https://ollama.com/settings/keys. This already works with...
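A sketch of calling that cloud API directly, assuming the standard ollama chat endpoint and a key exported as `OLLAMA_API_KEY`; the model name is a placeholder:

```sh
curl https://ollama.com/api/chat \
  -H "Authorization: Bearer $OLLAMA_API_KEY" \
  -d '{
    "model": "gpt-oss:120b-cloud",
    "messages": [{"role": "user", "content": "hello"}],
    "stream": false
  }'
```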

I would love this feature too. I have some good 3-hour podcast episodes I like to refer back to as an educational resource, and being able to save timestamps...

Yes, it appears possible to add models with their own API base URL through the openai plugin, via an `extra-openai-models.yaml` file in `dirname "$(llm logs path)"`:

https://llm.datasette.io/en/stable/other-models.html#openai-compatible-models
https://llm.datasette.io/en/stable/openai-models.html#openai-extra-models
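Per those docs, a minimal `extra-openai-models.yaml` sketch; the `model_id`, `model_name`, and `api_base` values here are placeholders for whatever OpenAI-compatible server you're pointing at:

```yaml
- model_id: my-remote-model          # name you'll pass to `llm -m`
  model_name: gpt-oss:120b           # model name the remote server expects
  api_base: "https://ollama.com/v1"  # any OpenAI-compatible endpoint
  api_key_name: ollama               # key stored via `llm keys set ollama`
```

You'd then invoke it with something like `llm -m my-remote-model "hello"`.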