Daniel
**Describe the bug**
Bug reported by Sabin_Stargem from r/localllama.
- [ ] Reproduce bug
- [ ] Follow up in [r/localllama thread](https://www.reddit.com/r/LocalLLaMA/comments/18pyul4/comment/kezkxpv/?utm_source=share&utm_medium=web2x&context=3)

**Environment details**
- Operating System: Windows 11
-...
## Motivation
To address user feedback and enhance the Hub experience by improving model recommendations, refining filtering options, and providing clearer model version visibility.

## Specs
Major user stories:
1....
## Resources
- https://www.reddit.com/r/LocalLLaMA/comments/18fr8u3/how_to_implement_function_calling_based_on_a/
- https://www.reddit.com/r/MachineLearning/comments/18hp7u3/p_implementing_function_calling_with/
- https://www.reddit.com/r/LocalLLaMA/comments/17ugn8i/guidance_for_selecting_a_functioncalling_library/
- https://www.reddit.com/r/LocalLLaMA/comments/180galp/whats_the_prompt_you_guys_use_for_function/
- https://www.reddit.com/r/LocalLLaMA/comments/18hjbku/optimising_function_calling_using_autogen_with/
- https://www.reddit.com/r/LocalLLaMA/comments/18i9hj4/how_do_i_get_local_llms_to_return_keywords/
- Guidance from Microsoft
- LMQL
- Grammar
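A recurring pattern in the threads above is prompt-based function calling: describe the available tools in the prompt, ask the model to answer in JSON, then parse and dispatch. A minimal sketch of the parsing half, assuming the model is prompted to emit `{"name": ..., "arguments": {...}}` (the `ToolCall` shape and `parseToolCall` name are illustrative, not any library's API):

```typescript
// Hypothetical shape for a tool invocation the model is asked to emit.
interface ToolCall {
  name: string;
  arguments: Record<string, unknown>;
}

// Parse the model's raw output into a ToolCall, or null if the model
// produced free-form text instead of the requested JSON.
export function parseToolCall(modelOutput: string): ToolCall | null {
  try {
    const parsed = JSON.parse(modelOutput);
    if (
      typeof parsed.name === "string" &&
      typeof parsed.arguments === "object" &&
      parsed.arguments !== null
    ) {
      return parsed as ToolCall;
    }
  } catch {
    // Model did not return valid JSON; caller can retry or fall back.
  }
  return null;
}
```

Constrained decoding (Guidance, LMQL, GBNF grammars) would make the JSON guarantee hold at generation time rather than relying on parse-and-retry.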
/WIP Spec
- Quick Switcher allows opening of Assistants and Models
- How do we do full-text search on a local fs? (or is this mainly just quick switcher for...
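On the full-text-search question: the naive baseline is to walk the local data directory and substring-match file contents. A minimal sketch, assuming plain-text files under a single root (the `searchFiles` name and `maxResults` parameter are illustrative; a real Quick Switcher would want an index, not a per-keystroke scan):

```typescript
import * as fs from "fs";
import * as path from "path";

// Naive full-text search: recursively walk `root` and return paths of
// files whose contents contain `query` (case-insensitive), up to
// `maxResults` hits. No index, no symlink-loop handling -- a sketch only.
export function searchFiles(
  root: string,
  query: string,
  maxResults = 10
): string[] {
  const hits: string[] = [];
  const needle = query.toLowerCase();
  const walk = (dir: string): void => {
    for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
      if (hits.length >= maxResults) return;
      const full = path.join(dir, entry.name);
      if (entry.isDirectory()) {
        walk(full);
      } else {
        const text = fs.readFileSync(full, "utf8").toLowerCase();
        if (text.includes(needle)) hits.push(full);
      }
    }
  };
  walk(root);
  return hits;
}
```

For anything beyond a small data folder, an inverted index (or SQLite FTS) would be the usual answer.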
## Questions - [ ] Should Jan be compatible with OpenAI plugins? (or tools?)
Where do we persist Settings parameters?
- Look at VSCode and Obsidian; should we persist in a .jan?
- https://code.visualstudio.com/docs/getstarted/settings
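If the answer is a VSCode-style `settings.json` under a user-level `.jan` directory, the persistence layer is small. A sketch under that assumption (the `.jan` location, file name, and function names are illustrative, not a decided spec):

```typescript
import * as fs from "fs";
import * as path from "path";
import * as os from "os";

// Assumed default location: ~/.jan/settings.json, analogous to
// VSCode's user-level settings.json.
const defaultDir = path.join(os.homedir(), ".jan");

// Write the whole settings object as pretty-printed JSON,
// creating the directory on first use.
export function saveSettings(
  settings: Record<string, unknown>,
  dir: string = defaultDir
): void {
  fs.mkdirSync(dir, { recursive: true });
  fs.writeFileSync(
    path.join(dir, "settings.json"),
    JSON.stringify(settings, null, 2)
  );
}

// Read settings back, returning an empty object if none exist yet.
export function loadSettings(dir: string = defaultDir): Record<string, unknown> {
  const file = path.join(dir, "settings.json");
  if (!fs.existsSync(file)) return {};
  return JSON.parse(fs.readFileSync(file, "utf8"));
}
```

Keeping it as plain JSON on disk has the same upside VSCode and Obsidian get: users can inspect, diff, and sync the file themselves.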
WIP Spec
Similar to https://docs.obsidian.md/Plugins/User+interface/Status+bar
Note: This is a Product discussion, not confirmed as an issue yet.

## Objective
- Threads view should educate the user on Jan as an Assistant, but also show the...
## Linked Milestone
https://github.com/janhq/jan/milestone/21

## Objective
- Jan's architecture will default to Nitro, but have flexibility to incorporate other Model Backends / Inference Engines
- We see very fast movement...
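The "default to Nitro, stay pluggable" objective usually reduces to a common interface plus a registry with a default. A sketch of that shape (the `InferenceEngine` interface and `EngineRegistry` names are illustrative, not Jan's actual API):

```typescript
// Hypothetical common surface every backend (Nitro, llama.cpp,
// a remote OpenAI-compatible server, ...) would implement.
interface InferenceEngine {
  name: string;
  loadModel(modelPath: string): Promise<void>;
  complete(prompt: string): Promise<string>;
}

// Registry that resolves engines by name, falling back to the
// configured default ("nitro" in Jan's case).
class EngineRegistry {
  private engines = new Map<string, InferenceEngine>();

  constructor(private defaultName: string) {}

  register(engine: InferenceEngine): void {
    this.engines.set(engine.name, engine);
  }

  get(name?: string): InferenceEngine {
    const engine = this.engines.get(name ?? this.defaultName);
    if (!engine) throw new Error(`Unknown engine: ${name ?? this.defaultName}`);
    return engine;
  }
}
```

New backends then become additive registrations rather than changes to the call sites, which is what makes "very fast movement" in the inference-engine space tolerable.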
**Describe the bug**
Reported by Sabin_Stargem from r/localllama.
- [ ] Reproduce issue
- [ ] Follow up on [r/localllama thread](https://www.reddit.com/r/LocalLLaMA/comments/18pyul4/comment/kezkxpv/?utm_source=share&utm_medium=web2x&context=3)

**Additional context**
Add any other context or information...