
LOCAL model can't read Long Notes

Open · fabiensc0ville opened this issue 1 year ago · 3 comments

Describe how to reproduce: Both Chat and Long Note QA work great with local Ollama. However, when I try the local model on a long note, it ignores the note and falls back to the default prompt about Obsidian and AI.

Expected behavior Answer questions about the active note

Screenshots

[Two screenshots: Chat mode and Long Note QA]

Additional context: I believe the difference is the note length, because that is the obvious one (it has 1,000,000 characters and a 1.2 MB file size!), but I am not sure.

fabiensc0ville avatar Jul 12 '24 17:07 fabiensc0ville

Consumer-grade graphics cards cannot hold the full context window of a local model in VRAM.

kiradzS avatar Jul 30 '24 11:07 kiradzS
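(Editor's note: the point about context limits can be sanity-checked with rough arithmetic. The snippet below is a sketch, assuming the common ~4-characters-per-token heuristic for English text; real tokenizers vary, and the model names and window sizes are illustrative, not taken from the thread.)

```python
# Rough estimate of whether a note fits in a model's context window,
# using the ~4 chars/token heuristic for English text.

def estimated_tokens(num_chars: int, chars_per_token: float = 4.0) -> int:
    """Approximate token count for a note of num_chars characters."""
    return int(num_chars / chars_per_token)

def fits_in_context(num_chars: int, context_window_tokens: int) -> bool:
    """True if the estimated token count fits in the context window."""
    return estimated_tokens(num_chars) <= context_window_tokens

note_chars = 1_000_000  # the note size reported in this issue
for ctx in (2048, 8192, 32_768, 128_000):
    print(f"{ctx:>7}-token window: fits = {fits_in_context(note_chars, ctx)}")
# A 1,000,000-character note is ~250,000 tokens; even a 128k-token
# window holds only ~512,000 characters, so the full note cannot fit
# without chunking or retrieval.
```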

@fabiensc0ville have you checked the doc and set the context window explicitly in Ollama? https://github.com/logancyang/obsidian-copilot/blob/master/local_copilot.md#ollama

Also, you didn't mention which model you were using; what is its context window?

logancyang avatar Aug 09 '24 19:08 logancyang
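(Editor's note: the linked doc's suggestion to set the context window explicitly can be sketched with an Ollama Modelfile. This is a hedged example, not taken verbatim from the doc: `llama3` and `8192` are assumptions; substitute your own model and a `num_ctx` your GPU can actually handle.)

```shell
# Write a Modelfile that raises the context window for a local model.
cat > Modelfile <<'EOF'
FROM llama3
PARAMETER num_ctx 8192
EOF

# Then build and use the larger-context variant (requires Ollama installed):
#   ollama create llama3-8k -f Modelfile
# and select "llama3-8k" as the Ollama model in Copilot's settings.
cat Modelfile
```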

> @fabiensc0ville have you checked the doc and set the context window explicitly in Ollama? https://github.com/logancyang/obsidian-copilot/blob/master/local_copilot.md#ollama
>
> Also you didn't mention which model you were using, how long is its context window?

I have tested it: referencing the file with [[file]] does send its content to the LLM, but using the button to send the file content does not work.

wwjCMP avatar Aug 10 '24 03:08 wwjCMP