
Error Ollama + Langchain + Google Colab + ngrok

Open SerhiyProtsenko opened this issue 1 year ago • 2 comments

When I use the combination Ollama + Langchain + Google Colab + ngrok, I get an error. (The models are downloaded; I can see them in ollama list.)

from langchain.llms import Ollama  # import path as of late 2023

llm = Ollama(
    model="run deepseek-coder:6.7b", base_url="https://e12b-35-231-226-171.ngrok.io/")
response = llm.predict('What do you know about Falco?')

---------------------------------------------------------------------------
JSONDecodeError                           Traceback (most recent call last)
File ~/miniconda3/envs/llm/lib/python3.11/site-packages/requests/models.py:971, in Response.json(self, **kwargs)
    970 try:
--> 971     return complexjson.loads(self.text, **kwargs)
    972 except JSONDecodeError as e:
    973     # Catch JSON-related errors and raise as requests.JSONDecodeError
    974     # This aliases json.JSONDecodeError and simplejson.JSONDecodeError

File ~/miniconda3/envs/llm/lib/python3.11/site-packages/simplejson/__init__.py:514, in loads(s, encoding, cls, object_hook, parse_float, parse_int, parse_constant, object_pairs_hook, use_decimal, allow_nan, **kw)
    510 if (cls is None and encoding is None and object_hook is None and
    511         parse_int is None and parse_float is None and
    512         parse_constant is None and object_pairs_hook is None
    513         and not use_decimal and not allow_nan and not kw):
--> 514     return _default_decoder.decode(s)
    515 if cls is None:

File ~/miniconda3/envs/llm/lib/python3.11/site-packages/simplejson/decoder.py:389, in JSONDecoder.decode(self, s, _w, _PY3)
    388 if end != len(s):
--> 389     raise JSONDecodeError("Extra data", s, end, len(s))
    390 return obj

JSONDecodeError: Extra data: line 1 column 5 - line 1 column 19 (char 4 - 18)

During handling of the above exception, another exception occurred:
...
    973     # Catch JSON-related errors and raise as requests.JSONDecodeError
    974     # This aliases json.JSONDecodeError and simplejson.JSONDecodeError
--> 975     raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)

JSONDecodeError: Extra data: line 1 column 5 (char 4)
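For context (not stated in the thread): "Extra data" is what a JSON decoder raises when the body contains more than one JSON document. Ollama's /api/generate endpoint streams newline-delimited JSON by default, so any client that reads the whole stream and parses it as a single object fails in exactly this way. A minimal sketch, using an illustrative body rather than a captured response:

```python
import json

# Illustrative NDJSON body in the shape Ollama's /api/generate streams:
# one JSON object per line, ending with a {"done": true} record.
body = '{"response": "Fal"}\n{"response": "co"}\n{"done": true}'

# Parsing the whole body at once reproduces the class of error in the
# traceback: the decoder finishes the first object, then sees "Extra data".
try:
    json.loads(body)
except json.JSONDecodeError as e:
    print(e)

# Parsing line by line succeeds.
chunks = [json.loads(line) for line in body.splitlines() if line.strip()]
text = "".join(c.get("response", "") for c in chunks)
print(text)  # Falco
```

The same failure can also come from a proxy or tunnel injecting non-JSON text into the body, which is why the error only appearing behind ngrok is a useful clue.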

If I run Ollama + Google Colab + ngrok from the terminal, everything works with Google Colab and ngrok. Also, if I change the Python script to a local base_url:

llm = Ollama(
    model="run deepseek-coder:6.7b", base_url="http://localhost:11434")
response = llm.predict('What do you know about Falco?')

everything works (Ollama + Langchain).

Only the combination Ollama + Langchain + Google Colab + ngrok does not work.
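One cause that commonly produces non-JSON bodies in this exact setup (an assumption here, not confirmed in the thread) is the Host header: Ollama only accepts requests addressed to localhost, so a tunnel that forwards the ngrok hostname unchanged can get an error page back instead of JSON. The commonly suggested fix is to rewrite the header when starting the tunnel:

```shell
# Forward port 11434 but present the Host header Ollama expects
ngrok http 11434 --host-header="localhost:11434"
```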

SerhiyProtsenko avatar Dec 15 '23 16:12 SerhiyProtsenko

If I run from the terminal Ollama + Google Colab + ngrok, everything works with google colab and ngrok.

This suggests there's some issue with langchain when using the ngrok host. Perhaps you can follow up in the langchain repo?
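One way to narrow it down (a sketch using only the standard library; the helper name and defaults are mine, not from the thread) is to hit the Ollama API directly, bypassing LangChain. If the raw body coming back through the ngrok URL is HTML or plain text rather than JSON, the tunnel, not LangChain, is at fault:

```python
import json
from urllib.request import Request, urlopen

def probe_ollama(base_url, model="deepseek-coder:6.7b"):
    """POST to /api/generate directly and report what actually comes back,
    to tell an Ollama/tunnel problem apart from a LangChain parsing problem."""
    payload = json.dumps({"model": model, "prompt": "ping", "stream": False})
    req = Request(
        base_url.rstrip("/") + "/api/generate",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urlopen(req, timeout=120) as resp:
        body = resp.read().decode("utf-8", errors="replace")
    # An ngrok warning page or proxy error shows up here as HTML or
    # plain text instead of a JSON object.
    return {"status": resp.status, "body_preview": body[:200]}
```

Pointing this first at http://localhost:11434 and then at the ngrok URL, with the same model name, shows which hop changes the response.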

mxyng avatar Dec 15 '23 17:12 mxyng

@mxyng I asked in https://github.com/langchain-ai/langchain/issues/14810 but nothing has happened and the problem remains

SerhiyProtsenko avatar Dec 22 '23 13:12 SerhiyProtsenko

@SerhiyProtsenko are you still running into this with the latest version of Ollama, after updating the model via an ollama pull? Just wanted to make sure it's not already fixed on either end.

(Closing for now, but let me know if you are still running into problems.) Sorry about this.

mchiang0610 avatar Mar 11 '24 18:03 mchiang0610