[BUG] AttributeError: 'coroutine' object has no attribute 'get'

Open salvogs opened this issue 9 months ago • 15 comments

My setup consists of local-cat with Qdrant and Ollama.

After updating from 1.5.1 to 1.6.1 I have this problem:

[Screenshot: 2024-05-06 13-17-36]

Traceback:

Traceback (most recent call last):
  File "/app/cat/looking_glass/stray_cat.py", line 403, in run
    cat_message = self.loop.run_until_complete(
  File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
  File "/app/cat/looking_glass/stray_cat.py", line 350, in __call__
    raise e
  File "/app/cat/looking_glass/stray_cat.py", line 340, in __call__
    cat_message = await self.agent_manager.execute_agent(self)
  File "/app/cat/looking_glass/agent_manager.py", line 236, in execute_agent
    memory_chain_output = await self.execute_memory_chain(agent_input, prompt_prefix, prompt_suffix, stray)
  File "/app/cat/looking_glass/agent_manager.py", line 170, in execute_memory_chain
    return await memory_chain.ainvoke({**agent_input, "stop":"Human:"}, config=RunnableConfig(callbacks=[NewTokenHandler(stray)]))
  File "/usr/local/lib/python3.10/site-packages/langchain/chains/base.py", line 212, in ainvoke
    raise e
  File "/usr/local/lib/python3.10/site-packages/langchain/chains/base.py", line 203, in ainvoke
    await self._acall(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/site-packages/langchain/chains/llm.py", line 275, in _acall
    response = await self.agenerate([inputs], run_manager=run_manager)
  File "/usr/local/lib/python3.10/site-packages/langchain/chains/llm.py", line 142, in agenerate
    return await self.llm.agenerate_prompt(
  File "/usr/local/lib/python3.10/site-packages/langchain_core/language_models/llms.py", line 643, in agenerate_prompt
    return await self.agenerate(
  File "/usr/local/lib/python3.10/site-packages/langchain_core/language_models/llms.py", line 1018, in agenerate
    output = await self._agenerate_helper(
  File "/usr/local/lib/python3.10/site-packages/langchain_core/language_models/llms.py", line 882, in _agenerate_helper
    raise e
  File "/usr/local/lib/python3.10/site-packages/langchain_core/language_models/llms.py", line 866, in _agenerate_helper
    await self._agenerate(
  File "/usr/local/lib/python3.10/site-packages/langchain_community/llms/ollama.py", line 444, in _agenerate
    final_chunk = await super()._astream_with_aggregation(
  File "/usr/local/lib/python3.10/site-packages/langchain_community/llms/ollama.py", line 343, in _astream_with_aggregation
    async for stream_resp in self._acreate_generate_stream(prompt, stop, **kwargs):
  File "/usr/local/lib/python3.10/site-packages/langchain_community/llms/ollama.py", line 174, in _acreate_generate_stream
    async for item in self._acreate_stream(
  File "/app/cat/factory/ollama_utils.py", line 121, in _acreate_stream_patch
    optional_detail = await response.json().get("error")
AttributeError: 'coroutine' object has no attribute 'get'

salvogs avatar May 06 '24 11:05 salvogs

I can confirm a possible bug with Ollama 1.33 and Qdrant 1.9.1; same problem here.

bositalia avatar May 06 '24 13:05 bositalia

#783 will resolve most problems with Ollama; please wait a bit.

valentimarco avatar May 07 '24 15:05 valentimarco

First of all, really great job; cheshire-cat is great.

Then...

Please take my observations with some caution, since I am fairly new to this kind of environment. Maybe I am writing a lot of rubbish...

It seems there are two problems here:

  1. The first is in the cheshire-cat core application, in the code file:

[prjroot_core]/cat/factory/ollama_utils.py

where, at line 122, the exception handling calls a get method directly on the coroutine returned by response.json(), i.e., on something that is not there yet:

            optional_detail = await response.json().get("error")

so the exception (I am inside the Docker environment created through your compose file) is:

cheshire_cat_core           |   File "/app/cat/factory/ollama_utils.py", line 121, in _acreate_stream_patch
cheshire_cat_core           |     optional_detail = await response.json().get("error")
cheshire_cat_core           | AttributeError: 'coroutine' object has no attribute 'get'

My personal fix here, inspired by this Stack Overflow post: 'coroutine' object has no attribute get || pyppeteer (a runnable repro sketch follows after this list):

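                    # await the coroutine first, then call .get() on the dict it returns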
                    asy_optional_detail = await response.json()
                    optional_detail = asy_optional_detail.get("error")
  2. Apparently, with my fix, the exception handler does catch the exception, which in the end shows up on the web UI as:

Ollama call failed with status code 500. Details: option "stop" must be of type array

Here is an extract of the log inside the container:

cheshire_cat_core           |   File "/app/cat/factory/ollama_utils.py", line 123, in _acreate_stream_patch
cheshire_cat_core           |     raise ValueError(
cheshire_cat_core           | ValueError: Ollama call failed with status code 500. Details: option "stop" must be of type array

This is the Ollama exception, and on this one I am still lost. I googled a little, but searching for the exact exception turned up nothing; I'll keep looking. However, still reading the log, there are these two lines more or less immediately above the handler:

cheshire_cat_core           |   File "/usr/local/lib/python3.10/site-packages/langchain_community/llms/ollama.py", line 343, in _astream_with_aggregation
cheshire_cat_core           |     async for stream_resp in self._acreate_generate_stream(prompt, stop, **kwargs):

which might be interesting for solving this...
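To make the failure mode concrete, here is a minimal, self-contained repro of the first problem (the FakeResponse class is hypothetical; it only mimics the awaitable .json() of the real HTTP response object):

import asyncio

class FakeResponse:
    # Hypothetical stand-in: like the real client response, .json() is a coroutine.
    async def json(self):
        return {"error": 'option "stop" must be of type array'}

async def main():
    response = FakeResponse()

    # Buggy order: .get("error") is looked up on the coroutine object itself,
    # before any await runs, so Python raises immediately (CPython also warns
    # that the coroutine was never awaited):
    # AttributeError: 'coroutine' object has no attribute 'get'
    try:
        await response.json().get("error")
    except AttributeError as e:
        print(e)

    # Fixed order (the two-step fix above): await first, then .get() on the dict.
    body = await response.json()
    print(body.get("error"))

asyncio.run(main())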

Thank you again for this framework, really...

enrichicco avatar May 11 '24 08:05 enrichicco

OK... I am back... I was able to obtain at least one answer.

It seems the problem is in this file (inside the Docker backend container):

/app/cat/looking_glass/agent_manager.py

I changed line 170 from

return await memory_chain.ainvoke({**agent_input, "stop": "Human:"}, config=RunnableConfig(callbacks=[NewTokenHandler(stray)]))

to

return await memory_chain.ainvoke({**agent_input, "stop":["Human:"]}, config=RunnableConfig(callbacks=[NewTokenHandler(stray)]))

and the call apparently worked... again, take this with caution: barely tested (just one shot), by a beginner on both sides (the Python API and the Ollama API interface).
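For context on why the type matters: Ollama's /api/generate endpoint accepts the stop option only as an array of strings. A hedged sketch of the underlying REST call (the model name and URL are assumptions for illustration):

import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",   # assumed model name
        "prompt": "Hello",
        "stream": False,
        # "options": {"stop": "Human:"} would be rejected by Ollama with
        # HTTP 500: 'option "stop" must be of type array'
        "options": {"stop": ["Human:"]},   # the array form is accepted
    },
)
body = resp.json()
print(resp.status_code, body.get("response") or body.get("error"))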

enrichicco avatar May 11 '24 08:05 enrichicco

@enrichicco thank you for the effort you are putting in! We already resolved the problem with the merge of #783 and #813 into the develop branch. To summarize the problem:

  1. LangChain patched all the problems with Ollama (before that, we used ollama_utils.py to patch some issues)
  2. Ollama updated the body of its APIs (this is why we get the error)

With the new changes, we also resolved the tool-selection issue that occurred when a local model tries to select a tool (now even Phi-3 can launch tools).

valentimarco avatar May 11 '24 10:05 valentimarco

Appreciate it, @valentimarco! Any idea when we can expect this to be part of the main branch?

TobioDev avatar May 14 '24 13:05 TobioDev

Appreciate it, @valentimarco! Any idea when we can expect this to be part of the main branch?

Not soon; it is already in the develop branch but needs some testing. If you want, you can try it and give feedback.

valentimarco avatar May 14 '24 14:05 valentimarco

Hi @valentimarco,

I checked the commits and the develop branch, and it seems to me that the coroutine object error is still there, i.e., this one:

         optional_detail = await response.json().get("error")

In my opinion, this should be something like this:

                    asy_optional_detail = await response.json()
                    optional_detail = asy_optional_detail.get("error")

It is not so bad, but for response codes other than 200 and 404 it masks the (manually) raised ValueError (let's say, the real request exception) with the 'coroutine' object has no attribute 'get' error.
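A sketch of how the handler could be rearranged so the real error surfaces (a hypothetical helper; the actual patched code in langchain_community or the core may differ, and an aiohttp-style response is assumed):

import aiohttp

async def raise_for_ollama_error(response: aiohttp.ClientResponse) -> None:
    # Surface Ollama's error detail instead of masking it with an AttributeError.
    if response.status != 200:
        body = await response.json()          # await the coroutine first...
        optional_detail = body.get("error")   # ...then .get() on the resulting dict
        raise ValueError(
            f"Ollama call failed with status code {response.status}. "
            f"Details: {optional_detail}"
        )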

enrichicco avatar May 15 '24 16:05 enrichicco

Hi @valentimarco,

I checked the commits and the develop branch, and it seems to me that the coroutine object error is still there, i.e., this one:

         optional_detail = await response.json().get("error")

In my opinion, this should be something like this:

                    asy_optional_detail = await response.json()
                    optional_detail = asy_optional_detail.get("error")

It is not so bad, but for response codes other than 200 and 404 it masks the (manually) raised ValueError (let's say, the real request exception) with the 'coroutine' object has no attribute 'get' error.

LangChain changed the code in those parts; in fact, I deleted ollama_utils.py, and in my tests with Ollama 1.33 and 1.34 I don't have any problem. If you want, we can look at it on Discord; maybe I am forgetting something.

valentimarco avatar May 15 '24 16:05 valentimarco

OK... I am back... I was able to obtain at least one answer.

It seems the problem is in this file (inside the Docker backend container):

/app/cat/looking_glass/agent_manager.py

I changed line 170 from

return await memory_chain.ainvoke({**agent_input, "stop": "Human:"}, config=RunnableConfig(callbacks=[NewTokenHandler(stray)]))

to

return await memory_chain.ainvoke({**agent_input, "stop":["Human:"]}, config=RunnableConfig(callbacks=[NewTokenHandler(stray)]))

and the call apparently worked... again, take this with caution: barely tested (just one shot), by a beginner on both sides (the Python API and the Ollama API interface).

I solved it like this too.

mecodj avatar May 19 '24 18:05 mecodj

@Pingdred do you know if this fix also needs to be applied to the new chain?

valentimarco avatar May 19 '24 20:05 valentimarco

@Pingdred do you know if this fix also needs to be applied to the new chain?

Not for now, because we no longer use the stop sequence (develop branch); however, we still need to test and see whether it is necessary to keep it.

Pingdred avatar May 19 '24 20:05 Pingdred

Appreciate it, @valentimarco! Any idea when we can expect this to be part of the main branch?

Not soon; it is already in the develop branch but needs some testing. If you want, you can try it and give feedback.

I'm gonna apologize in advance for a dumb question: how do we switch to the develop branch? I followed the instructions here to install, but I can't figure out what the "image" line needs to be in order to use the develop branch (if that's even what you need to do lol)

If it isn't obvious, I am really new to docker and containerization. Thank you in advance!

greenmojo2 avatar May 24 '24 16:05 greenmojo2

Appreciate it, @valentimarco! Any idea when we can expect this to be part of the main branch?

Not soon; it is already in the develop branch but needs some testing. If you want, you can try it and give feedback.

I'm gonna apologize in advance for a dumb question: how do we switch to the develop branch? I followed the instructions here to install, but I can't figure out what the "image" line needs to be in order to use the develop branch (if that's even what you need to do lol)

If it isn't obvious, I am really new to docker and containerization. Thank you in advance!

  1. git clone https://github.com/cheshire-cat-ai/core.git
  2. git switch develop
  3. docker compose build
  4. docker compose up

valentimarco avatar May 24 '24 16:05 valentimarco

Appreciate it, @valentimarco! Any idea when we can expect this to be part of the main branch?

Not soon; it is already in the develop branch but needs some testing. If you want, you can try it and give feedback.

I'm gonna apologize in advance for a dumb question: how do we switch to the develop branch? I followed the instructions here to install, but I can't figure out what the "image" line needs to be in order to use the develop branch (if that's even what you need to do lol) If it isn't obvious, I am really new to docker and containerization. Thank you in advance!

  1. git clone https://github.com/cheshire-cat-ai/core.git
  2. git switch develop
  3. docker compose build
  4. docker compose up

Thank you so much!!! I cannot even begin to describe how excited I am to start experimenting with this!!!

greenmojo2 avatar May 25 '24 17:05 greenmojo2

@pieroit should this be closed as fixed?

matteocacciola avatar Oct 19 '24 09:10 matteocacciola