Guess

Results: 142 comments by Guess

Excellent! According to the signature of `aask`, `process_message` should return a `str`:

```python
async def aask(
    self,
    msg: Union[str, list[dict[str, str]]],
    system_msgs: Optional[list[str]] = None,
    format_msgs: Optional[list[dict[str, str]]] = None,
    images: Optional[Union[str, list[str]]] = ...
```
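A minimal sketch of that contract, under the assumption above that whatever handler you write (here a hypothetical `process_message`) must pass through the `str` that `aask` returns; `FakeLLM` is a stand-in for the real client, not MetaGPT's API:

```python
import asyncio

class FakeLLM:
    # Stand-in for the real LLM client; per the signature above, aask returns a str.
    async def aask(self, msg, system_msgs=None, format_msgs=None, images=None) -> str:
        return f"echo: {msg}"

async def process_message(llm, msg: str) -> str:
    rsp = await llm.aask(msg)
    # Enforce the contract: anything other than str is a bug upstream.
    if not isinstance(rsp, str):
        raise TypeError(f"expected str from aask, got {type(rsp).__name__}")
    return rsp

print(asyncio.run(process_message(FakeLLM(), "hi")))  # echo: hi
```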

According to your log:

```text
2024-03-15 22:03:34.325 | INFO | metagpt.utils.cost_manager:update_cost:52 - Total running cost: $0.001 | Max budget: $10.000 | Current cost: $0.001, prompt_tokens: 612, completion_tokens: 21
1 from...
```

#998 Starting from version 0.7, `Action._aask` defaults to stream mode.

```python
async def _aask(self, prompt: str, system_msgs: Optional[list[str]] = None) -> str:
    """Append default prefix"""
    return await self.llm.aask(prompt, system_msgs)
```
...

Starting from 0.7, `Action._aask` has already defaulted to stream mode.

```python
async def _aask(self, prompt: str, system_msgs: Optional[list[str]] = None) -> str:
    """Append default prefix"""
    return await self.llm.aask(prompt, system_msgs)
```

```python
async def aask(
    self,
    msg: str,
    system_msgs: ...
```
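To illustrate what "stream mode" implies here: the response arrives as an async iterator of chunks that are joined into the final `str`. This is a hedged sketch only; `fake_stream`, its chunk values, and `aask_streaming` are hypothetical names, not MetaGPT code:

```python
import asyncio
from typing import AsyncIterator

async def fake_stream() -> AsyncIterator[str]:
    # Pretend each chunk is a token delta from the LLM.
    for chunk in ("Hello", ", ", "world"):
        yield chunk

async def aask_streaming() -> str:
    parts = []
    # `async for` needs an async iterator; raw bytes would fail here.
    async for chunk in fake_stream():
        parts.append(chunk)
    return "".join(parts)

print(asyncio.run(aask_streaming()))  # Hello, world
```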

`anthropic` is only available in the `main` branch. Which version of MetaGPT are you using?

This is the first time I've encountered an issue with Ollama not supporting streams. Issue reports #987 and #966 indicate that Ollama is functioning, but the response is not meeting...

But this error message specifically indicates that it returned raw `bytes` rather than an async iterator:

```
'async for' requires an object with __aiter__ method, got bytes
```
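The error is easy to reproduce in isolation: feeding `bytes` to `async for` raises exactly this `TypeError`, because `bytes` has no `__aiter__` method. The `consume` helper below is illustrative, not MetaGPT code:

```python
import asyncio

async def consume(stream) -> None:
    # Fails at iteration time if `stream` is not an async iterable.
    async for chunk in stream:
        print(chunk)

try:
    asyncio.run(consume(b"raw response"))
except TypeError as e:
    print(e)  # 'async for' requires an object with __aiter__ method, got bytes
```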

This issue is similar to #966; both are caused by the LLM not returning in the required format. MetaGPT expects:

```
2024-03-12 00:07:13.249 | ... parse json from content inside [CONTENT][/CONTENT]
```
...
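For context on what that log line refers to, here is a hedged sketch of the kind of parsing involved: extract a JSON payload delimited by `[CONTENT]` and `[/CONTENT]` markers in the LLM output. The helper name `parse_content` and the regex are illustrative assumptions, not MetaGPT's actual implementation:

```python
import json
import re

def parse_content(text: str) -> dict:
    # Grab everything between the first [CONTENT] and [/CONTENT] pair.
    m = re.search(r"\[CONTENT\](.*?)\[/CONTENT\]", text, re.DOTALL)
    if m is None:
        raise ValueError("no [CONTENT]...[/CONTENT] block in LLM output")
    return json.loads(m.group(1))

print(parse_content('[CONTENT]{"task": "done"}[/CONTENT]'))  # {'task': 'done'}
```

If the model replies in plain prose instead of this delimited JSON, parsing fails, which matches the symptom described in #966.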

Recommend using Python 3.9 or 3.10; on newer Python versions, some pip packages may not yet be supported.