hisir

21 comments by hisir

> This command works: `pip install --no-cache-dir paddlepaddle==2.5.2 paddleocr==2.7.0.3`, but why does this issue still seem to exist almost half a year later!? Why isn't it fixed by default!?

It seems to be because newer releases use AVX-512 acceleration; if your CPU supports AVX-512, it should be fine.
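For reference, a minimal sketch (Linux-only, reading `/proc/cpuinfo`; the helper name is hypothetical) to check whether the CPU advertises the baseline `avx512f` flag before deciding whether to pin the older paddlepaddle wheel:

```python
# Minimal sketch (Linux only): look for the AVX-512 foundation flag in
# /proc/cpuinfo. The helper name cpu_has_avx512 is made up for illustration.
def cpu_has_avx512() -> bool:
    try:
        with open("/proc/cpuinfo") as f:
            return "avx512f" in f.read()
    except OSError:
        return False  # not Linux, or cpuinfo unavailable


if __name__ == "__main__":
    if cpu_has_avx512():
        print("AVX-512 supported: recent paddlepaddle wheels should be fine")
    else:
        print("No AVX-512: pinning paddlepaddle==2.5.2 may be the safer choice")
```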

Does dify have any components that require a GPU?

I think this is fine; unless you call it with curl directly, most programs will handle this automatically.

![image](https://github.com/user-attachments/assets/f5b54eba-eb2f-46f7-8f5e-b8fa7cc247a3) We need to know how the `stream` parameter was initially passed in. I can override this `stream` in the function to fix the `stream=True` case, but when `stream` is...

![image](https://github.com/user-attachments/assets/c950e2a5-d6b1-4da9-bb0e-89aedd6326e9) When I changed the return value of `_handle_generate_response` to also be in stream format, the `stream=False` issue was also fixed. This indicates that the handling of the return value...

```python
def _handle_invoke_result(
    self, invoke_result: LLMResult | Generator
) -> Generator[RunEvent | ModelInvokeCompleted, None, None]:
    """
    Handle invoke result

    :param invoke_result: invoke result
    :return:
    """
    if isinstance(invoke_result, LLMResult):
        return
```
...
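To illustrate the idea from the comments above (not dify's actual implementation; the class and function names here are stand-ins), a non-streaming `LLMResult`-style object can be wrapped in a single-chunk generator so that both `stream=True` and `stream=False` flow through the same streaming code path:

```python
from dataclasses import dataclass
from typing import Generator, Union


@dataclass
class FakeLLMResult:
    """Stand-in for a fully materialized (stream=False) result."""
    message: str


@dataclass
class FakeLLMResultChunk:
    """Stand-in for one streamed chunk."""
    delta: str


def as_stream(
    result: Union[FakeLLMResult, Generator[FakeLLMResultChunk, None, None]]
) -> Generator[FakeLLMResultChunk, None, None]:
    """Normalize a blocking result into the streaming shape."""
    if isinstance(result, FakeLLMResult):
        # stream=False: emit the whole message as a single chunk
        yield FakeLLMResultChunk(delta=result.message)
    else:
        # stream=True: already a generator of chunks, pass it through
        yield from result


# Both cases can now be consumed with the same loop:
for chunk in as_stream(FakeLLMResult(message="hello")):
    print(chunk.delta)
```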

> Hi @Hisir0909, I am just adding my one cent here, `_handle_generate_response` has nothing to do when `stream=False`. Based on
>
> https://github.com/langgenius/dify/blob/7121afdd4426648b99055b7041b5e99bc7b1ad3a/api/core/model_runtime/model_providers/google/llm/llm.py#L217-L220
>
> `_handle_generate_stream_response` is the method handling...

@AAEE86 @CXwudi Please take a look at my submission. Can it resolve the issue, and are my modifications reasonable? 🐸

`docker compose --env-file ./middleware.env up` and specifying `env_file` in the compose file differ only in precedence.

> The order of precedence (highest to lowest) is as follows:
>
> ...

@jter I understand what you mean. If you use both `env_file` and `environment` in `compose.yaml` and the `.env` file does not exist, the variables in `environment` end up empty, correct? This causes...
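For illustration, a hypothetical `compose.yaml` fragment (service and variable names are made up, not dify's actual file) showing why this happens: `environment` values written as `${VAR}` are interpolated from the shell, the project-level `.env`, or `--env-file`, while `env_file` injects variables from the named file straight into the container, so a missing `.env` leaves the interpolated values empty:

```yaml
# Hypothetical compose.yaml fragment for illustration only.
services:
  api:
    image: example/api:latest          # placeholder image
    env_file:
      - ./middleware.env               # contents go into the container environment
    environment:
      # ${DB_HOST} is interpolated by docker compose from the shell,
      # the project .env file, or --env-file; if none defines it,
      # the value resolves to an empty string (compose prints a warning).
      DB_HOST: ${DB_HOST}
```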