Hanusz Leszek

Results 104 comments of Hanusz Leszek

Which code did you use? It seems like a network connectivity issue. Are you behind a corporate firewall or something interfering with your network?

Is it related to [transformers issue #28459](https://github.com/huggingface/transformers/issues/28459)?

@rahul-gj Sorry for the delay. I modified the `get_execution_result` argument typing so that it is identical to the other methods in the file. I believe this could solve your problem....

Release [b3472](https://github.com/ggerganov/llama.cpp/releases/tag/b3472) of `llama.cpp` just came out with Llama 3.1 support, but I think we still need to wait for `llama-cpp-python` to bump its `llama.cpp` version.

Alright, `llama-cpp-python` has now made [release 0.2.84](https://github.com/abetlen/llama-cpp-python/blob/main/CHANGELOG.md#0284), which updates its `llama.cpp` version to the latest commit.

The `llama-cpp-python` dependency has been updated to `0.2.85` in release [v1.13](https://github.com/oobabooga/text-generation-webui/releases/tag/v1.13). Llama 3.1 is working now! Thanks @oobabooga

That's pretty bad... Thanks for your report. The annoying thing is that there are probably some people who depend on this, and changing the default would be a breaking...

> yes, there must be a breaking change.. but a patch release could contain a warning

Good idea!

Note that for the warning, the default value cannot be `None`. That's because of a change in aiohttp version 3.10. Before aiohttp 3.10, aiohttp was using `None` as...
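Since `None` already carries meaning for aiohttp, one common workaround is a private sentinel object as the default, so the library can tell "argument not provided" apart from an explicit `None` and emit the warning only in the first case. A minimal sketch of that pattern (the function and parameter names here are hypothetical, not gql's actual API):

```python
import warnings

# Sentinel to distinguish "not provided" from an explicit None,
# since None itself is a meaningful value for the underlying setting.
_UNSET = object()


def connect(ssl=_UNSET):
    # Hypothetical wrapper illustrating the deprecation-warning idea:
    # warn only when the caller relied on the implicit default.
    if ssl is _UNSET:
        warnings.warn(
            "Relying on the default ssl value is deprecated and will "
            "change in a future release; pass ssl explicitly.",
            DeprecationWarning,
            stacklevel=2,
        )
        ssl = True  # current default, kept for backwards compatibility
    return ssl
```

With this, `connect()` triggers a `DeprecationWarning` while `connect(ssl=None)` and `connect(ssl=True)` stay silent, which is exactly the behavior a patch release would want before flipping the default in a breaking release.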

Fixed in [version 4.0.0](https://github.com/graphql-python/gql/releases/tag/v4.0.0)