Michał Krassowski
Thanks for testing, the Ollama provider is _experimental_ so there may be issues to iron out. Things I would suspect: a) the models may not be very good at...
I can reproduce the prefix trimming issue with all providers in 2.19.0, whether streaming or not.
For some reason in 2.19.0 the suggestion includes an extra space. These are the logs from GPT-4 without streaming (so logic which should not have changed since 2.18):
Ah no, this was still `ollama` with `phi`, not GPT-4. So it looks like the `ollama` output parsing may be adding a spurious whitespace at the beginning.
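For illustration, a minimal sketch of the kind of post-processing I would expect here. This is not the actual jupyter-ai code; `trim_completion` and its parameters are hypothetical, assuming the model can echo back part of the prompt and/or prepend a stray space:

```python
def trim_completion(prompt_suffix: str, raw_output: str) -> str:
    """Hypothetical cleanup of a raw model completion.

    - strips the part of the prompt echoed back by the model, if any;
    - drops a single spurious leading space like the one seen with `phi`.
    """
    completion = raw_output
    # Remove echoed prompt text so the suggestion continues the cursor position.
    if completion.startswith(prompt_suffix):
        completion = completion[len(prompt_suffix):]
    # Drop one spurious leading space (the off-by-one-whitespace symptom).
    if completion.startswith(" "):
        completion = completion[1:]
    return completion
```

With such a helper, `trim_completion("import numpy as", "import numpy as np")` would yield `"np"` instead of a suggestion starting with an extra space.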
@pedrogutobjj did you have a chance to test the latest release, v2.19.1, which includes #900? Is it any better?
Is this what you would expect or not? To me it looks like a syntactically valid response. Of course a bit useless, but this is down to the ability of the...
I see that, but it looks like the model is at fault here. It first inserted `import numpy as` and only then started the `def sum_matrizes(matrix1, matrix2):` part again. Previously...
Thanks! Just to help me reproduce, where was your cursor when you invoked the inline completer?
Do you mean that your cursor was in the first line here:

```python
def soma_matrices(matriz1, matriz2):|
```

or in the new line:

```python
def soma_matrices(matriz1, matriz2):
|
```

or...
Of note, https://github.com/jupyterlab/jupyter-chat natively supports opening conversation history from the UI.