llama-coder
                        Open-WebUI compatibility
Ran into an issue when using the Ollama API through Open-WebUI.
Open-WebUI adds an id message at the beginning of the stream that has no 'response' field, which breaks llama-coder because it still tries to iterate over 'tokens.response'.
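For illustration, here is a minimal Node.js sketch of the problem and the guard; the chunk shapes below are assumptions for the example, not taken from Open-WebUI's actual output:

// Sketch: skip stream chunks that lack a 'response' field,
// such as the id-only message Open-WebUI prepends to the stream.
const chunks = [
    { id: 'chatcmpl-abc123' },            // hypothetical id-only message, no 'response'
    { response: 'def ', done: false },    // normal Ollama-style token chunks
    { response: 'main():', done: true },
];
for (const tokens of chunks) {
    if (!('response' in tokens)) continue; // guard: ignore chunks without a response
    process.stdout.write(tokens.response);
}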
It can easily be fixed by adding the following guard at Line 28 of the autocomplete.js file:
if ('response' in tokens) {
    ...
    // Existing code below, shown only to indicate where the closing brace goes.
    if (totalLines > args.maxLines && blockStack.length === 0) {
        (0, log_1.info)('Too many lines, breaking.');
        break;
    }
}
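With that guard in place, the loop simply skips any chunk without a 'response' field, so Open-WebUI's leading id message is ignored while normal Ollama chunks are handled unchanged.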