
Chat with AI large language models running natively in your browser. Enjoy private, server-free, seamless AI conversations.

Results: 21 web-llm-chat issues, sorted by most recently updated.

Bumps [lint-staged](https://github.com/lint-staged/lint-staged) from 13.3.0 to 15.2.10. Release notes Sourced from lint-staged's releases. v15.2.10 Patch Changes #1471 e3f283b Thanks @iiroj! - Update minor dependencies, including micromatch@~4.0.8. v15.2.9 Patch Changes #1463 b69ce2d...

dependencies

Bumps [@serwist/next](https://github.com/serwist/serwist) from 9.0.3 to 9.0.7. Release notes Sourced from @serwist/next's releases. @serwist/next@9.0.7 Patch Changes #192 ceea5d1 Thanks @DuCanhGH! - chore(build): remove extraneous Node.js API wrappers Doesn't seem that we...

dependencies

Bumps [next](https://github.com/vercel/next.js) from 13.5.6 to 14.2.7. Release notes Sourced from next's releases. v14.2.7 [!NOTE] This release is backporting bug fixes. It does not include all pending features/changes on canary. Core...

dependencies

Bumps [husky](https://github.com/typicode/husky) from 9.0.11 to 9.1.5. Release notes Sourced from husky's releases. v9.1.5 What's Changed fixes #1494, support pre-merge-commit hook by @RainMeoCat in typicode/husky#1497 New Contributors @RainMeoCat made their first...

dependencies

Currently, searching for "WebLLM Chat" on Google displays the following weird result: This PR attempts to change that. All code is generated by Cursor and I have not checked whether...

### Bug Description ## Error Description: When trying to load a model from cache in WebLLM (WebGPU), the following error occurs: **GPUPipelineError: [Invalid ShaderModule (unlabeled)] is invalid. While validating compute...

bug
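
For context on the cache-related failure above: a common workaround when a cached model produces shader-validation errors is to clear the browser caches holding the downloaded weights and compiled artifacts so everything is re-fetched on the next load. The sketch below is only an illustration, not the project's documented fix, and it assumes web-llm keeps its artifacts in Cache Storage under names containing "webllm".

```
// Hypothetical workaround sketch: clear cached WebLLM artifacts so the model
// is re-downloaded and shaders are recompiled on the next load.
// Assumption: web-llm stores artifacts in Cache Storage under names that
// contain "webllm"; adjust the filter for your deployment.
async function clearWebLLMCaches(): Promise<void> {
  const names = await caches.keys();
  const targets = names.filter((name) => name.toLowerCase().includes("webllm"));
  await Promise.all(targets.map((name) => caches.delete(name)));
  console.log(`Deleted ${targets.length} cache bucket(s):`, targets);
}
```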

### Bug Description Maximum compute workgroup exceeded ### Steps to Reproduce Use Chrome on a device with low memory ### Expected Behavior It should split the layers or something similar ### Screenshots...

bug
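
As background for the workgroup-limit error above: an application can query the WebGPU adapter's limits before loading a model to see what the device actually supports. The sketch below uses only limit names defined by the WebGPU specification (TypeScript typings assume @webgpu/types); the reporting logic is illustrative, not web-llm-chat's actual behavior.

```
// Sketch: inspect WebGPU adapter limits before attempting to load a model.
async function reportComputeLimits(): Promise<void> {
  // requestAdapter() resolves to null when WebGPU is unsupported or disabled.
  const adapter = await navigator.gpu?.requestAdapter();
  if (!adapter) {
    console.warn("WebGPU is not available in this browser.");
    return;
  }
  // These limit names come from the WebGPU specification.
  const limits = adapter.limits;
  console.table({
    maxComputeInvocationsPerWorkgroup: limits.maxComputeInvocationsPerWorkgroup,
    maxComputeWorkgroupSizeX: limits.maxComputeWorkgroupSizeX,
    maxComputeWorkgroupStorageSize: limits.maxComputeWorkgroupStorageSize,
    maxBufferSize: limits.maxBufferSize,
  });
}
```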

### Problem Description My issue is that I have a server hosting docker containers, but no build environment or registry. It'd be great if I could just reference the image...

enhancement

### Problem Description Would you consider integrating mlc-ai/web-llm with Vercel AI SDK to provide: Easier API abstraction – Developers can easily interact with web-llm through Vercel’s AI tooling. Automatic streaming...

enhancement
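
For reference on the integration request above: an AI SDK provider would essentially wrap web-llm's existing OpenAI-style streaming surface. The sketch below shows roughly what that surface looks like today; the model id is only an example, and availability depends on the installed @mlc-ai/web-llm version.

```
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Streams a single completion from an in-browser model and collects the text.
// The model id is an example; pick one from the prebuilt model list you ship.
async function streamReply(prompt: string): Promise<string> {
  const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC");
  const chunks = await engine.chat.completions.create({
    messages: [{ role: "user", content: prompt }],
    stream: true,
  });
  let reply = "";
  for await (const chunk of chunks) {
    // Each chunk carries an incremental delta, OpenAI-style.
    reply += chunk.choices[0]?.delta?.content ?? "";
  }
  return reply;
}
```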

### Bug Description I tried to run the latest code locally but got the following reference error when loading pages: `ReferenceError: Worker is not defined at new WebLLMApi...`

bug
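
On the error above: "ReferenceError: Worker is not defined" typically means the worker is being constructed during Next.js server-side rendering, where no global `Worker` exists. The sketch below is a hypothetical guard, not the project's actual code; the worker file name is made up for illustration.

```
// Construct the web worker only in the browser; during Next.js SSR there is
// no global `Worker`, which is what produces the ReferenceError above.
// File name and caching strategy here are purely illustrative.
let worker: Worker | undefined;

export function getWebLLMWorker(): Worker | undefined {
  if (typeof window === "undefined" || typeof Worker === "undefined") {
    return undefined; // running on the server: defer creation to the client
  }
  if (!worker) {
    worker = new Worker(new URL("./web-llm-worker.ts", import.meta.url), {
      type: "module",
    });
  }
  return worker;
}
```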