mlc-assistant

Chat with your documents and improve your writing using large language models that run in your browser. We currently support using the MLC Assistant in Google Chrome with Overleaf, and plan to add support for other platforms and browsers soon.

Getting Started

1. Install Git LFS

Follow the instructions at https://git-lfs.com to install Git LFS.
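
For example, Git LFS can usually be installed through your system package manager (the package names below are typical, but check the Git LFS site for your platform):

# macOS, via Homebrew
brew install git-lfs

# Debian/Ubuntu
sudo apt-get install git-lfs

# One-time setup for your user account
git lfs install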

2. Create Conda environment (optional)

conda create --name mlc_assistant python=3.10
conda activate mlc_assistant

3. Run the startup script

This starts the server that runs the model locally, so that the Chrome extension can communicate with it.

./startup.sh
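
Once the script is running, you can optionally check that the server is reachable. This assumes the MLC REST server's default address of http://localhost:8000 and its /stats endpoint; adjust if your startup.sh configures a different host or port.

curl http://localhost:8000/stats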

4. Install the Chrome extension

Launch Google Chrome and navigate to the extensions page by entering chrome://extensions in the address bar. Enable Developer mode with its toggle switch, then click Load unpacked and select the mlc-assistant/dist directory.

5. Enable inline generation (optional)

If you'd like your text to be generated directly in the document (instead of in a popup), enable inline generation: go to chrome://extensions, select Details for the MLC Assistant, click Extension options, and toggle the inline generation option.

You can now go to any Overleaf document and press Option + Shift + 3 to invoke the MLC Assistant!

Development

If you'd like to contribute to development or customize this implementation further, follow the steps below.

Setting up MLC LLM

Follow the steps below to set up MLC LLM on your local machine (CPU-only; macOS, Windows, or Linux). For GPU usage, follow the MLC LLM documentation at https://llm.mlc.ai/docs. You can customize which model is used by changing the model parameters that are cloned in the last step; other supported models are listed under the mlc-ai organization on Hugging Face (https://huggingface.co/mlc-ai).

# Install MLC packages
python -m pip install --pre -U -f https://mlc.ai/wheels mlc-chat-nightly mlc-ai-nightly

# Enable Git LFS to clone large directories
git lfs install
mkdir -p mlc-llm/dist/prebuilt

# Download prebuilt binaries and model parameters
# Note: This will download the Mistral model parameters; to use another model, clone that model's parameters instead
git clone https://github.com/mlc-ai/binary-mlc-llm-libs.git mlc-llm/dist/prebuilt/lib
cd mlc-llm/dist/prebuilt && git clone https://huggingface.co/mlc-ai/mlc-chat-Mistral-7B-Instruct-v0.2-q4f16_1
cd ../../..  # three levels up, back to the directory you started from
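
For example, to run Llama 2 instead of Mistral, you could clone its parameters the same way (this assumes the mlc-ai/Llama-2-7b-chat-hf-q4f16_1 weights on Hugging Face, which have a matching prebuilt library in binary-mlc-llm-libs):

cd mlc-llm/dist/prebuilt
git clone https://huggingface.co/mlc-ai/Llama-2-7b-chat-hf-q4f16_1
cd ../../..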

You can now launch the local server. The command depends on the model you chose above: pass the name of the parameter directory you cloned, without the mlc-chat- prefix, to --model.

cd mlc-llm
python -m mlc_chat.rest --model Mistral-7B-Instruct-v0.2-q4f16_1
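
As a quick smoke test, you can send the server a request. The example below assumes the REST server's default address of http://localhost:8000 and its OpenAI-style /v1/chat/completions endpoint:

curl -X POST http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Say hello!"}], "stream": false}'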

Building the Chrome extension

If you make any changes to the extension and would like to rebuild it, run the following commands. Start by installing Node.js (which includes npm) from https://nodejs.org. After rebuilding, reload the extension from the chrome://extensions page so Chrome picks up the new build.

npm install
npm run build

Links