
Add support for `GROQ API` and fix `iframe` documentation cutoff

Haleshot opened this issue on Oct 06 '24

Description

Hope to expand our AI chat capabilities by adding support for the GROQ API alongside our existing integrations with the OpenAI, Gemini, and Anthropic (Claude) APIs. This addition will give users more options for AI-powered interactions within marimo.

Intended addition: Groq as an additional built-in chat model.

Additionally, there is a small documentation issue where the content for the iframe function is being cut off on the docs website.

The GROQ API integration will involve:

  1. Updating the LLM implementation in marimo/_plugins/ui/_impl/chat/llm.py (a rough sketch follows this list)
  2. Adding examples for GROQ usage in the examples/ai/chat directory
  3. Updating the AI completion documentation at docs/guides/editor_features/ai_completion.html
  4. Adding GROQ to the list of built-in models in docs/api/inputs/chat.html
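
For concreteness, a minimal sketch of what such a chat model could look like, using the `groq` Python SDK (whose client mirrors the OpenAI interface). The class name, constructor parameters, and the `(messages, config)` call signature below follow the pattern of marimo's existing built-in models but are assumptions, not the final API:

```python
# Rough sketch only, not the final marimo implementation. Assumes the
# `groq` package is installed (pip install groq) and that chat models are
# callables taking (messages, config), like the existing built-in models;
# all names below are placeholders.
import os
from typing import Any, List, Optional


class groq:  # lowercase to match the style of marimo's built-in models
    def __init__(
        self,
        model: str,
        *,
        system_message: str = "You are a helpful assistant.",
        api_key: Optional[str] = None,
    ) -> None:
        self.model = model
        self.system_message = system_message
        self.api_key = api_key or os.environ.get("GROQ_API_KEY")

    def __call__(self, messages: List[Any], config: Any) -> str:
        from groq import Groq  # deferred so marimo does not hard-depend on it

        client = Groq(api_key=self.api_key)
        # Groq's chat endpoint is OpenAI-compatible: a list of
        # {"role", "content"} dicts; messages are assumed to expose
        # .role and .content like marimo's existing chat message type.
        groq_messages = [{"role": "system", "content": self.system_message}]
        groq_messages += [{"role": m.role, "content": m.content} for m in messages]
        response = client.chat.completions.create(
            model=self.model, messages=groq_messages
        )
        return response.choices[0].message.content or ""
```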

iframe Documentation Fix

The iframe documentation at docs/api/html.html#marimo.iframe currently has a display issue where the content is cut off after the words "so if you have a". I think the problem is related to the `<script/>` tag in the documentation.

Suggested solution

GROQ API Integration

  1. Update marimo/_plugins/ui/_impl/chat/llm.py:

    • Add a new class marimo.ai.llm.openai.GroqLLM that implements the necessary methods to interact with the GROQ API.
  2. Create examples in examples/ai/chat:

    • Add a new example notebook demonstrating GROQ usage with mo.ui.chat (see the sketch after this list).
  3. Update docs/guides/editor_features/ai_completion.html:

    • Add information about GROQ support in the AI completion documentation.
  4. Modify docs/api/inputs/chat.html:

    • Add GROQ to the list of built-in models in the chat API documentation.
  5. Update any relevant configuration or environment variable handling to include GROQ API keys and endpoints.
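
A rough idea of what the example notebook cell could contain, assuming the model ends up exposed as `mo.ai.llm.groq` (the attribute name and model id here are placeholders):

```python
# Hypothetical example cell; mo.ai.llm.groq and the model id below are
# placeholders until the implementation lands.
import marimo as mo

chat = mo.ui.chat(
    mo.ai.llm.groq(
        "llama-3.1-8b-instant",  # any Groq-hosted model id
        system_message="You are a helpful assistant.",
        api_key=None,  # falls back to the GROQ_API_KEY environment variable
    )
)
chat
```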

iframe Documentation Fix

  1. Wrap the `<script/>` tag in backticks so it is rendered with inline highlighting rather than interpreted by the docs page (illustrated after this list).

  2. Test the documentation build locally to verify that the full content of the iframe function documentation is displayed correctly (will refer to CONTRIBUTING.md for instructions to build docs locally).

  3. Located the issue, which was introduced in PR #2398; will be modifying marimo/_output/formatting.py accordingly.
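
To illustrate the idea behind step 1, roughly the kind of change intended in the docstring (the wording below is a paraphrase, not the exact text of `marimo.iframe`):

```python
# Paraphrased docstring excerpt, not the actual marimo source.

# Before: the raw tag leaks into the rendered docs, and the page is
# truncated right after "so if you have a":
#     ... so if you have a <script/> tag, it may be interpreted ...

# After: wrapping the tag in backticks renders it as inline code instead:
#     ... so if you have a `<script/>` tag, it may be interpreted ...
```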

Alternative

Nothing other than what was listed in the suggested solutions at the moment.

Additional context

  • GROQ has been gaining attention in the AI community for its performance and cost-effectiveness. I thought adding support for it would be useful, given the free API access it offers for various models.

Relevant PRs (broken into 3 PRs):

  • [x] Fix inline docs - https://github.com/marimo-team/marimo/pull/2565
  • [ ] Add Groq for mo.ui.chat - https://github.com/marimo-team/marimo/pull/2757

Haleshot · Oct 06 '24 04:10

These sound good to me. I would suggest splitting these into three PRs:

  1. Adding marimo.ai.llm.GroqLLM, the example, and updating the docs
  2. Adding Groq support to AI completion (if it is not already compatible with the OpenAI API, otherwise just updating the docs)
  3. Fixing the iframe documentation

Thanks for opening this issue!

akshayka · Oct 07 '24 17:10

Sure, will follow this!

Haleshot · Oct 08 '24 03:10

Trying to develop mo.ai.llm.groq and facing some issues with it. I introduced changes following https://github.com/marimo-team/marimo/pull/2479 by Myles, which was a great reference.

Running into an error while testing a file I created at examples/ai/chat/groq-example.py:

`AttributeError: module 'marimo._plugins.ui._impl.chat.llm' has no attribute 'groq'`

Wrote code in:

  • https://github.com/marimo-team/marimo/blob/main/marimo/_plugins/ui/_impl/chat/convert.py: implemented a convert_to_groq_messages function (a rough sketch follows this list)
  • https://github.com/marimo-team/marimo/blob/main/marimo/_plugins/ui/_impl/chat/llm.py: implemented a groq(ChatModel) class
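
For context, a minimal sketch of what the conversion helper could look like (the name mirrors the existing `convert_to_*_messages` helpers in `convert.py`; the message fields are assumptions, and attachment handling is omitted):

```python
from typing import Any, Dict, List


def convert_to_groq_messages(messages: List[Any]) -> List[Dict[str, Any]]:
    # Groq's chat completions endpoint is OpenAI-compatible, so each message
    # becomes a plain {"role", "content"} dict; the real helper would also
    # need to handle attachments.
    return [
        {"role": message.role, "content": message.content} for message in messages
    ]
```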

I don't see any other relevant file changes being necessary (other than tests and markdown changes to be introduced once the functionality works), as most of the other changes in the above-mentioned PR relate to the Gemini AI completion assist.

I can see the groq class showing up in autocomplete, and its docs (the docstrings defined inside the class) also appear.

Haleshot · Oct 31 '24 14:10

@Haleshot are you using --sandbox or confirming when prompted? For developing locally, sandbox doesn't work (it will download the latest version of marimo instead of your local version).
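
A quick sanity check to confirm which marimo is actually being imported (and that the new class is exposed) could be something like:

```python
# Quick sanity check: confirm the import resolves to the local clone,
# not a released marimo pulled in by --sandbox.
import marimo._plugins.ui._impl.chat.llm as llm

print(llm.__file__)          # should point into your local checkout
print(hasattr(llm, "groq"))  # True once the new class is defined there
```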

mscolnick · Oct 31 '24 14:10

Oh my god. I was racking my brain over what I was doing wrong, since I just followed your PR. Thanks so much for the clarification! It works now.

Haleshot · Oct 31 '24 14:10

Closing this issue now, with all of the (split-up) PRs merged.

Haleshot · Nov 09 '24 07:11