Add support for Groq API and fix `iframe` documentation cutoff
Description
Hope to expand our AI chat capabilities by adding support for the Groq API alongside our existing integrations with the OpenAI, Gemini, and Anthropic (Claude) APIs. This addition will give users more options for AI-powered interactions within marimo.
Intended additions:
- AI Completion
- AI Chatbot (with regard to the latest release)
Additionally, this issue covers a small documentation problem where the content for the `iframe` function is cut off on the docs website.
The Groq API integration will involve:
- Updating the LLM implementation in `marimo/_plugins/ui/_impl/chat/llm.py` (a rough sketch follows this list)
- Adding examples for Groq usage in the `examples/ai/chat` directory
- Updating the AI completion documentation at `docs/guides/editor_features/ai_completion.html`
- Adding Groq to the list of built-in models in `docs/api/inputs/chat.html`
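As a rough sketch of the kind of wrapper the `llm.py` update would add (hedged: the class and parameter names, the `GROQ_API_KEY` fallback, and the plain-callable shape are assumptions modeled on the existing OpenAI wrapper; the real version would subclass marimo's `ChatModel`):

```python
# Sketch only -- a callable chat model backed by the Groq SDK. Class and
# parameter names are illustrative; the real version would live in llm.py
# and subclass marimo's ChatModel.
import os
from typing import Any, List, Optional


class groq:
    def __init__(
        self,
        model: str,
        *,
        system_message: str = "You are a helpful assistant.",
        api_key: Optional[str] = None,
    ) -> None:
        self.model = model
        self.system_message = system_message
        # Fall back to the conventional environment variable.
        self.api_key = api_key or os.environ.get("GROQ_API_KEY")

    def __call__(self, messages: List[Any], config: Any) -> str:
        # Deferred import keeps the groq package an optional dependency.
        from groq import Groq

        client = Groq(api_key=self.api_key)
        response = client.chat.completions.create(
            model=self.model,
            messages=[{"role": "system", "content": self.system_message}]
            + [{"role": m.role, "content": m.content} for m in messages],
        )
        return response.choices[0].message.content
```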
`iframe` Documentation Fix
The `iframe` documentation at `docs/api/html.html#marimo.iframe` is currently experiencing a display issue where content is cut off after the words "so if you have a". I think this problem is related to the `<script/>` tag in the documentation.
Suggested solution
Groq API Integration
- Update `marimo/_plugins/ui/_impl/chat/llm.py`:
  - Add a new class `marimo.ai.llm.openai.GroqLLM` that implements the necessary methods to interact with the Groq API.
- Create examples in `examples/ai/chat`:
  - Add a new example notebook demonstrating Groq usage with `mo.ai.chat` (see the usage sketch after this list).
- Update `docs/guides/editor_features/ai_completion.html`:
  - Add information about Groq support in the AI completion documentation.
- Modify `docs/api/inputs/chat.html`:
  - Add Groq to the list of built-in models in the chat API documentation.
- Update any relevant configuration or environment variable handling to include Groq API keys and endpoints.
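For the example notebook and the configuration item above, a minimal usage sketch (assuming `GROQ_API_KEY` is set in the environment and that the wrapper ends up exposed as `mo.ai.llm.groq`, the spelling used later in this thread; the model name is just an example):

```python
# Sketch of an examples/ai/chat notebook cell; assumes GROQ_API_KEY is set
# and the wrapper is exposed as mo.ai.llm.groq.
import marimo as mo

chat = mo.ui.chat(
    mo.ai.llm.groq(
        "llama-3.1-8b-instant",  # example Groq-hosted model
        system_message="You are a helpful assistant.",
    ),
    prompts=["Tell me a fun fact about penguins."],
)
chat
```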
`iframe` Documentation Fix
- Wrap the script tag using inline highlighting (illustrated below).
- Test the documentation build locally to verify that the full content of the `iframe` function documentation is displayed correctly (will refer to CONTRIBUTING.md for instructions on building the docs locally).
- Located the issue, which was introduced in PR #2398; will be modifying `marimo/_output/formatting.py` accordingly.
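To illustrate the inline-highlighting idea (a hypothetical before/after, not the actual patch):

```python
# Hypothetical illustration of the docstring fix: the docs renderer appears
# to treat a raw <script/> tag as markup and drops everything after it, so
# the fix wraps the tag in inline code instead.

# Before (rendering stops after "so if you have a"):
#   ... so if you have a <script/> tag, ...
#
# After (the tag renders as literal text and the rest of the line survives):
#   ... so if you have a `<script/>` tag, ...
```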
Alternative
Nothing other than what is listed in the suggested solution at the moment.
Additional context
- Groq has been gaining attention in the AI community for its performance and cost-effectiveness. I thought adding support for it (given the free API access it offers for various models) would be useful.
Relevant PRs (broken into 3 PRs):
- [x] Fix inline docs - https://github.com/marimo-team/marimo/pull/2565
- [ ] Add `Groq` for `mo.ui.chat` - https://github.com/marimo-team/marimo/pull/2757
These sound good to me. I would suggest splitting these into three PRs:
- Adding `marimo.ai.llm.GroqLLM`, the example, and updating the docs
- Adding `groq` support to AI completion (if it is not already compatible with the OpenAI API, otherwise just updating the docs)
- Fixing the `iframe` documentation

Thanks for opening this issue!
Sure, will follow this!
Trying to develop `mo.ai.llm.groq`; facing some issues with it. Introduced changes following https://github.com/marimo-team/marimo/pull/2479 by Myles; great reference.
Running into an error while testing a file I created at `examples/ai/chat/groq-example.py`:
> [!CAUTION]
> `AttributeError: module 'marimo._plugins.ui._impl.chat.llm' has no attribute 'groq'`
Wrote code in:
- https://github.com/marimo-team/marimo/blob/main/marimo/_plugins/ui/_impl/chat/convert.py: Implemented the `convert_to_groq_messages` function (sketched below)
- https://github.com/marimo-team/marimo/blob/main/marimo/_plugins/ui/_impl/chat/llm.py: Implemented `class groq(ChatModel)`
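For reference, a self-contained sketch of what `convert_to_groq_messages` might reduce to (Groq's chat endpoint accepts OpenAI-style role/content dicts; the `ChatMessage` below is a stand-in for marimo's own type):

```python
# Sketch of convert_to_groq_messages; the ChatMessage here is a stand-in for
# marimo's chat message type, assumed to expose .role and .content.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class ChatMessage:
    role: str  # "user", "assistant", or "system"
    content: str


def convert_to_groq_messages(messages: List[ChatMessage]) -> List[Dict[str, str]]:
    # Groq's chat completions API is OpenAI-compatible, so plain
    # {"role": ..., "content": ...} dicts suffice.
    return [{"role": m.role, "content": m.content} for m in messages]
```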
I don't see any other relevant file changes being necessary (other than tests, to be introduced along with the markdown changes once the functionality works), as most of the other changes in the above-mentioned PR relate to the Gemini AI completion assist.
I can see the `groq` class being suggested in autocomplete, and the docs (docstrings defined inside the class) also show up.
@Haleshot are you using `--sandbox` or confirming when prompted? For developing locally, sandbox doesn't work (it will download the latest version of marimo instead of your local version).
Oh my god. I was racking my brain over what I was doing wrong, since I just followed your PR. Thanks so much for the clarification! It works now.
Closing this issue now, with all of the (split-up) PRs merged.