
[Feature]: Access fine-tuned Gemini via the Google AI Studio adapter

Open twardoch opened this issue 1 year ago • 1 comments

The Feature

Ensure that we can access our fine-tuned Gemini via the Google AI Studio adapter. Haven't tested it yet.

Motivation, pitch

You can fine-tune Google Gemini Pro 1.0 with your own data and then run inference via Google AI Studio, for free.

On 19 March 2024, Google added the ability to fine-tune their Gemini Pro 1.0 model, and as far as I can tell, you can do it and then do inference for free (for now, I guess, and with the caveat that they're using your data if you use the free offering).

This is an incredible offer to get into fine-tuning and especially experiment with RAG vs. fine-tuning vs. a combo of those, with very little effort.

https://developers.googleblog.com/2024/03/tune-gemini-pro-in-google-ai-studio-or-gemini-api.html?m=1

Twitter / LinkedIn details

@adamtwar

twardoch avatar Mar 22 '24 10:03 twardoch

How do we use the fine-tuned model, though?

myudak avatar Jul 29 '24 01:07 myudak

+1 There seems to be no way to access the fine-tuned models through the Gemini API.

the-wdr avatar Sep 11 '24 12:09 the-wdr

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.

github-actions[bot] avatar Jan 29 '25 00:01 github-actions[bot]

Ran into this today. If someone wants to open a pull request, feel free.

Here is the SEARCH/REPLACE block:

litellm/llms/vertex_ai/common_utils.py

<<<<<<< SEARCH
    stream: Optional[bool],
    gemini_api_key: Optional[str],
) -> Tuple[str, str]:
    _gemini_model_name = "models/{}".format(model)
    if mode == "chat":
        endpoint = "generateContent"
        if stream is True:
=======
    stream: Optional[bool],
    gemini_api_key: Optional[str],
) -> Tuple[str, str]:
    # Check if the model name already includes the 'tunedModels/' prefix
    if model.startswith("tunedModels/"):
        _gemini_model_name = model
    else:
        # For base models, prepend 'models/'
        _gemini_model_name = "models/{}".format(model)

    # The rest of the function remains the same...
    if mode == "chat":
        endpoint = "generateContent"
        if stream is True:
>>>>>>> REPLACE
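
The core idea of the patch is just the model-name resolution: Google AI Studio addresses tuned models by their full `tunedModels/...` resource name, while base models need a `models/` prefix. A minimal standalone sketch of that logic (the helper name `get_gemini_model_name` is hypothetical, not litellm's actual function name):

```python
def get_gemini_model_name(model: str) -> str:
    """Resolve a user-supplied model string to a Gemini API resource name.

    Tuned models are already fully qualified (e.g. "tunedModels/my-model-abc123")
    and must be passed through unchanged; base models (e.g. "gemini-pro")
    need the "models/" prefix prepended.
    """
    if model.startswith("tunedModels/"):
        return model
    return "models/{}".format(model)
```

For example, `get_gemini_model_name("gemini-pro")` yields `"models/gemini-pro"`, while `get_gemini_model_name("tunedModels/my-model-abc123")` is returned unchanged.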

jtromans avatar Mar 28 '25 22:03 jtromans

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.

github-actions[bot] avatar Jun 27 '25 00:06 github-actions[bot]