[Feature]: Access fine-tuned Gemini via the Google AI Studio adapter
The Feature
Ensure that we can access our fine-tuned Gemini models via the Google AI Studio adapter. Haven't tested it yet.
Motivation, pitch
You can fine-tune Google Gemini Pro 1.0 with your own data and then do inference with Google AI Studio — for FREE?
On 19 March 2024, Google added the ability to fine-tune their Gemini Pro 1.0 model, and as far as I can tell, you can do it and then do inference for free (for now, I guess, and with the caveat that they're using your data if you use the free offering).
This is an incredible offer to get into fine-tuning and especially experiment with RAG vs. fine-tuning vs. a combo of those, with very little effort.
https://developers.googleblog.com/2024/03/tune-gemini-pro-in-google-ai-studio-or-gemini-api.html?m=1
Twitter / LinkedIn details
@adamtwar
how to use the finetuned model tho
+1 There seems to be no way to access the fine-tuned models through the Gemini API.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
Ran into this today. If someone wants to open a pull request, feel free.
Here is the SEARCH/REPLACE block:
litellm/llms/vertex_ai/common_utils.py
<<<<<<< SEARCH
stream: Optional[bool],
gemini_api_key: Optional[str],
) -> Tuple[str, str]:
_gemini_model_name = "models/{}".format(model)
if mode == "chat":
endpoint = "generateContent"
if stream is True:
=======
stream: Optional[bool],
gemini_api_key: Optional[str],
) -> Tuple[str, str]:
# Check if the model name already includes the 'tunedModels/' prefix
if model.startswith("tunedModels/"):
_gemini_model_name = model
else:
# For base models, prepend 'models/'
_gemini_model_name = "models/{}".format(model)
# The rest of the function remains the same...
if mode == "chat":
endpoint = "generateContent"
if stream is True:
>>>>>>> REPLACE
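The change above can be sketched as a standalone helper (a minimal sketch; the function name is illustrative and the tuned-model id is made up — in litellm the logic lives inline in `litellm/llms/vertex_ai/common_utils.py`):

```python
def resolve_gemini_model_name(model: str) -> str:
    """Return the resource name expected by the Gemini API.

    Fine-tuned models are addressed as 'tunedModels/<id>', while base
    models use the 'models/<name>' prefix.
    """
    # Check if the model name already carries the 'tunedModels/' prefix
    if model.startswith("tunedModels/"):
        return model
    # For base models, prepend 'models/'
    return "models/{}".format(model)

print(resolve_gemini_model_name("gemini-pro"))             # models/gemini-pro
print(resolve_gemini_model_name("tunedModels/my-tune-1"))  # tunedModels/my-tune-1
```

With the patch applied, a call like `litellm.completion(model="gemini/tunedModels/<your-model-id>", ...)` should build the right URL, though per this thread that is untested.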