
can't ask for JSON data when using fine-tuned model

Open hedihadi opened this issue 1 year ago • 3 comments

Description of the bug:

When I use gemini-1.5-flash or gemini-1.5-pro, I have no problem asking for a JSON response, like so:

model = genai.GenerativeModel(
    "gemini-1.5-flash",
    system_instruction="you're a bot that answers questions by referencing Quran verses or Hadith or quotes from notable Islamic scholars.",
)

response_schema = """"
{
  "type": "object",
  "properties": {
    "answers": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "type": {
            "type": "string",
            "enum": [
              "quran",
              "hadith",
              "quote"
            ]
          },
          "text": {
            "type": "string"
          },
          "chapter_number": {
            "type": "integer"
          },
          "verse_number": {
            "type": "string"
          }
        },
        "required": [
          "type",
          "text"
        ]
      }
    },
    "conclusion": {
      "type": "string"
    }
  },
  "required": [
    "answers",
    "conclusion"
  ]
}
"""

response = model.generate_content(
    "my mom is sad, what should i do?",
    generation_config={
        "temperature": 1,
        "max_output_tokens": 8192,
        "response_mime_type": "application/json",
        "response_schema": response_schema,
    },
)
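One thing worth double-checking here (an assumption about the SDK, to be verified against the google-generativeai docs): `response_schema` is typically passed as a Python dict or schema object rather than a raw JSON string. The string above can be parsed into a dict with the standard library before being handed to `generation_config`, e.g.:

```python
import json

# Trimmed-down version of the schema string above, for illustration only.
schema_str = """
{
  "type": "object",
  "properties": {
    "answers": {"type": "array", "items": {"type": "object"}},
    "conclusion": {"type": "string"}
  },
  "required": ["answers", "conclusion"]
}
"""

# Parse the JSON string into a dict before passing it as response_schema.
schema = json.loads(schema_str)
print(schema["required"])  # ['answers', 'conclusion']
```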

This sometimes doesn't respect the response_schema structure, so I decided to fine-tune a model to get more predictable data. I did that and basically just replaced the model name, like so:

model = genai.GenerativeModel("tunedModels/quranchat-q68kkpnh5aad",system_instruction="""
you're a bot that answers questions by referencing Quran verses or Hadith or Quotes from notable islamic scholars.
""")

I'm also dealing with the OAuth setup from this documentation, https://ai.google.dev/gemini-api/docs/oauth, since I'm now using a fine-tuned model.

But now, when I run the same code, I get this error: 400 Developer instruction is not enabled for tunedModels/quranchat-q68kkpnh5aad. After some horrendous hours trying to work out what this means, I realized it's because I pass the system_instruction argument, so I deleted it and ended up with this code:

model = genai.GenerativeModel("tunedModels/quranchat-q68kkpnh5aad")

and ran the file again. Now I get this error: 400 Json mode is not enabled for tunedModels/quranchat-q68kkpnh5aad, which happens because I have 'response_mime_type': "application/json" in my generation_config.

Actual vs expected behavior:

So I've basically found that I can't use a system instruction or ask for a JSON response when using a fine-tuned model. Is this a restriction of the library or of Gemini itself?
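A common workaround when an endpoint rejects response_mime_type and system_instruction (a sketch under that assumption, not an official API: the fence-stripping helper below is hypothetical) is to fold the instructions and the JSON request into the user prompt, then parse the JSON out of the plain-text reply:

```python
import json
import re


def extract_json(text: str) -> dict:
    """Hypothetical helper: strip optional ```json fences and parse
    the remaining text as JSON. Needed because a model answering in
    plain text often wraps JSON in a Markdown code fence."""
    match = re.search(r"```(?:json)?\s*(.*?)\s*```", text, re.DOTALL)
    payload = match.group(1) if match else text
    return json.loads(payload)


# Example of a reply a model might produce when the prompt itself
# asks for JSON (illustrative text, not a real model response).
reply = '```json\n{"answers": [], "conclusion": "be kind to her"}\n```'
print(extract_json(reply)["conclusion"])  # be kind to her
```

The obvious trade-off is that nothing enforces the schema, so the parse can fail and may need a retry loop.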

Any other information you'd like to share?

Let me know if there's anything more I can share.

hedihadi avatar Sep 30 '24 22:09 hedihadi

@hedihadi

For a tuned model, JSON mode is not supported, and you also cannot pass a system instruction.

Please refer to the link for the current limitations of tuned models.

Gunand3043 avatar Oct 01 '24 14:10 Gunand3043

This is a valid feature request.

@Gunand3043, that's a good resource. Thanks for the link.

this sometimes doesn't respect the response_schema structure

That shouldn't be possible; JSON mode uses constrained decoding. Can you explain more? Is it just that it sometimes leaves out fields?
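One plausible reading of the "sometimes doesn't respect the response_schema" report (an illustration, not a claim about the user's actual runs): constrained decoding guarantees the output matches the schema, but any field outside the required list may legitimately be omitted. With the item schema from the issue, where only "type" and "text" are required:

```python
# Two hypothetical replies; both satisfy the schema, because
# "chapter_number" and "verse_number" are optional in the item schema.
required = {"type", "text"}
reply_a = {"type": "quran", "text": "...", "chapter_number": 1, "verse_number": "1"}
reply_b = {"type": "quote", "text": "..."}

for item in (reply_a, reply_b):
    # Every required key is present in both replies.
    assert required <= item.keys()
```

So output that "ignores" the schema may in fact be valid output that simply drops optional fields; marking them required would force them to appear.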

MarkDaoust avatar Oct 02 '24 18:10 MarkDaoust

Hey guys, do we have any update on this feature? Do we know if it's being built or in the pipeline? Would be great to know. Thanks.

aaryanbrahm avatar Nov 03 '24 17:11 aaryanbrahm