[GEMINI] create_with_completion does not work when the response_model is a List[Object]
- [x] This is actually a bug report.
- [ ] I am not getting good LLM Results
- [ ] I have tried asking for help in the community on discord or discussions and have not received a response.
- [x] I have tried searching the documentation and have not found an answer.
What Model are you using?
- [ ] gpt-3.5-turbo
- [ ] gpt-4-turbo
- [ ] gpt-4
- [x] Other (please specify) - models/gemini-1.5-flash-latest
Describe the bug
- I have an instructor client with Gemini:

```python
import google.generativeai as genai
import instructor

lite_instructor_client: instructor.Instructor = instructor.from_gemini(
    client=genai.GenerativeModel(
        model_name="models/gemini-1.5-flash-latest",
        generation_config=generation_config,  # defined elsewhere
    ),
    mode=instructor.Mode.GEMINI_JSON,
)
```
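(generation_config is not shown in the report; for reproduction purposes a placeholder such as the following should do, with arbitrary values:)

```python
generation_config = genai.GenerationConfig(
    temperature=0.0,
    max_output_tokens=2048,
)
```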
- I am trying to call create_with_completion:

```python
listings = lite_instructor_client.create_with_completion(
    response_model=List[ExtractedListing],
    messages=[
        {
            "role": "user",
            "content": "Extract the blog links from the following content.",
        },
        {
            "role": "user",
            "content": chunk,  # some blog chunk here...
        },
    ],
)
```
- Throws the following error:
client.py", line 331, in create_with_completion
return model, model._raw_response
^^^^^^^^^^^^^^^^^^^
AttributeError: 'list' object has no attribute '_raw_response'
I think this happens because `model` here is a list, and we are trying to return `model._raw_response`.
Please help! Thank you!
I am also facing the same issue.
Related to this?
Here is a temporary fix.
If you really want the response to be of type List[ExtractedListing]:
- For now, if you just want the call to send you back the list, without the usage metadata and the rest, use create_iterable instead of create_with_completion.
- If you want the usage metadata and the completion, make your type a Pydantic model instead, and create_with_completion will work 👍 (sketch below).
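A minimal sketch of both options, reusing the client from the report above (ExtractedListings is a hypothetical wrapper name, and `messages` stands in for the message list shown earlier):

```python
from typing import List

from pydantic import BaseModel

# Option 1: create_iterable yields the parsed items one by one,
# without the raw completion / usage metadata.
listings = list(
    lite_instructor_client.create_iterable(
        response_model=ExtractedListing,
        messages=messages,  # the same message list shown above
    )
)


# Option 2: wrap the list in a Pydantic model so create_with_completion
# returns a single object plus the raw completion.
class ExtractedListings(BaseModel):
    listings: List[ExtractedListing]


result, completion = lite_instructor_client.create_with_completion(
    response_model=ExtractedListings,
    messages=messages,
)
print(result.listings)
```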
Let me know.
I think it's related to this: https://github.com/instructor-ai/instructor/pull/1103#issuecomment-2675634416
Issue Reproduction & Root Cause Analysis
I've successfully reproduced this issue. Here's what's happening:
Reproduction
```python
from typing import List

from pydantic import BaseModel

import instructor
from openai import OpenAI


class User(BaseModel):
    name: str
    age: int


client = instructor.from_openai(OpenAI())

users, completion = client.chat.completions.create_with_completion(
    model="gpt-4o-mini",
    response_model=List[User],  # or Iterable[User]
    messages=[{"role": "user", "content": "Extract: John is 30, Jane is 25"}],
)
# Error: AttributeError: 'list' object has no attribute '_raw_response'
```
Root Cause
In instructor/processing/response.py (lines 356-358), when handling IterableBase models:
```python
if isinstance(model, IterableBase):
    logger.debug(f"Returning takes from IterableBase")
    return [task for task in model.tasks]  # ← Returns plain list, no _raw_response!
```
The function returns a plain Python list without attaching _raw_response. Then create_with_completion() tries to access `model._raw_response`, which fails.
Why the Workaround Works
create_iterable() returns a generator and doesn't try to access _raw_response, so it works fine.
Design Challenge
The core issue is: how should the raw response be attached to a plain list?
A built-in list instance cannot take arbitrary attributes (tasks._raw_response = response raises AttributeError on a plain list), so any fix needs a subclass or wrapper, which is unconventional.
Options:
- Attach the attribute to a list subclass - simple but unconventional
- Return a wrapper object - proper but breaks the API
- Make raw_response optional - an incomplete solution
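A quick illustration of the constraint (plain CPython; TaskList is an illustrative name):

```python
tasks = ["a", "b"]
try:
    tasks._raw_response = "completion"  # built-in list has no __dict__
except AttributeError as exc:
    print(exc)  # 'list' object has no attribute '_raw_response'


class TaskList(list):
    """A list subclass gains a __dict__, so the attribute can be attached."""


tasks = TaskList(["a", "b"])
tasks._raw_response = "completion"  # works
```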
No clean solution yet. Tagging as needs-investigation.
Solution Implemented
Created ListResponse wrapper class to fix this issue!
✅ Pull Request: #1870
What was done:
- Created ListResponse[T] class that inherits from list[T] (rough sketch below)
- Updated process_response() to return ListResponse for Iterable results
- Fixed prepare_response_model() to handle List[T] (was missing)
- ListResponse preserves _raw_response so create_with_completion() works
Key features:
- ListResponse acts like a normal Python list (100% compatible)
- Stores _raw_response for completion metadata
- Works with streaming and non-streaming
- Fully backward compatible
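A rough sketch of the idea (illustrative only; wrap_tasks is a hypothetical helper, and the real implementation lives in PR #1870):

```python
from typing import Any, List, TypeVar

T = TypeVar("T")


class ListResponse(List[T]):
    """Behaves like a normal Python list but can carry the raw completion."""

    _raw_response: Any = None


def wrap_tasks(tasks: list, raw_response: Any) -> ListResponse:
    # Instead of returning a plain list, process_response() can return a
    # ListResponse with _raw_response attached, so create_with_completion()
    # finds the attribute it expects.
    wrapped = ListResponse(tasks)
    wrapped._raw_response = raw_response
    return wrapped
```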
Usage:
```python
users, completion = client.chat.completions.create_with_completion(
    response_model=List[User],
    messages=[...],
)
# users is now ListResponse[User] with _raw_response attached!
```