Temperature
LLMExtractionStrategy can return completely different output each time it runs. We need a way to set model parameters such as temperature so that the responses are more reliable.
+1, temperature and top_p @unclecode
Hi @TasosNorma @stsfaroz, thanks for trying Crawl4ai. Crawl4ai already supports passing additional arguments like temperature and other model parameters through extra_args. You can set these when you initialize LLMExtractionStrategy:
from crawl4ai.extraction_strategy import LLMExtractionStrategy

strategy = LLMExtractionStrategy(
    provider="openai/gpt-4o-mini",
    api_token="YOUR_TOKEN",
    instruction="Extract structured data...",
    extra_args={
        "temperature": 0,
        "top_p": 0.9,
        "max_tokens": 2000,
        # any other model parameters supported by litellm
    }
)
Please take a look and give it a try; if you face any problems, let me know.
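For anyone landing here later, a minimal sketch of wiring the strategy into an actual crawl, assuming the AsyncWebCrawler API where the strategy is passed directly to arun (newer releases may expect it inside a run config instead; the URL is a placeholder):

import asyncio

from crawl4ai import AsyncWebCrawler
from crawl4ai.extraction_strategy import LLMExtractionStrategy

async def main():
    # Low temperature to make repeated extractions more deterministic
    strategy = LLMExtractionStrategy(
        provider="openai/gpt-4o-mini",
        api_token="YOUR_TOKEN",
        instruction="Extract structured data...",
        extra_args={"temperature": 0, "top_p": 0.9},
    )
    async with AsyncWebCrawler() as crawler:
        result = await crawler.arun(
            url="https://example.com",  # placeholder URL
            extraction_strategy=strategy,
        )
        print(result.extracted_content)

asyncio.run(main())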
@unclecode I'm trying to use it with Azure OpenAI:
import os

from crawl4ai.extraction_strategy import LLMExtractionStrategy

strategy = LLMExtractionStrategy(
    provider="azure/" + os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME"),
    api_base=os.getenv("AZURE_OPENAI_ENDPOINT"),
    api_token=os.getenv("AZURE_OPENAI_API_KEY"),
    schema=Article.model_json_schema(),  # Article is a Pydantic model
    extraction_type="schema",
    instruction="From the crawled content, extract all the contents in proper order",
    extra_args={
        "temperature": 0,
        "top_p": 0.0,
    },
)
I got this error:
litellm.main.completion() got multiple values for keyword argument 'temperature'
Without this part, everything works fine:

extra_args={
    "temperature": 0,
    "top_p": 0.0,
}
@stsfaroz Oops!! In a day or two, try pip install -U crawl4ai, and hopefully all of your life problems will be resolved :D ;)
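Once the fix lands, a quick way to confirm which version is actually installed before retrying the Azure snippet:

from importlib.metadata import version

# Confirm the upgrade took effect before re-running the extraction
print(version("crawl4ai"))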