Seems the LangChain WandbCallbackHandler cannot handle the ChatOpenAI object?
System Info
The default davinci LLM example provided in https://python.langchain.com/en/latest/ecosystem/wandb_tracking.html works fine, but the callback does not seem able to handle a ChatOpenAI object. How can this be solved? The code is below:
Who can help?
No response
Information
- [X] The official example notebooks/scripts
- [ ] My own modified scripts
Related Components
- [ ] LLMs/Chat Models
- [ ] Embedding Models
- [ ] Prompts / Prompt Templates / Prompt Selectors
- [ ] Output Parsers
- [ ] Document Loaders
- [ ] Vector Stores / Retrievers
- [ ] Memory
- [ ] Agents / Agent Executors
- [ ] Tools / Toolkits
- [ ] Chains
- [ ] Callbacks/Tracing
- [ ] Async
Reproduction
Code:
from langchain.callbacks import WandbCallbackHandler, StdOutCallbackHandler
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

wandb_callback = WandbCallbackHandler(
    job_type="inference",
    project="langchain_callback_demo",
    group="test_group",
    name="llm",
    tags=["test"],
)
callbacks = [StdOutCallbackHandler(), wandb_callback]
chat = ChatOpenAI(model_name='gpt-3.5-turbo', callbacks=callbacks, temperature=0, request_timeout=20)
resp = chat([HumanMessage(content="Write me 4 greeting sentences.")])
wandb_callback.flush_tracker(chat, name="simple")
Expected behavior
Error:
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
Cell In[3], line 4
2 chat = ChatOpenAI(model_name='gpt-3.5-turbo', callbacks=callbacks, temperature=0, request_timeout=20)
3 resp = chat([HumanMessage(content="Write me 4 greeting sentences.")])
----> 4 wandb_callback.flush_tracker(chat, name="simple")
File ~/miniconda3/envs/lang/lib/python3.11/site-packages/langchain/callbacks/wandb_callback.py:560, in WandbCallbackHandler.flush_tracker(self, langchain_asset, reset, finish, job_type, project, entity, tags, group, name, notes, visualize, complexity_metrics)
558 model_artifact.add(session_analysis_table, name="session_analysis")
559 try:
--> 560 langchain_asset.save(langchain_asset_path)
561 model_artifact.add_file(str(langchain_asset_path))
562 model_artifact.metadata = load_json_to_dict(langchain_asset_path)
AttributeError: 'ChatOpenAI' object has no attribute 'save'
It seems like LLMs that inherit from BaseLLM (anything you import from langchain.llms) have the save method, but it doesn't look like it is implemented for chat models (anything imported from langchain.chat_models).
@hwchase17 @dev2049 is this in the works or would this be a good first issue for me to tackle? I'm new to open source, but I've been loving Langchain and I'd love to contribute.
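In the meantime, here is a rough workaround sketch, not an official LangChain or W&B API: SavableChatOpenAI is a made-up name, and it assumes ChatOpenAI still exposes pydantic's .dict() and that flush_tracker only needs a save() that writes JSON to the given path.

```python
import json
from pathlib import Path

from langchain.chat_models import ChatOpenAI


class SavableChatOpenAI(ChatOpenAI):
    """ChatOpenAI with a BaseLLM-style save(), so flush_tracker has something to call."""

    def save(self, file_path) -> None:
        # Dump the model's parameters to a JSON file, mirroring what BaseLLM.save does,
        # so that flush_tracker can read the metadata back with load_json_to_dict.
        Path(file_path).write_text(json.dumps(self.dict(), default=str))


chat = SavableChatOpenAI(model_name="gpt-3.5-turbo", temperature=0, request_timeout=20)
# wandb_callback.flush_tracker(chat, name="simple") should now find a save() attribute.
```

The cleaner fix would of course be implementing save on the chat model side in LangChain itself, as discussed above.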
Hey @firezym, @GraesonB, our preferred LangChain integration, W&B Prompts, can be found here:
https://docs.wandb.ai/guides/prompts/quickstart
The above is an earlier callback that we'll likely be deprecating in the coming months.
Would love to know what you think about Prompts!
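For reference, here is a minimal sketch of what that setup looked like per the W&B tracing docs at the time: tracing is switched on via environment variables rather than a handcrafted callback. The project name is a placeholder; if the env-var flow has changed, follow the quickstart link above.

```python
import os

# Turn on W&B tracing for every LangChain run in this process.
os.environ["LANGCHAIN_WANDB_TRACING"] = "true"
os.environ["WANDB_PROJECT"] = "langchain_callback_demo"  # placeholder project name

from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

chat = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)
resp = chat([HumanMessage(content="Write me 4 greeting sentences.")])
# The chat call is traced to W&B automatically; no flush_tracker / save() step is involved.
```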
@morganmcg1 Got it, thanks a lot for the info!
Hi, @firezym! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.
From what I understand, you reported that the LangChain Wandb callback cannot handle the ChatOpenAI object because chat models lack a 'save' attribute. GraesonB mentioned that this could be a good first issue for them to tackle and asked for input from hwchase17 and dev2049. morganmcg1 suggested the W&B Prompts integration as an alternative and provided a link to its documentation. It seems that you acknowledged the information provided by morganmcg1.
Before we proceed, we would like to confirm if this issue is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on this issue. Otherwise, feel free to close the issue yourself, or the issue will be automatically closed in 7 days.
Thank you for your understanding, and we appreciate your contribution to the LangChain project. Let us know if you have any further questions or concerns.
Thanks. I think it is an outdated issue now. Please proceed. @dosu-beta
Thank you @firezym for closing the issue! Your contribution is greatly appreciated.