autogen
[Bug]: TypeError: Object of type WindowsPath is not JSON serializable
Describe the bug
I want to log the output of the agent using autogen.runtime_logging, but it fails because a WindowsPath object cannot be serialized.
Steps to reproduce
Step 1: run the following code:
import autogen
from autogen import AssistantAgent, UserProxyAgent
import os
import tempfile
os.environ["OPENAI_API_KEY"] = "APIKEY"
llm_config = {"model": "gpt-3.5-turbo", "api_key": os.environ["OPENAI_API_KEY"]}
input='''
How to construct a network with two inputs in PyTorch?
"<p>Suppose I want to have the general neural network architecture:</p>
<pre><code>Input1 --> CNNLayer
                    \
                     ---> FCLayer ---> Output
                    /
Input2 --> FCLayer
</code></pre>
<p>Input1 is image data, input2 is non-image data. I have implemented this architecture in Tensorflow.</p>
<p>All pytorch examples I have found are one input go through each layer. How can I define forward func to process 2 inputs separately then combine them in a middle layer? </p>
"
'''
# Start logging
logging_session_id = autogen.runtime_logging.start(config={"dbname": "logs.db"})
print("Logging session ID: " + str(logging_session_id))
temp_dir = tempfile.TemporaryDirectory()
code_executor = autogen.coding.LocalCommandLineCodeExecutor(
    timeout=10,
    work_dir=temp_dir.name,
)
assistant = AssistantAgent("assistant", llm_config=llm_config)
user_proxy = UserProxyAgent(
    name="user_proxy",
    code_execution_config={"executor": code_executor},
    human_input_mode="NEVER",
    is_termination_msg=lambda msg: "TERMINATE" in msg["content"],
)
# Start the chat
user_proxy.initiate_chat(
    assistant,
    message=input,
)
autogen.runtime_logging.stop()
Step 2: observe this traceback:
Traceback (most recent call last):
File "g:\code\@LLMGuideCodeGen\autogen_test.py", line 41, in <module>
user_proxy = UserProxyAgent(
^^^^^^^^^^^^^^^
File "C:\Users\moshi\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\autogen\agentchat\user_proxy_agent.py", line 83, in __init__
super().__init__(
File "C:\Users\moshi\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\autogen\agentchat\conversable_agent.py", line 145, in __init__
log_new_agent(self, locals())
File "C:\Users\moshi\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\autogen\runtime_logging.py", line 62, in log_new_agent
autogen_logger.log_new_agent(agent, init_args)
File "C:\Users\moshi\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\autogen\logger\sqlite_logger.py", line 244, in log_new_agent
json.dumps(args),
^^^^^^^^^^^^^^^^
File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.11_3.11.2544.0_x64__qbz5n2kfra8p0\Lib\json\__init__.py", line 231, in dumps
return _default_encoder.encode(obj)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.11_3.11.2544.0_x64__qbz5n2kfra8p0\Lib\json\encoder.py", line 200, in encode
chunks = self.iterencode(o, _one_shot=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.11_3.11.2544.0_x64__qbz5n2kfra8p0\Lib\json\encoder.py", line 258, in iterencode
return _iterencode(o, 0)
^^^^^^^^^^^^^^^^^
File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.11_3.11.2544.0_x64__qbz5n2kfra8p0\Lib\json\encoder.py", line 180, in default
raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type WindowsPath is not JSON serializable
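The failure is easy to reproduce in isolation: `json.dumps` has no default handling for `pathlib.Path` objects (`WindowsPath` on Windows, `PosixPath` elsewhere). A minimal sketch of the problem and a `default=str` workaround:

```python
import json
from pathlib import Path

payload = {"work_dir": Path("logs") / "session"}

# json.dumps raises TypeError for Path objects out of the box
try:
    json.dumps(payload)
except TypeError as e:
    print(e)  # e.g. "Object of type PosixPath is not JSON serializable"

# Workaround: fall back to str() for any value json cannot encode
print(json.dumps(payload, default=str))
```

This is only a demonstration of the underlying error; the logger inside autogen calls `json.dumps` itself, so the real fix has to land in the library's serialization code.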
Model Used
gpt-3.5-turbo; gpt-4
Expected Behavior
It should execute without error.
Screenshots and logs
No response
Additional Information
Python version: 3.11
Thanks @Moshiii, I'll look into improving logging and also fix this serialization issue early next week.
Thank you so much for the prompt reply. Also, I am a researcher on software engineering topics. Is there any way to have a quick chat with the team about collaboration opportunities? Thank you!
Join our discord and find us :D
Hi @Moshiii, I looked a bit closer into this. Would excluding WindowsPath, or dumping WindowsPath as a string, when serializing the agent unblock you?
I'm not sure what you mean. Can you share a piece of code I can run on my end?
You can add the WindowsPath object here: https://github.com/microsoft/autogen/blob/main/autogen/logger/sqlite_logger.py#L224. Adding the field to exclude will skip serializing that field; adding the object type to no_recursive will serialize it to a string.
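The exclude / no_recursive idea described above can be sketched roughly like this (a hypothetical stand-alone helper for illustration; the real implementation in sqlite_logger.py may differ):

```python
import json
from pathlib import Path

def clean_init_args(args: dict, exclude=("api_key",), no_recursive=(Path,)) -> dict:
    """Drop excluded fields and stringify types we do not want to recurse into."""
    cleaned = {}
    for key, value in args.items():
        if key in exclude:
            continue  # skip fields that should never be logged
        if isinstance(value, no_recursive):
            cleaned[key] = str(value)  # e.g. WindowsPath -> "C:\\tmp\\work"
        else:
            cleaned[key] = value
    return cleaned

args = {"work_dir": Path("tmp") / "work", "timeout": 10, "api_key": "secret"}
print(json.dumps(clean_init_args(args)))
```

With this shape of helper, the agent's init args become safe to pass to `json.dumps` because any Path is flattened to a string first.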
Thanks, I'll check it out.
@Moshiii did you manage to solve the issue? I am facing the same error currently and am not sure how to solve it!
I am also facing the same error and am not sure how to solve it! It occurs when the tool-call agent replies with a list[BaseModel].
tools call function:
Book Model:
tool call tool_responses:
Please help me
I have tested the function and it works correctly on its own, but it reports the serialization error when it is called as a tool executor and replies. @cheng-tan
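For the list[BaseModel] case, one workaround is to have the tool return plain dicts instead of model instances, so the logger's `json.dumps` succeeds. Sketched here with a stdlib dataclass standing in for the user's Book model (with Pydantic v2, `model_dump()` plays the role of `asdict()`):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Book:  # hypothetical model standing in for the Book BaseModel above
    title: str
    pages: int

def find_books() -> list[dict]:
    books = [Book(title="Fluent Python", pages=1012)]
    # Convert to plain dicts before returning from the tool so the
    # runtime logger can serialize the tool response with json.dumps.
    return [asdict(b) for b in books]

print(json.dumps(find_books()))
```

Returning serializable data from the tool sidesteps the logger's limitation without touching library code.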