Enable streaming support for openai ChatCompletion #217

Open · Alvaromah opened this pull request 2 years ago • 10 comments

Why are these changes needed?

The OpenAI API, along with other LLM frameworks, offers streaming capabilities that enhance debugging workflows by eliminating the need to wait for complete responses, resulting in a more efficient and time-saving process.

This is a simple and minimally intrusive mechanism to support streaming.

To enable streaming, just use this code:

llm_config={
    "config_list": config_list,
    # Enable or disable streaming (defaults to False)
    "stream": True,
}

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config=llm_config,
)
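
For context, a minimal end-to-end sketch of how this option might be exercised (assuming the usual OAI_CONFIG_LIST setup; the agent names and prompt are illustrative):

import autogen

# Load the model configurations; assumes an OAI_CONFIG_LIST file or env var.
config_list = autogen.config_list_from_json("OAI_CONFIG_LIST")

llm_config = {
    "config_list": config_list,
    # Streaming flag introduced by this PR (defaults to False).
    "stream": True,
}

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config=llm_config,
)

# A user proxy that never prompts for input and does not execute code,
# so the streamed tokens are easy to observe in the console.
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config=False,
)

# With "stream": True, the assistant's reply should print token by token
# instead of arriving as a single completed message.
user_proxy.initiate_chat(assistant, message="Write a haiku about streaming.")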

Related issue number

Related to #217

Checks

  • [ ] I've included any doc changes needed for https://microsoft.github.io/autogen/. See https://microsoft.github.io/autogen/docs/Contribute#documentation to build and test documentation locally.
  • [ ] I've added tests (if relevant) corresponding to the changes introduced in this PR.
  • [ ] I've made sure all auto checks have passed.

Alvaromah avatar Oct 28 '23 18:10 Alvaromah

Thanks. Are you aware of #393 and #203 ? We'll switch to openai v1 soon. Is it better to develop streaming support based on the newer version?

sonichi avatar Oct 28 '23 21:10 sonichi

As you know, OpenAI v1.0 is a total rewrite of the library with many breaking changes. The most impactful changes I've observed include the following (sketched below):

  • You must instantiate a client, instead of using a global default.
  • The Azure API shape differs from the core API.
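
For illustration, a hedged sketch of the shift from the module-level v0.x API to the v1.0 client objects (keys, endpoint, and model name are placeholders):

# openai < 1.0: global, module-level configuration
import openai
openai.api_key = "sk-..."
old_response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}],
)

# openai >= 1.0: an explicit client instance, with a separate Azure client type
from openai import OpenAI, AzureOpenAI

client = OpenAI(api_key="sk-...")
new_response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}],
)

azure_client = AzureOpenAI(
    api_key="...",
    api_version="2023-07-01-preview",
    azure_endpoint="https://my-resource.openai.azure.com",
)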

These changes may require adjustments in the AutoGen codebase. However, once these updates are in place, implementing streaming should be straightforward, based on my testing.

On the other hand, I haven't noticed significant changes in streaming behavior in v1.0, aside from the new response structure it returns.
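
As an illustration of that new structure, a minimal sketch of consuming a streamed response with the v1.0 client (model name is a placeholder):

from openai import OpenAI

client = OpenAI()

# With stream=True the v1.0 client yields ChatCompletionChunk objects;
# the incremental text now lives in chunk.choices[0].delta.content
# (attribute access) rather than in a plain dict as in v0.x.
stream = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)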

Alvaromah avatar Oct 29 '23 13:10 Alvaromah

That's great. The changes are already in place in the PR branch #393. The core functionality works with openai v1.0 in that branch. If you could experiment with streaming support based on that branch, we might have a chance to integrate it into v0.2 before its release.

sonichi avatar Oct 29 '23 14:10 sonichi

@Alvaromah I get the error below when I try to run this. I'm not running openai 1.0.

ERROR:root:Message can't be converted into a valid ChatCompletion message. Either content or function_call must be provided.
Traceback (most recent call last):
  File "/Users/ragy/Documents/RNA/GitHub.nosync/agentcloud/agent-backend/src/utils/log_exception_context_manager.py", line 17, in log_exception
    yield
  File "/Users/ragy/Documents/RNA/GitHub.nosync/agentcloud/agent-backend/src/agents/base.py", line 107, in init_socket_generate_team
    user_proxy.initiate_chat(
  File "/Users/ragy/Library/Caches/pypoetry/virtualenvs/agent-backend-rnB-mn_B-py3.10/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 594, in initiate_chat
    self.send(self.generate_init_message(**context), recipient, silent=silent)
  File "/Users/ragy/Library/Caches/pypoetry/virtualenvs/agent-backend-rnB-mn_B-py3.10/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 346, in send
    raise ValueError(
ValueError: Message can't be converted into a valid ChatCompletion message. Either content or function_call must be provided.
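
For reference, that ValueError means the message handed to the agent has neither a "content" nor a "function_call" field; a minimal sketch of the shapes involved (values are illustrative):

# Messages AutoGen can convert into a ChatCompletion message carry at least
# one of these fields:
valid_text_message = {"role": "user", "content": "Summarize the report."}
valid_function_message = {
    "role": "assistant",
    "content": None,
    "function_call": {"name": "get_weather", "arguments": '{"city": "Paris"}'},
}

# A message with neither field triggers the error above:
invalid_message = {"role": "user"}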

ragyabraham avatar Oct 30 '23 01:10 ragyabraham

Could you please provide sample code to reproduce this error? Are you using functions?

Thanks

Alvaromah avatar Oct 30 '23 16:10 Alvaromah

I made some changes in the streaming implementation to support more scenarios. However, this version is not compatible with OpenAI v1.0. I am preparing another PR for dev/v0.2.

Alvaromah avatar Oct 30 '23 21:10 Alvaromah

Created PR #491 for dev/v0.2, tested on openai v1.0.0b3.

Alvaromah avatar Oct 30 '23 22:10 Alvaromah

Sorry, it was actually an issue on my end. Working well for me now. Thank you

ragyabraham avatar Oct 31 '23 04:10 ragyabraham

Codecov Report

Merging #465 (3afc70d) into main (5694486) will decrease coverage by 41.35%. Report is 2 commits behind head on main. The diff coverage is 5.00%.

@@            Coverage Diff             @@
##             main    #465       +/-   ##
==========================================
- Coverage   42.59%   1.25%   -41.35%     
==========================================
  Files          21      21               
  Lines        2526    2559       +33     
  Branches      566     573        +7     
==========================================
- Hits         1076      32     -1044     
- Misses       1359    2527     +1168     
+ Partials       91       0       -91     
Flag        Coverage Δ
unittests   1.25% <5.00%> (-41.27%) ↓

Flags with carried forward coverage won't be shown.

Files                                   Coverage Δ
autogen/oai/completion.py               2.15% <100.00%> (-16.38%) ↓
autogen/oai/chat_completion_proxy.py    2.56% <2.56%> (ø)

... and 19 files with indirect coverage changes

codecov-commenter avatar Oct 31 '23 10:10 codecov-commenter

You are amazing!!!

Louis2602 avatar Nov 21 '23 14:11 Louis2602