ipex-llm
Add AutoGen CPU and GPU example
1. Why the change?
To support AutoGen agent chat with models loaded via BigDL-LLM on Intel GPU and CPU.
2. Summary of the change
- Add documentation on how to run AutoGen against a local LLM served with BigDL-LLM on Intel GPU and CPU.
- Add the corresponding agent chat Python example file.
- Change the FastChat package to support BigDL-LLM model loading.
- Record encountered errors along with their possible causes and fixes.
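As a rough sketch of how an AutoGen agent chat can target such a locally served model: AutoGen takes an `llm_config` that points at an OpenAI-compatible endpoint, which a FastChat server exposing a BigDL-LLM-loaded model can provide. The helper function, endpoint URL, and model name below are illustrative placeholders, not taken from the example itself.

```python
# Hypothetical helper: build an AutoGen llm_config targeting a local
# OpenAI-compatible endpoint (e.g. one exposed by FastChat serving a
# BigDL-LLM-loaded model). URL and model name are placeholders.
def make_local_llm_config(model_name: str,
                          base_url: str = "http://localhost:8000/v1") -> dict:
    return {
        "config_list": [
            {
                "model": model_name,   # name the local server registers
                "base_url": base_url,  # local OpenAI-compatible API
                "api_key": "EMPTY",    # local servers typically ignore the key
            }
        ],
        "temperature": 0,
    }

llm_config = make_local_llm_config("bigdl-llm-local-model")
```

The resulting dict would then be passed to an AutoGen agent (e.g. `autogen.AssistantAgent(..., llm_config=llm_config)`) so the chat runs entirely against the local serving stack.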
3. How to run the example?
See BigDL/python/llm/example/GPU/Applications/autogen/README.md for the GPU example; the CPU one can be found at BigDL/python/llm/example/CPU/Applications/autogen/README.md.