
How to integrate with my private LLM model

RamPageMz opened this issue 1 year ago · 3 comments

I'm new to this product. I scanned the documentation and found that we can configure ~/.metagpt/config2.yaml to use one of a list of popular LLM models, such as GPT, Claude, and so on.

My team has trained a private LLM model and deployed it on a local machine. How do I configure the project to talk to my model? Is there any doc about that?

RamPageMz avatar Jul 01 '24 03:07 RamPageMz

You can refer to https://docs.deepwisdom.ai/main/en/guide/get_started/configuration/llm_api_configuration.html#ollama-api
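For a locally served model, the linked page describes pointing the `llm` section of `~/.metagpt/config2.yaml` at your own endpoint. A hedged sketch is below; the field names follow that documentation page, but the model name and port are placeholders for your deployment, and the exact keys may differ between MetaGPT versions, so check the docs for the release you are running:

```yaml
# ~/.metagpt/config2.yaml -- values are placeholders, not defaults
llm:
  api_type: "ollama"                      # or another locally served, OpenAI-compatible type
  base_url: "http://127.0.0.1:11434/api"  # wherever your private model is served
  model: "my-private-llm"                 # the model name your server exposes
```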

voidking avatar Jul 02 '24 11:07 voidking

same question

xinyuzhidad avatar Jul 02 '24 12:07 xinyuzhidad

The issue has been fixed. I created a new wrapper for my LLM.
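One common way to build such a wrapper (the author doesn't share theirs) is to put an OpenAI-compatible HTTP shim in front of the private model, so any client expecting the `/v1/chat/completions` shape can talk to it. A hypothetical stdlib-only sketch; the placeholder reply is where a call into your actual model would go:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class ChatHandler(BaseHTTPRequestHandler):
    """Minimal OpenAI-compatible /v1/chat/completions shim."""

    def do_POST(self):
        if self.path != "/v1/chat/completions":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length))
        # Placeholder: replace with a real call into your private model.
        user_text = request["messages"][-1]["content"]
        reply = f"(model output for: {user_text})"
        body = json.dumps({
            "id": "chatcmpl-local",
            "object": "chat.completion",
            "model": request.get("model", "private-llm"),
            "choices": [{
                "index": 0,
                "message": {"role": "assistant", "content": reply},
                "finish_reason": "stop",
            }],
            "usage": {"prompt_tokens": 0, "completion_tokens": 0,
                      "total_tokens": 0},
        }).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep request logging quiet

def serve(port: int = 8000) -> HTTPServer:
    """Start the shim in a background thread and return the server."""
    server = HTTPServer(("127.0.0.1", port), ChatHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

With this running, an OpenAI-style client can be pointed at `http://127.0.0.1:8000/v1` as its base URL.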

BTW, I'm stuck on another issue: https://github.com/geekan/MetaGPT/issues/1390 (a role cannot receive messages from another role). Can anyone help?

RamPageMz avatar Jul 11 '24 09:07 RamPageMz

Closing this issue due to prolonged inactivity. If there's any further information or assistance needed, feel free to reopen the issue or create a new one. Thanks!

shenchucheng avatar Oct 10 '24 12:10 shenchucheng