[Feature] How to evaluate models served with ollama

Open 18600709862 opened this issue 9 months ago • 2 comments

Describe the feature

Serving local models with ollama is very convenient. How can this kind of API be evaluated? Could you provide an example?

Would you like to implement this feature yourself?

  • [ ] I would like to implement this feature myself and contribute the code to OpenCompass!

18600709862 avatar Apr 26 '24 05:04 18600709862

There are quite a few examples of API model evaluation in opencompass/opencompass/models, e.g. https://github.com/open-compass/opencompass/blob/main/opencompass/models/bytedance_api.py. You can try implementing a new class there and write its generate function, so that the new API model class can then be referenced from your config.
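For reference, a rough, untested sketch of what such a class could look like, assuming the BaseAPIModel constructor arguments used by the other API wrappers and Ollama's native /api/generate endpoint; the class name OllamaAPI and the default url are only illustrative:

from typing import List

import requests

from opencompass.models.base_api import BaseAPIModel


class OllamaAPI(BaseAPIModel):
    """Minimal wrapper for a model served by a local Ollama instance."""

    def __init__(self,
                 path: str,  # Ollama model name, e.g. 'qwen2.5:7b'
                 url: str = 'http://localhost:11434/api/generate',
                 max_seq_len: int = 4096,
                 query_per_second: int = 2,
                 retry: int = 2,
                 **kwargs):
        super().__init__(path=path,
                         max_seq_len=max_seq_len,
                         query_per_second=query_per_second,
                         retry=retry,
                         **kwargs)
        self.url = url

    def generate(self, inputs: List[str], max_out_len: int = 512) -> List[str]:
        # Assumes plain-string prompts; a full implementation would also handle
        # PromptList inputs, retries and rate limiting like the other API models.
        outputs = []
        for prompt in inputs:
            resp = requests.post(
                self.url,
                json={
                    'model': self.path,
                    'prompt': prompt,
                    'stream': False,
                    'options': {'num_predict': max_out_len},
                },
                timeout=300)
            resp.raise_for_status()
            outputs.append(resp.json().get('response', ''))
        return outputs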

bittersweet1999 avatar Apr 28 '24 14:04 bittersweet1999

@18600709862 Did you manage to implement it? Could you share the code?

jiusi9 avatar Jul 30 '24 02:07 jiusi9

Same here, I'd also like an example.

mdjhacker avatar Oct 13 '24 09:10 mdjhacker

I got it working, but the results are different on every run. Has anyone run into this?

luhairong11 avatar Oct 24 '24 01:10 luhairong11

I got it working too, and my results are the same across runs. I adapted the OpenAI API wrapper in opencompass: change the OpenAI endpoint to os.environ.get('OPENAI_BASE_URL', 'http://localhost:11434/v1/'), and here is a usage example:

from opencompass.models import OpenAI

api_meta_template = dict(round=[
    dict(role='HUMAN', api_role='HUMAN'),
    dict(role='BOT', api_role='BOT', generate=True),
], )

models = [
    dict(
        abbr='Ollama_Qwen25_7b',
        type=OpenAI,
        path='qwen2.5:7b',
        key='ollama',  # The key will be obtained from $OPENAI_API_KEY, but you can write down your key here as well
        meta_template=api_meta_template,
        query_per_second=5,
        max_out_len=2048,
        max_seq_len=4096,
        batch_size=64),
]
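For completeness, a model config like this is typically combined with a dataset list in a full eval config and launched with run.py. A minimal sketch, assuming the usual read_base-style configs; the file name and the gsm8k dataset are just examples:

# configs/eval_ollama_qwen.py  (hypothetical file name)
from mmengine.config import read_base

from opencompass.models import OpenAI

with read_base():
    # any dataset config shipped with opencompass works here; gsm8k is only an example
    from .datasets.gsm8k.gsm8k_gen import gsm8k_datasets

datasets = [*gsm8k_datasets]

api_meta_template = dict(round=[
    dict(role='HUMAN', api_role='HUMAN'),
    dict(role='BOT', api_role='BOT', generate=True),
])

models = [
    dict(
        abbr='Ollama_Qwen25_7b',
        type=OpenAI,
        path='qwen2.5:7b',
        key='ollama',
        meta_template=api_meta_template,
        query_per_second=5,
        max_out_len=2048,
        max_seq_len=4096,
        batch_size=64),
]

# then launch with: python run.py configs/eval_ollama_qwen.py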

mdjhacker avatar Oct 24 '24 01:10 mdjhacker

You didn't even set the temperature parameter; it's a bit odd that your results still come out the same.
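If the OpenAI wrapper you adapted forwards a temperature argument to the endpoint (worth double-checking in the class you modified), pinning it to 0 in the model dict should make runs close to deterministic, e.g. the same config as above plus:

models = [
    dict(
        abbr='Ollama_Qwen25_7b',
        type=OpenAI,
        path='qwen2.5:7b',
        key='ollama',
        meta_template=api_meta_template,
        temperature=0.0,  # assumed to be forwarded to the API; makes decoding (near-)greedy
        query_per_second=5,
        max_out_len=2048,
        max_seq_len=4096,
        batch_size=64),
]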

luhairong11 avatar Oct 24 '24 01:10 luhairong11

My implementation is as follows; could you take a look and see what's different? https://github.com/open-compass/opencompass/issues/1634

luhairong11 avatar Oct 24 '24 02:10 luhairong11