Automatically build the Docker image and push it to ghcr

Open reonokiy opened this issue 2 years ago • 4 comments

This covers two main pieces of work:

  • #489 Use GitHub Actions to build the image from the Dockerfile and push it to ghcr.io
  • #652 Read configuration from environment variables

Since Docker projects are usually configured through environment variables or an env_file, after setting up the GitHub Actions image build I also modified config.py to read its settings from environment variables. The environment-variable handling is still fairly rough, though, and may need further changes to fit the project.

Here is an example of deploying the service with Docker Compose:

# docker-compose.yml
version: '3'
services:
  gpt_academic:
    image: ghcr.io/sperjar/gpt_academic:master
    container_name: gpt_academic
    environment:
      GPT_ACADEMIC_API_KEY: your-api-key-here
      GPT_ACADEMIC_WEB_PORT: 10054
      # more config
    ports:
      - 10054:10054

reonokiy avatar Apr 30 '23 05:04 reonokiy

Hi, could you remove the changes from config.py and move them into get_conf instead? We would like to keep config.py simple and easy to extend.

https://github.com/binary-husky/gpt_academic/blob/9d3b01af754a0a950bde4c58ecb91a044e32b889/toolbox.py#LL520C1-L526C60

For example, change

    try:
        r = getattr(importlib.import_module('config_private'), arg)
    except:
        r = getattr(importlib.import_module('config'), arg)

to

    try:
        r = os.environ[arg]  # read the environment variable named `arg` first
    except KeyError:
        try:
            r = getattr(importlib.import_module('config_private'), arg)
        except:
            r = getattr(importlib.import_module('config'), arg)

binary-husky avatar Apr 30 '23 06:04 binary-husky

Should I handle the formatting of the environment variables directly inside the read_single_conf_with_lru_cache function? It feels like every non-str type would then need its own check again, which would make the function rather bloated. Or should I create a new config_env.py that converts the environment variables into config.py's format and import that here instead?

reonokiy avatar Apr 30 '23 06:04 reonokiy

Should I handle the formatting of the environment variables directly inside the read_single_conf_with_lru_cache function? It feels like every non-str type would then need its own check again, which would make the function rather bloated. Or should I create a new config_env.py that converts the environment variables into config.py's format and import that here instead?

The string can be converted according to the data type of the default value. The function below is code I use in another project; for data types that aren't needed here, you can simply raise an error instead.

def auto_convertion(replace_item, original_item):
    # replace_item: the value read from the environment variable (a string)
    # original_item: the default value from config.py
    if isinstance(original_item, float):
        replace_item = float(replace_item)
    elif isinstance(original_item, bool):
        if replace_item == 'True':
            replace_item = True
        elif replace_item == 'False':
            replace_item = False
        elif isinstance(replace_item, bool):
            pass
        else:
            assert False, ('enter True or False, but have:', replace_item)
    elif isinstance(original_item, int):
        assert int(replace_item) == float(replace_item), "warning, this var has an int default, but is given a non-integer override: %s" % replace_item
        replace_item = int(replace_item)
    elif isinstance(original_item, str):
        pass
    elif isinstance(original_item, list):
        assert isinstance(replace_item, list)
    elif isinstance(original_item, dict):
        assert isinstance(replace_item, dict)
    else:
        assert False, ('this type is not supported:', type(original_item))
    return replace_item
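
For clarity, a quick usage sketch of the helper above (hypothetical calls; the first argument is the raw environment-variable string, the second is the config.py default):

# the env var always arrives as a string; the config.py default decides the target type
auto_convertion('10054', 10054)           # -> 10054 (int)
auto_convertion('True', False)            # -> True (bool)
auto_convertion('0.25', 1.0)              # -> 0.25 (float)
auto_convertion('your-api-key-here', '')  # -> 'your-api-key-here' (str, passed through)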

binary-husky avatar Apr 30 '23 08:04 binary-husky

The changes should be done now; the main logic is here: https://github.com/binary-husky/gpt_academic/pull/662/commits/e5e3e0aa43d8db8b5d731dbabcb1f44d1fed7808#diff-71e5cf52ccb5b80caa00b86ae70d9e941a3e5cdb8279d7cb5c197c636cd79ee4R521 A prefix was added to the environment variable names in the code to avoid conflicts with other existing projects.
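
Roughly, the lookup order is: a prefixed environment variable first, then config_private.py, then config.py. A simplified sketch of that idea (an illustration only, not the exact code in the linked commit, which also does type conversion):

import importlib
import os

def read_single_conf_sketch(arg):
    # prefer e.g. GPT_ACADEMIC_WEB_PORT over WEB_PORT so the override cannot
    # collide with variables already set by other software on the host
    env_name = 'GPT_ACADEMIC_' + arg
    if env_name in os.environ:
        return os.environ[env_name]
    # otherwise fall back to config_private.py, then config.py
    try:
        return getattr(importlib.import_module('config_private'), arg)
    except (ImportError, AttributeError):
        return getattr(importlib.import_module('config'), arg)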

I ran a quick test and it works, but I have not tested every individual value specifically.

# docker-compose.yml
version: '3'
services:
  gpt_academic:
    image: ghcr.io/sperjar/gpt_academic:master
    container_name: gpt_academic
    environment:
      GPT_ACADEMIC_API_KEY: your-api-key-here
      GPT_ACADEMIC_WEB_PORT: 10054
      # more config
    ports:
      - 10054:10054

The README and related docs still need to be updated afterwards, since configuring through environment variables differs somewhat from editing config.py directly.

reonokiy avatar Apr 30 '23 10:04 reonokiy

The changes should be done now; the main logic is here: e5e3e0a#diff-71e5cf52ccb5b80caa00b86ae70d9e941a3e5cdb8279d7cb5c197c636cd79ee4R521 A prefix was added to the environment variable names in the code to avoid conflicts with other existing projects.

I ran a quick test and it works, but I have not tested every individual value specifically.

The README and related docs still need to be updated afterwards, since configuring through environment variables differs somewhat from editing config.py directly.

Thanks, let me adjust the comments a bit first.

binary-husky avatar May 01 '23 15:05 binary-husky

This way of writing it is a bit simpler and clearer. I tested it in Windows cmd and it works fine; could you try whether it works the same way with docker compose?

    The environment variable can be named `GPT_ACADEMIC_CONFIG` (takes priority) or simply `CONFIG`, where CONFIG is the option name.
    For example, in Windows cmd you can either write:
        set USE_PROXY=True
        set API_KEY=sk-j7caBpkRoxxxxxxxxxxxxxxxxxxxxxxxxxxxx
        set proxies={"http":"http://127.0.0.1:10085", "https":"http://127.0.0.1:10085",}
        set AVAIL_LLM_MODELS=["gpt-3.5-turbo", "chatglm"]
        set AUTHENTICATION=[("username", "password"), ("username2", "password2")]
    or write:
        set GPT_ACADEMIC_USE_PROXY=True
        set GPT_ACADEMIC_API_KEY=sk-j7caBpkRoxxxxxxxxxxxxxxxxxxxxxxxxxxxx
        set GPT_ACADEMIC_proxies={"http":"http://127.0.0.1:10085", "https":"http://127.0.0.1:10085",}
        set GPT_ACADEMIC_AVAIL_LLM_MODELS=["gpt-3.5-turbo", "chatglm"]
        set GPT_ACADEMIC_AUTHENTICATION=[("username", "password"), ("username2", "password2")]
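
In other words, the lookup presumably tries the prefixed name first and then falls back to the bare name; a minimal sketch of that priority (hypothetical helper, not the actual toolbox.py code):

import os

def env_override(arg):
    # GPT_ACADEMIC_<ARG> wins over <ARG>; return None if neither is set
    for name in ('GPT_ACADEMIC_' + arg, arg):
        if name in os.environ:
            return os.environ[name]
    return None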

binary-husky avatar May 02 '23 06:05 binary-husky

I have never used docker-compose before; is this how it is written? (Not pretty, but it seems to work.)

version: '3'
services:
  gpt_academic:
    image: fuqingxu/gpt_academic:no-local-llms
    container_name: gpt_academic
    environment:
      # see `config.py` all configuration options !
      API_KEY:                  '     your-api-key-here                                                      '
      WEB_PORT:                 '     10054                                                                  '
      proxies:                  '     {"http":"http://127.0.0.1:10085", "https":"http://127.0.0.1:10085",}   '
      AVAIL_LLM_MODELS:         '     ["gpt-3.5-turbo", "chatglm"]                                           '
      AUTHENTICATION:           '     [("username", "password"), ("username2", "password2")]                 '

    ports:
      - 10054:10054

binary-husky avatar May 02 '23 07:05 binary-husky

In YAML, strings without special characters can also go unquoted, but since your config quotes API_KEY, don't pad the value with extra spaces: ~' your-api-key-here '~

version: '3'
services:
  gpt_academic:
    image: fuqingxu/gpt_academic:no-local-llms
    container_name: gpt_academic
    environment:
      # see `config.py` all configuration options !
      API_KEY: 'sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx' # your-api-key-here
      WEB_PORT: '10054'
      proxies: '{"http":"http://198.18.0.1:7890", "https":"http://198.18.0.1:7890",}'
      AVAIL_LLM_MODELS: '["gpt-3.5-turbo", "chatglm"]'
      AUTHENTICATION: '[("username", "password"), ("username2", "password2")]'

    ports:
      - 10054:10054
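
One detail worth noting: every value coming from an environment override is a plain string, so list/dict/tuple settings such as proxies, AVAIL_LLM_MODELS or AUTHENTICATION have to be parsed back into Python objects on the config side. A conservative way to do that (just a sketch of the idea, not necessarily what the PR implements) is ast.literal_eval driven by the type of the config.py default:

import ast

def parse_env_value(raw, default):
    # raw: the environment-variable string; default: the value from config.py
    if isinstance(default, str):
        return raw  # strings pass through untouched
    # parse everything else as a Python literal: True, 10054, 0.25,
    # ["gpt-3.5-turbo", "chatglm"], {"http": "..."}, [("user", "pass")], ...
    value = ast.literal_eval(raw)
    if isinstance(default, bool):
        assert isinstance(value, bool), 'expected True or False, got %r' % raw
    elif isinstance(default, (int, float)):
        value = type(default)(value)  # accept "10054" for an int default, "0.25" for a float default
    else:
        assert isinstance(value, type(default)), 'expected %s, got %r' % (type(default).__name__, raw)
    return value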

You can also use env_file to load the variables from a file; if there are a lot of settings this keeps them separate:

# test.env
API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
WEB_PORT=10054
proxies='{"http":"http://198.18.0.1:7890", "https":"http://198.18.0.1:7890",}'
AVAIL_LLM_MODELS='["gpt-3.5-turbo", "chatglm"]'
AUTHENTICATION='[("username", "password"), ("username2", "password2")]'

# docker-compose.yml
version: '3'
services:
  gpt_academic:
    image: fuqingxu/gpt_academic:no-local-llms
    container_name: gpt_academic
    env_file:
      - test.env
    ports:
      - 10054:10054
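
With both files in the same directory, docker compose should pick up test.env automatically when the stack is started, since env_file paths are resolved relative to the compose file.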

One more point to note: unless the container runs in host network mode, the proxy address must be one reachable from inside the container (e.g. the LAN or gateway address), not localhost on the host machine.

reonokiy avatar May 02 '23 12:05 reonokiy