
Feature mute stream log for info level

Open azurewtl opened this issue 10 months ago • 3 comments

Features See issue #1109: llm stream_response is always printed regardless of the debug flag. I modified logs.py rather than config2.py, because the log level is set via define_log_level rather than config.yaml. To keep things simple, a global variable _print_level is introduced in logs.py.

Influence: define_log_level(print_level="INFO") will mute log_llm_stream.
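The change described above could look like the following minimal sketch. The names define_log_level, log_llm_stream, and _print_level come from the issue; the function bodies are assumptions (MetaGPT's real implementation configures loguru, which is omitted here):

```python
# Minimal sketch of the proposed logs.py change: a module-level
# _print_level remembers the console level, and log_llm_stream
# only prints streamed LLM tokens when that level is DEBUG.

_print_level = "INFO"  # default console print level

def define_log_level(print_level: str = "INFO", logfile_level: str = "DEBUG"):
    """Record the console print level so log_llm_stream can consult it."""
    global _print_level
    _print_level = print_level
    # ... configure the underlying logger (loguru in MetaGPT) here ...

def log_llm_stream(msg: str):
    """Print streamed LLM tokens only at DEBUG print level."""
    if _print_level == "DEBUG":
        print(msg, end="")

define_log_level(print_level="INFO")
log_llm_stream("hidden token")   # muted at INFO
define_log_level(print_level="DEBUG")
log_llm_stream("visible token")  # printed at DEBUG
```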

azurewtl avatar Mar 27 '24 10:03 azurewtl

@azurewtl Thanks for your improvements. It works when I call define_log_level(print_level="DEBUG") in my Python script entry if I want to print the LLM stream log. However, I'm considering whether to provide a simpler way to set it up, perhaps by passing environment variables METAGPT_LOG_PRINT_LEVEL and METAGPT_LOG_FILE_LEVEL?
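The environment-variable idea above might be sketched like this. METAGPT_LOG_PRINT_LEVEL and METAGPT_LOG_FILE_LEVEL are the names proposed in the comment; the helper function and its defaults are assumptions for illustration:

```python
import os

def log_levels_from_env():
    """Read the proposed env vars, falling back to assumed defaults.

    Hypothetical helper; only the variable names are from the discussion.
    """
    print_level = os.environ.get("METAGPT_LOG_PRINT_LEVEL", "INFO")
    file_level = os.environ.get("METAGPT_LOG_FILE_LEVEL", "DEBUG")
    return print_level, file_level

# The result could then be fed to define_log_level(print_level=..., logfile_level=...)
```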

shenchucheng avatar Mar 28 '24 03:03 shenchucheng

> @azurewtl Thanks for your improvements. It works when I call define_log_level(print_level="DEBUG") in my Python script entry if I want to print the LLM stream log. However, I'm considering whether to provide a simpler way to set it up, perhaps by passing environment variables METAGPT_LOG_PRINT_LEVEL and METAGPT_LOG_FILE_LEVEL?

I thought about using config. However, my proposal is that the debug flag should be passed via the command line. Oftentimes, if I want to see debug info during development, I run the program and pass a debug flag, such as:

python run.py --debug

Maybe YAML is better suited to storing config that is NOT changed on an ad-hoc basis. Otherwise, whenever I want to show less or more info, I have to change the config file.
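The command-line approach above could be wired up roughly like this. The --debug flag comes from the example invocation; the mapping to define_log_level is an assumption:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument(
    "--debug",
    action="store_true",
    help="print DEBUG-level logs, including the LLM stream",
)

# Simulate `python run.py --debug`; in a real entry point this
# would be parser.parse_args() with no argument.
args = parser.parse_args(["--debug"])
level = "DEBUG" if args.debug else "INFO"
# define_log_level(print_level=level)  # from metagpt.logs
```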

azurewtl avatar Mar 28 '24 06:03 azurewtl

@shenchucheng I think this is a new requirement and can be discussed separately. This PR can be merged first. @azurewtl If there are new changes, submit another PR.

geekan avatar Mar 28 '24 06:03 geekan

Anyway, I’ll merge first

geekan avatar Apr 05 '24 13:04 geekan