David Gilbertson
Thanks @sgugger, I just tried that. There's still plenty of noise from code in HF (datasets) like this:

```py
logger.warning(f"Loading cached processed dataset at {cache_file_name}")
```

Also, `Trainer` will call...
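For anyone else hitting this noise in the meantime, a minimal sketch of silencing a library's logger via the stdlib `logging` module (the logger name `"datasets"` is an assumption here; the `datasets` package also ships its own `set_verbosity_*` helpers):

```py
import logging

# Raise the threshold on the library's top-level logger so that
# "Loading cached..." warnings are filtered out.
datasets_logger = logging.getLogger("datasets")
datasets_logger.setLevel(logging.ERROR)

# WARNING is now below the logger's threshold, so this record is dropped.
datasets_logger.warning("Loading cached processed dataset at ...")
```

This only hides the symptom, of course; the messages arguably shouldn't be warnings in the first place.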
I've got a full schedule with study at the moment, sorry. So in summary this looks like 2.5 issues:

* messages like "loaded from cache" should be log level INFO,...
Ah, interesting, I thought the first one was quite clear cut. From the [Python logging how-to](https://docs.python.org/3/howto/logging.html):

* INFO: Confirmation that things are working as expected.
* WARNING: An indication that...
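The distinction those definitions draw is easy to see with the stdlib directly: at a logger's default-ish WARNING threshold, routine "things worked" messages are filtered and only potential problems get through. A small sketch (the handler and message strings are just for illustration):

```py
import logging

records = []

class ListHandler(logging.Handler):
    """Collects formatted messages so we can inspect what got through."""
    def emit(self, record):
        records.append(record.getMessage())

logger = logging.getLogger("demo")
logger.setLevel(logging.WARNING)  # the typical default for a library
logger.addHandler(ListHandler())

logger.info("Loading cached processed dataset")      # routine confirmation: filtered
logger.warning("Could not find cache, rebuilding")   # potential problem: emitted

# records == ["Could not find cache, rebuilding"]
```

So a "loaded from cache" message logged at WARNING will always reach the user, even though by the how-to's own definitions it's INFO-level news.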
Oh good! :) Yes I just checked and this is actually coming from the datasets package.
@sgugger something else I've just noticed is that sometimes transformers will set the log level to INFO. I can't pin down exactly when, but I see that there's lots of...
Hmm, are these scripts ever called from the application code? There's definitely _something_ that sets the log level to INFO, and I think it's related to loading a model for...
Here's an example: I have this code that references a model not in my cache.

```py
print(f"Verbosity: {transformers.logging.get_verbosity()}")
conf = transformers.AutoModel.from_pretrained("distilgpt2")
print(f"Verbosity: {transformers.logging.get_verbosity()}")
```

Interestingly, it ran and had the...
Actually I'm going to re-open this, since there's still the bug of `TrainingArguments` defaulting to log level INFO so that just the act of creating a `Trainer` changes the log...
No, if I have log level set to WARNING (the default) and create a `Trainer`, this _changes_ the log level to INFO. This code:

```py
print(f"Verbosity: {transformers.logging.get_verbosity()}")
trainer = transformers.Trainer(...
```
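Until that's resolved, a generic workaround is to save the level before the call and restore it afterwards. A minimal stdlib sketch of the pattern (the context manager name `preserve_level` is mine, and the `setLevel` inside the `with` block just stands in for whatever side effect `Trainer` has; `transformers.logging` wraps this same `logging` machinery):

```py
import contextlib
import logging

@contextlib.contextmanager
def preserve_level(logger_name):
    """Restore the named logger's level on exit, even if code inside changed it."""
    logger = logging.getLogger(logger_name)
    saved = logger.level
    try:
        yield logger
    finally:
        logger.setLevel(saved)

logger = logging.getLogger("transformers")
logger.setLevel(logging.WARNING)

with preserve_level("transformers"):
    # Stand-in for Trainer(...) flipping the level as a side effect.
    logger.setLevel(logging.INFO)

# Level is back to WARNING here.
```

It's a workaround rather than a fix: library code mutating global logging state on construction is still surprising.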
`TrainingArguments` defaults to `passive`, doesn't it? See [here](https://github.com/huggingface/transformers/issues/20154#issuecomment-1310772907)