GaussianMixtures.jl
Support arbitrary loggers
Hi, in the current version of the package in train.jl
there are these few lines
```julia
if Logging.global_logger().min_level ≤ Logging.Debug
    loglevel = :iter
elseif Logging.global_logger().min_level ≤ Logging.Info
    loglevel = :final
else
    loglevel = :none
end
```
which prevent using the module with e.g. loggers from LoggingExtras (which do not all have a min_level attribute). As this loglevel is only used to call the Clustering package, wouldn't it be cleaner to simply let Clustering deal with its own logging level? Or is there a standard way in the API to do that? I see in the docs that there is a min_enabled_level function, which might help; I'm not sure it's exactly the same though.
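To illustrate the min_enabled_level idea, here is a minimal sketch of what the check in train.jl could look like. This is an assumption on my part, not the package's actual code; min_enabled_level is part of the AbstractLogger interface, so it also works for loggers that don't expose a min_level field:

```julia
using Logging

# Sketch: query the active logger through the AbstractLogger interface
# instead of reading the SimpleLogger-specific min_level field.
logger = Logging.current_logger()
loglevel = if Logging.min_enabled_level(logger) ≤ Logging.Debug
    :iter    # most verbose: report every iteration
elseif Logging.min_enabled_level(logger) ≤ Logging.Info
    :final   # report only the final result
else
    :none    # silent
end
```

Using current_logger() instead of global_logger() would also respect loggers installed dynamically via with_logger.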
This also doesn't let me silence the logger. I need to fit thousands of mixtures and don't want any output while fitting individual mixtures - otherwise there's too much output, and it also slows everything down.
I tried to use a logger with a very high level, but it still prints logs from k-means:
```julia
with_logger(SimpleLogger(stdout, Logging.LogLevel(50))) do
    GMM(N_COMPONENTS, data, nIter=1000)
end
```
This outputs:
```
K-means converged with 11 iterations (objv = 47.97814024236598)
```
It looks like I can't influence that log level, because the code queries the global logger directly with Logging.global_logger().
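One standard-library option that might work here (an assumption on my side, untested against this package): Logging.disable_logging sets a process-wide floor that drops log records regardless of which logger is installed, so it should also catch the k-means message:

```julia
using Logging

# disable_logging raises a global floor: all records at this level or
# below are dropped, no matter which logger is currently active.
Logging.disable_logging(Logging.Info)       # silences @debug and @info everywhere
@info "this message is dropped"
Logging.disable_logging(Logging.BelowMinLevel)  # re-enable everything
```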
I found a way of silencing logging by temporarily changing the global logger:
```julia
prev_logger = global_logger(SimpleLogger(devnull, Logging.LogLevel(50)))
result = run_many_gmms(GMM, data, N_COMPONENTS)
global_logger(prev_logger)
```
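The swap-and-restore above can be made exception-safe with a small helper, so the previous global logger comes back even if fitting throws. `with_global_logger` is a hypothetical name, not part of GaussianMixtures or the Logging stdlib:

```julia
using Logging

# Hypothetical helper: install `logger` as the global logger for the
# duration of `f()`, restoring the previous one even on error.
function with_global_logger(f, logger::AbstractLogger)
    prev = global_logger(logger)
    try
        return f()
    finally
        global_logger(prev)
    end
end

# Usage: silence everything below LogLevel(50) while fitting.
# with_global_logger(SimpleLogger(devnull, Logging.LogLevel(50))) do
#     run_many_gmms(GMM, data, N_COMPONENTS)
# end
```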
I've never understood these logging interfaces. If you have a more satisfactory solution, please submit a PR.