Hi, in the current version of the package in `train.jl` there are these few lines which prevent using the module with e.g. loggers from LoggingExtras (not all of which have a `min_level` attribute). Since this log level is only used when calling the Clustering package, wouldn't it be cleaner to simply let Clustering deal with its own logging level? Or is there a standard way in the API to do that? I see in the docs that there is a `min_enabled_level`, which might help, though I'm not sure it's exactly the same.
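For what it's worth, a minimal sketch of how the stdlib Logging API exposes a logger's threshold without reaching into a `min_level` field: `Logging.min_enabled_level` is the documented interface function for `AbstractLogger`, so it should also work for wrapper loggers (e.g. from LoggingExtras) that don't carry that field.

```julia
using Logging

# Logging.min_enabled_level is the documented AbstractLogger interface
# for querying a logger's minimum level; unlike reading the `min_level`
# field directly, it works for any logger type that implements the
# interface, not just SimpleLogger/ConsoleLogger.
logger = SimpleLogger(stderr, Logging.LogLevel(50))
Logging.min_enabled_level(logger)  # LogLevel(50)
```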
This also doesn't let me silence the logger. I need to fit thousands of mixtures and don't want any output while fitting individual mixtures - otherwise there's too much output, and it also slows everything down.
I tried to use a logger with a very high level, but it still prints logs from k-means:
```julia
with_logger(SimpleLogger(stdout, Logging.LogLevel(50))) do
    GMM(N_COMPONENTS, data, nIter=1000)
end
```
This outputs:
```
K-means converged with 11 iterations (objv = 47.97814024236598)
```
Looks like I can't influence that log level because the code queries the global logger directly with `Logging.global_logger()`.
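One possible workaround (an assumption on my part, not something the package documents): `Logging.disable_logging` raises a process-wide threshold that the log macros check before any logger is consulted, so it gates even code that grabs the global logger directly.

```julia
using Logging

# disable_logging sets a global threshold checked by the logging macros
# before dispatching to any logger, so it also silences code that calls
# Logging.global_logger() itself. Here, Info-level messages (such as the
# k-means convergence message) and below are dropped everywhere.
Logging.disable_logging(Logging.Info)
```

The obvious downside is that it is global: it silences Info logs from all code in the process, not just from the fitting loop, until you lower the threshold again.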
I found a way of silencing logging by temporarily changing the global logger:
```julia
prev_logger = global_logger(SimpleLogger(devnull, Logging.LogLevel(50)))
result = run_many_gmms(GMM, data, N_COMPONENTS)
global_logger(prev_logger);
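One caveat with that workaround: if the fitting call throws, the silenced logger stays installed. A small sketch that restores the previous global logger even on error (`with_silenced_logging` is a name of my own invention, not part of any package):

```julia
using Logging

# Hypothetical helper: run `f` with the global logger replaced by a
# NullLogger, restoring the previous global logger even if `f` throws.
function with_silenced_logging(f)
    prev = global_logger(NullLogger())
    try
        return f()
    finally
        global_logger(prev)
    end
end

# Usage (run_many_gmms / GMM / data as in the snippets above):
# result = with_silenced_logging() do
#     run_many_gmms(GMM, data, N_COMPONENTS)
# end
```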