setup_visdom_logging

ignite.handlers.logger_utils.setup_visdom_logging(trainer, optimizers=None, evaluators=None, log_every_iters=100, **kwargs)

Helper to set up Visdom logging on a trainer and one or more evaluators. Logged metrics are:

  • Training metrics, e.g. running average loss values (see the sketch after this list)

  • Learning rate(s)

  • Evaluation metrics
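
For the training-metrics bullet, a running average is typically attached to the trainer before this helper is called. A minimal sketch, assuming the trainer's process function returns the batch loss:

from ignite.metrics import RunningAverage

# Assumption: the trainer's process function returns the loss value
RunningAverage(output_transform=lambda loss: loss).attach(trainer, "batch_loss")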

Warning

This function uses VisdomLogger, which is currently untested because the visdom package is unmaintained and difficult to install alongside modern Python packages. Use at your own risk.

Parameters:
  • trainer (Engine) – trainer engine

  • optimizers (Optimizer | dict[str, torch.optim.optimizer.Optimizer] | None) – a single optimizer or a dictionary of torch optimizers. If a dictionary, its keys are used as tags for logging (see the sketch after this list).

  • evaluators (Engine | dict[str, ignite.engine.engine.Engine] | None) – a single evaluator or a dictionary of evaluators. If a dictionary, its keys are used as tags for logging.

  • log_every_iters (int) – interval for loggers attached to iteration events. Set to 1 or None to log every iteration.

  • kwargs (Any) – optional keyword args to be passed to construct the logger.
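
A sketch of the dictionary forms, with hypothetical engines and optimizers; the server and port keyword arguments are assumed to be forwarded to the VisdomLogger constructor:

from ignite.handlers.logger_utils import setup_visdom_logging

# Hypothetical engines and optimizers; dictionary keys become logging tags
vd_logger = setup_visdom_logging(
    trainer=trainer,
    optimizers={"generator": opt_g, "discriminator": opt_d},
    evaluators={"train": train_evaluator, "validation": val_evaluator},
    log_every_iters=50,
    server="http://localhost",  # assumption: passed through to VisdomLogger
    port=8097,                  # assumption: passed through to VisdomLogger
)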

Returns:

VisdomLogger

Return type:

VisdomLogger

Examples

from ignite.handlers.logger_utils import setup_visdom_logging
# Assume `trainer`, `evaluator`, and `optimizer` are already defined
vd_logger = setup_visdom_logging(
    trainer=trainer,
    optimizers=optimizer,
    evaluators={"validation": evaluator},
    log_every_iters=100
)

# Logger instance can be closed
vd_logger.close()
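
A more complete, self-contained sketch; the model, data, and engine construction below are illustrative assumptions, and a running Visdom server is assumed:

import torch
from torch import nn
from ignite.engine import Events, create_supervised_trainer, create_supervised_evaluator
from ignite.metrics import Accuracy, Loss
from ignite.handlers.logger_utils import setup_visdom_logging

# Hypothetical model, optimizer, and toy data, for illustration only
model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()
data = [(torch.randn(4, 10), torch.randint(0, 2, (4,))) for _ in range(8)]

trainer = create_supervised_trainer(model, optimizer, criterion)
evaluator = create_supervised_evaluator(
    model, metrics={"accuracy": Accuracy(), "loss": Loss(criterion)}
)

@trainer.on(Events.EPOCH_COMPLETED)
def run_validation(engine):
    evaluator.run(data)

vd_logger = setup_visdom_logging(
    trainer, optimizers=optimizer, evaluators={"validation": evaluator}
)

# Close the logger once training completes
trainer.add_event_handler(Events.COMPLETED, lambda engine: vd_logger.close())

trainer.run(data, max_epochs=2)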