clean up viscy cli display
viscy --help prints a useful and succinct help message.
However, viscy <subcommand> --help prints a lot of Lightning CLI information that is not relevant to the subcommand. For example,
viscy preprocess --help prints:
--lr_scheduler CONFIG | CLASS_PATH_OR_NAME | .INIT_ARG_NAME VALUE
One or more arguments specifying "class_path" and
"init_args" for any subclass of {torch.optim.lr_schedu
ler.LRScheduler,lightning.pytorch.cli.ReduceLROnPlatea
u}. (type: Union[LRScheduler, ReduceLROnPlateau],
known subclasses:
torch.optim.lr_scheduler.LRScheduler,
monai.optimizers.LinearLR,
monai.optimizers.ExponentialLR,
torch.optim.lr_scheduler.LambdaLR,
monai.optimizers.WarmupCosineSchedule,
torch.optim.lr_scheduler.MultiplicativeLR,
torch.optim.lr_scheduler.StepLR,
torch.optim.lr_scheduler.MultiStepLR,
torch.optim.lr_scheduler.ConstantLR,
torch.optim.lr_scheduler.LinearLR,
torch.optim.lr_scheduler.ExponentialLR,
torch.optim.lr_scheduler.SequentialLR,
torch.optim.lr_scheduler.PolynomialLR,
torch.optim.lr_scheduler.CosineAnnealingLR,
torch.optim.lr_scheduler.ChainedScheduler,
torch.optim.lr_scheduler.ReduceLROnPlateau,
lightning.pytorch.cli.ReduceLROnPlateau,
torch.optim.lr_scheduler.CyclicLR,
torch.optim.lr_scheduler.CosineAnnealingWarmRestarts,
torch.optim.lr_scheduler.OneCycleLR,
torch.optim.swa_utils.SWALR,
lightning.pytorch.cli.ReduceLROnPlateau)
Compute dataset statistics before training or testing for normalization:
--data_path DATA_PATH
(required, type: <class 'Path'>)
--channel_names CHANNEL_NAMES, --channel_names+ CHANNEL_NAMES
(type: Union[list[str], Literal[-1]], default: -1)
--num_workers NUM_WORKERS
(type: int, default: 1)
--block_size BLOCK_SIZE
(type: int, default: 32)
@ziw-liu please fix this.
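One possible direction, as an untested sketch (the class name below is a placeholder, not VisCy's actual entry point, and I haven't checked how the real CLI is wired): if the CLI subclasses Lightning's LightningCLI, passing auto_configure_optimizers=False keeps Lightning from attaching the --optimizer/--lr_scheduler options, and their long "known subclasses" listings, to every subcommand's parser.

```python
from lightning.pytorch.cli import LightningCLI


class TrimmedCLI(LightningCLI):
    """Placeholder stand-in for VisCy's CLI class; only the flag below matters."""


def main() -> None:
    # auto_configure_optimizers=False stops LightningCLI from auto-adding the
    # --optimizer and --lr_scheduler options (the source of the long
    # "known subclasses" listings above), so --help only shows arguments that
    # are explicitly configured for the subcommand.
    TrimmedCLI(auto_configure_optimizers=False)


if __name__ == "__main__":
    main()
```

If the --lr_scheduler option instead comes from the model's __init__ signature rather than Lightning's auto-configuration, this flag won't help, but the same idea applies: give preprocess its own minimal parser that only registers the dataset-statistics arguments shown above.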