Custom LR scheduler not supported
Bug description
Trying to set a custom learning rate scheduler via set_lr_scheduler_class raises a NotImplementedError.
To Reproduce
from torch.optim.lr_scheduler import ReduceLROnPlateau
from your_package import LR_SchedulerInterface  # Replace with the actual import path

class CustomReduceLROnPlateau(LR_SchedulerInterface):
    def __init__(self, optimizer, num_warmup_steps, num_training_steps,
                 patience=10, factor=0.1, mode='min', threshold=1e-4,
                 threshold_mode='rel', cooldown=0, min_lr=0, eps=1e-8, **kwargs):
        super().__init__()
        self.scheduler = ReduceLROnPlateau(
            optimizer,
            mode=mode,
            factor=factor,
            patience=patience,
            threshold=threshold,
            threshold_mode=threshold_mode,
            cooldown=cooldown,
            min_lr=min_lr,
            eps=eps,
            **kwargs
        )

    def step(self, metrics, epoch=None):
        self.scheduler.step(metrics, epoch)

    def get_last_lr(self):
        return self.scheduler.optimizer.param_groups[0]['lr']
from peptdeep.model.generic_property_prediction import (
    ModelInterface_for_Generic_AASeq_Regression,
    Model_for_Generic_AASeq_Regression_Transformer,
)
transformer = ModelInterface_for_Generic_AASeq_Regression(
    model_class=Model_for_Generic_AASeq_Regression_Transformer
)
transformer.target_column_to_train = 'normlogintensity'
transformer.target_column_to_predict = 'transformer_predictions'
transformer.set_lr_scheduler_class(CustomReduceLROnPlateau)  # raises NotImplementedError
transformer.train(data_train, warmup_epoch=10, epoch=50, verbose=True)
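For reference, here is a minimal standalone check of the wrapper class against a plain torch optimizer, independent of peptdeep; the toy model and constant validation loss below are only illustrative:

import torch

model = torch.nn.Linear(8, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = CustomReduceLROnPlateau(
    optimizer, num_warmup_steps=0, num_training_steps=10, patience=2, factor=0.5
)
for epoch in range(10):
    val_loss = 1.0  # constant loss: no improvement, so the LR should drop after `patience` epochs
    scheduler.step(val_loss)
    print(epoch, scheduler.get_last_lr())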
Expected behavior
No error; training runs with the custom scheduler.
Version (please complete the following information):
- Installation Type: pip
- peptdeep version: 1.2.1
Additional context
I see in the source code that this isn't implemented. It would be nice if it were.
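For illustration only, a minimal sketch of what the setter could look like, assuming the interface stores the user-supplied class and instantiates it when training starts. All names below (ModelInterfaceSketch, _lr_scheduler_class, _build_lr_scheduler) are hypothetical and not peptdeep's actual internals:

# Hypothetical sketch, not peptdeep's actual implementation.
class ModelInterfaceSketch:
    def __init__(self):
        self._lr_scheduler_class = None  # hypothetical attribute

    def set_lr_scheduler_class(self, lr_scheduler_class):
        # Register any class implementing the LR_SchedulerInterface contract
        # (step(metrics, epoch) and get_last_lr()).
        self._lr_scheduler_class = lr_scheduler_class

    def _build_lr_scheduler(self, optimizer, num_warmup_steps, num_training_steps):
        # Called when training starts to instantiate the registered scheduler.
        return self._lr_scheduler_class(
            optimizer,
            num_warmup_steps=num_warmup_steps,
            num_training_steps=num_training_steps,
        )

With something along these lines, set_lr_scheduler_class(CustomReduceLROnPlateau) would simply register the class instead of raising.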