I am using ReduceLROnPlateau to adjust the learning rate during training of a PyTorch model. ReduceLROnPlateau does not inherit from LRScheduler and does not implement the get_last_lr method, which is PyTorch's recommended way of getting the current learning rate when using a learning rate scheduler.
How can I get the learning rate when using ReduceLROnPlateau?
Currently I am doing the following, but I am not sure whether this is rigorous and correct:
```python
lr = optimizer.state_dict()["param_groups"][0]["lr"]
```
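For context, here is a minimal runnable sketch of where I read the learning rate relative to scheduler.step() (the model and validation metric are stand-ins, not my real training code):

```python
import torch

# Toy model and optimizer (hypothetical; only here to make the sketch runnable)
model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, factor=0.5, patience=0
)

for epoch in range(3):
    val_loss = 1.0  # stand-in for a real validation metric (never improves)
    scheduler.step(val_loss)  # may reduce the lr when the metric plateaus
    # Reading the lr directly from the optimizer's param groups
    lr = optimizer.param_groups[0]["lr"]
    print(epoch, lr)
```

Note that `optimizer.param_groups[0]["lr"]` and `optimizer.state_dict()["param_groups"][0]["lr"]` read the same value; I am unsure whether either is the intended public API.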