torch.optim.lr_scheduler.MultiplicativeLR
class MultiplicativeLR extends LRScheduler

new MultiplicativeLR(optimizer: Optimizer, options: {
  /** Function(s) to compute multiplicative factor. One function or one per param group. */
  lr_lambda: LRLambda | LRLambda[];
  /** The index of the last epoch (default: -1) */
  last_epoch?: number;
  /** Whether to print a message for each update (default: false) */
  verbose?: boolean;
})
Constructor Parameters
optimizer (Optimizer) – Wrapped optimizer
options – Scheduler options:
- lr_lambda (LRLambda | LRLambda[]) – Function(s) to compute the multiplicative factor; one function, or one per param group
- last_epoch (number, optional) – The index of the last epoch (default: -1)
- verbose (boolean, optional) – Whether to print a message for each update (default: false)
Properties
lr_lambdas (LRLambda[]) – Lambda functions for each param group
MultiplicativeLR scheduler: multiplies the learning rate each epoch by a factor returned by a lambda function.
MultiplicativeLR is like LambdaLR but applies multiplicative factors instead of absolute factors. Each epoch, the learning rate is multiplied by the return value of the lambda function. This is useful for implementing exponential decay or other multiplicative schedules via custom functions.
Key difference from LambdaLR:
- LambdaLR: η_t = η_base * λ(t) (factor is applied to the base learning rate, so each epoch's lr depends only on the current λ value)
- MultiplicativeLR: η_t = η_{t-1} * λ(t) (factor is applied to the previous learning rate, so factors compound across epochs)
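The difference is easiest to see with a constant factor. The following standalone simulation applies both update rules by hand (plain TypeScript arithmetic, not the scheduler API):

```typescript
// Side-by-side simulation of the two update rules (not the library API).
const baseLr = 0.1;
const lambda = (_epoch: number) => 0.9; // constant factor

let lambdaLrStyle = baseLr; // LambdaLR: scales the base lr
let multStyle = baseLr;     // MultiplicativeLR: scales the previous lr

for (let epoch = 1; epoch <= 3; epoch++) {
  lambdaLrStyle = baseLr * lambda(epoch); // stays at ≈0.09 every epoch
  multStyle = multStyle * lambda(epoch);  // ≈0.09, 0.081, 0.0729 -- compounds
}
```

With a constant λ, LambdaLR produces a constant learning rate while MultiplicativeLR decays geometrically.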
Use cases:
- Multiplicative decay schedules (e.g., 0.95 factor each epoch)
- Exponential decay patterns via custom functions
- Schedules that depend on previous learning rate
- Research/experimental schedules
Algorithm: Each epoch, multiply current learning rate by λ(epoch):
- η_t = η_{t-1} * λ(t)
- Example: lambda returns 0.95 → decay by 5% each epoch
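With a constant factor c, repeated application of η_t = η_{t-1} * c collapses to the closed form η_t = η_0 * c^t, which is easy to check numerically (standalone sketch, not the scheduler API):

```typescript
// Verify that repeated multiplication by a constant factor matches the
// closed form eta_t = eta_0 * c^t.
const eta0 = 0.1;
const c = 0.95;
const steps = 10;

let eta = eta0;
for (let t = 0; t < steps; t++) {
  eta *= c; // one scheduler step
}

const closedForm = eta0 * Math.pow(c, steps);
// eta and closedForm agree up to floating-point error
```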
- Multiplicative: Each epoch multiplies by factor (compounding effect).
- Research tool: Like LambdaLR, useful for experimental schedules.
- Per-group support: Can provide different lambdas for different parameter groups.
- Less common: LambdaLR is more widely used, since absolute factors are easier to reason about than compounding ones.
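The per-group update described above can be sketched as a minimal class. This is a conceptual illustration only, not the library's actual implementation; `groups` stands in for the optimizer's param groups and the field names are illustrative:

```typescript
type LRLambda = (epoch: number) => number;

// Minimal conceptual sketch of MultiplicativeLR's update rule.
class MiniMultiplicativeLR {
  private epoch = 0;
  constructor(
    private groups: { lr: number }[],
    private lambdas: LRLambda[],
  ) {
    if (lambdas.length !== groups.length) {
      throw new Error("Expected one lambda per param group");
    }
  }
  step(): void {
    this.epoch += 1;
    // Multiply each group's current lr by its lambda's return value.
    this.groups.forEach((g, i) => {
      g.lr *= this.lambdas[i](this.epoch);
    });
  }
}

// Usage: two groups decaying at different rates.
const groups = [{ lr: 0.1 }, { lr: 0.1 }];
const sched = new MiniMultiplicativeLR(groups, [() => 0.95, () => 0.99]);
sched.step(); // groups[0].lr ≈ 0.095, groups[1].lr ≈ 0.099
```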
Examples
// Constant multiplicative decay: 5% per epoch
const lambda = (epoch: number) => 0.95;
const scheduler = new torch.optim.MultiplicativeLR(optimizer, { lr_lambda: lambda });

// Epoch-dependent factor: because factors compound, this decays faster than
// a plain 0.95-per-epoch schedule
const lambda = (epoch: number) => Math.pow(0.95, epoch);
const scheduler = new torch.optim.MultiplicativeLR(optimizer, { lr_lambda: lambda });

// Different lambdas for different parameter groups
const lambda1 = (epoch: number) => 0.95; // 5% decay per epoch
const lambda2 = (epoch: number) => 0.99; // 1% decay per epoch
const scheduler = new torch.optim.MultiplicativeLR(optimizer, {
  lr_lambda: [lambda1, lambda2] // One lambda per param group
});