torch.optim.lr_scheduler.LambdaLR
class LambdaLR extends LRScheduler

new LambdaLR(optimizer: Optimizer, options: {
/** Function(s) to compute LR multiplier. One function or one per param group. */
lr_lambda: LRLambda | LRLambda[];
/** The index of the last epoch (default: -1) */
last_epoch?: number;
/** Whether to print a message for each update (default: false) */
verbose?: boolean;
})
Constructor Parameters
optimizer (Optimizer) – Wrapped optimizer whose learning rates will be scheduled
lr_lambda (LRLambda | LRLambda[]) – A function computing a multiplicative LR factor from the epoch index, or a list of such functions, one per param group
LambdaLR scheduler: Custom learning rate schedule via lambda function.
LambdaLR provides maximum flexibility: the learning rate is set by multiplying the base learning rate by the value of a user-supplied function of the epoch number, so any custom schedule can be expressed. It is the most flexible scheduler, but the user is responsible for defining the schedule function.
Use cases:
- Arbitrary custom schedules not covered by other schedulers
- Combining multiple schedule patterns
- Experimental schedules during research
Algorithm: η_t = η_base * λ(t), where λ is the user-supplied lambda function
- Maximum flexibility: Define any schedule via function.
- Research tool: Useful for experimenting with new schedules.
- Per-group lambdas: Can provide different lambdas for different parameter groups.
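The update rule above can be sketched standalone. This is a minimal illustration of the math, not the library's implementation; `lambdaLR` is a hypothetical helper:

```typescript
// Minimal sketch of the LambdaLR rule: eta_t = eta_base * lambda(t).
// Note: the multiplier is applied to the *base* LR, not the current LR,
// so factors do not compound across epochs.
type LRLambda = (epoch: number) => number;

function lambdaLR(baseLr: number, lrLambda: LRLambda, epoch: number): number {
  return baseLr * lrLambda(epoch);
}

// Linear warmup over 10 epochs, then constant:
const warmup: LRLambda = (epoch) => (epoch < 10 ? epoch / 10 : 1.0);

lambdaLR(0.1, warmup, 5);  // 0.05 (halfway through warmup)
lambdaLR(0.1, warmup, 20); // 0.1 (warmup finished)
```

Because λ multiplies the base LR rather than the previous LR, a constant λ of 1.0 always recovers the optimizer's initial learning rate.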
Examples
// Linear warmup then constant
const lambda = (epoch: number) => epoch < 10 ? epoch / 10 : 1.0;
const scheduler = new torch.optim.LambdaLR(optimizer, { lr_lambda: lambda });

// Warmup then exponential decay
const lambda = (epoch: number) => {
if (epoch < 5) return epoch / 5; // Warmup
return Math.pow(0.95, epoch - 5); // Exponential decay
};
const scheduler = new torch.optim.LambdaLR(optimizer, { lr_lambda: lambda });
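Per-group scheduling works the same way with the array form of `lr_lambda`: the i-th lambda scales the i-th param group's base LR. A standalone sketch (`groupLRs` is a hypothetical helper, not part of the library):

```typescript
// Each param group keeps its own base LR and its own lambda:
// eta_t[i] = eta_base[i] * lambda_i(t)
type LRLambda = (epoch: number) => number;

function groupLRs(baseLrs: number[], lambdas: LRLambda[], epoch: number): number[] {
  return baseLrs.map((base, i) => base * lambdas[i](epoch));
}

const lambdas: LRLambda[] = [
  (epoch) => Math.pow(0.9, epoch), // group 0 (e.g. backbone): exponential decay
  () => 1.0,                       // group 1 (e.g. head): constant LR
];

groupLRs([0.01, 0.1], lambdas, 2); // ≈ [0.0081, 0.1]
```

A common use is fine-tuning: decay the pretrained backbone's LR while keeping a newly initialized head at its base rate.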