torch.optim.lr_scheduler.ChainedScheduler
class ChainedScheduler
new ChainedScheduler(schedulers: LRScheduler[])
Constructor Parameters
schedulers (LRScheduler[]) – List of schedulers to chain
ChainedScheduler: Runs multiple schedulers simultaneously at each step.
ChainedScheduler calls step() on all schedulers in sequence at each iteration, so each scheduler's update is applied on top of the others'. This differs from SequentialLR, which switches between schedulers at milestones; ChainedScheduler always runs every scheduler.
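To make the call pattern concrete, here is a minimal, illustrative sketch of what a chained step() amounts to. This is not the library's actual implementation; it assumes only that each scheduler exposes a step(): void method.

// Illustrative stand-in for ChainedScheduler (hypothetical, simplified)
interface Steppable {
  step(): void;
}

class MiniChainedScheduler {
  constructor(private schedulers: Steppable[]) {}

  // One call fans out to every wrapped scheduler, in list order
  step(): void {
    for (const s of this.schedulers) {
      s.step();
    }
  }
}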
Use cases:
- Combining independent schedule modifications
- Applying multiple schedule patterns simultaneously (e.g., warmup and decay)
How it works:
- Initialize with a list of schedulers, all wrapping the same optimizer
- Each call to step() calls step() on every scheduler in the list
- The final learning rate is the result of all schedulers applied together (see the simulation sketch below)
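As a rough illustration of that composition, the following standalone simulation (not the library API; the ConstantLR/ExponentialLR semantics and the base rate of 0.1 are assumed for the example) shows the effective learning rate as the base rate scaled by the product of each scheduler's current multiplier:

// Hypothetical multiplier model of two chained schedulers
interface Factor {
  at(step: number): number;
}

// ConstantLR-style multiplier: `factor` until totalIters, then 1.0
const constantFactor = (factor: number, totalIters: number): Factor => ({
  at: (step) => (step < totalIters ? factor : 1.0),
});

// ExponentialLR-style multiplier: gamma^step
const exponentialFactor = (gamma: number): Factor => ({
  at: (step) => Math.pow(gamma, step),
});

const baseLr = 0.1; // assumed base learning rate
const factors = [constantFactor(0.5, 2), exponentialFactor(0.9)];
for (let step = 0; step < 4; step++) {
  const lr = factors.reduce((acc, f) => acc * f.at(step), baseLr);
  console.log(step, lr.toFixed(4));
}
// step 0: 0.1 * 0.5 * 1.000 = 0.0500
// step 1: 0.1 * 0.5 * 0.900 = 0.0450
// step 2: 0.1 * 1.0 * 0.810 = 0.0810
// step 3: 0.1 * 1.0 * 0.729 = 0.0729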
- All schedulers run: every step() calls step() on all schedulers; none are skipped.
- Order matters: the learning rate accumulates through the schedulers in list order.
- Deprecated: prefer SequentialLR, which switches schedulers at milestones instead of always running all of them and usually has clearer semantics (see the contrast sketch below).
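For contrast, a SequentialLR setup might look like the sketch below. The option names (schedulers, milestones) are assumed to mirror PyTorch's SequentialLR and may differ in this binding:

// Assumed SequentialLR signature, mirroring PyTorch's Python API
const warmup = new torch.optim.ConstantLR(optimizer, { factor: 0.1, total_iters: 2 });
const decay = new torch.optim.ExponentialLR(optimizer, { gamma: 0.9 });
const sequential = new torch.optim.SequentialLR(optimizer, {
  schedulers: [warmup, decay],
  milestones: [2], // warmup alone for steps 0-1, then decay exclusively
});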
Examples
// Chain ConstantLR (warmup) with ExponentialLR (decay); both run every step
const scheduler1 = new torch.optim.ConstantLR(optimizer, { factor: 0.1, total_iters: 2 });
const scheduler2 = new torch.optim.ExponentialLR(optimizer, { gamma: 0.9 });
const scheduler = new torch.optim.ChainedScheduler([scheduler1, scheduler2]);
// Each step() calls both scheduler1.step() and scheduler2.step()
for (let epoch = 0; epoch < 100; epoch++) {
  train();
  scheduler.step();
}
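Assuming these schedulers behave like their PyTorch counterparts, both step every epoch: for the first total_iters steps the learning rate is the base rate scaled by ConstantLR's 0.1 factor on top of ExponentialLR's cumulative 0.9-per-step decay, and after total_iters the constant factor is lifted so only the exponential decay continues.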