torch.autograd.once_differentiable
function once_differentiable<T extends BackwardFn>(_target: any, _propertyKey: string, descriptor: TypedPropertyDescriptor<T>): TypedPropertyDescriptor<T>

Decorator to mark a backward function as only supporting single differentiation.
If a function decorated with @once_differentiable is differentiated more than once (i.e., double backward is attempted), an error will be raised.
Use this when your backward() relies on saved intermediate activations or operations that do not support double backward.
Parameters
_target: any
_propertyKey: string
descriptor: TypedPropertyDescriptor<T>
Returns
TypedPropertyDescriptor<T>

Examples
class MyFunc extends torch.autograd.Function {
  static forward(ctx: FunctionCtx, x: Tensor): Tensor {
    // ... forward logic
  }

  @once_differentiable
  static backward(ctx: FunctionCtx, grad_output: Tensor): [Tensor] {
    // ... backward logic (only supports single backward)
  }
}
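To see how a decorator with this signature can work mechanically, here is a minimal, self-contained sketch of a method decorator that wraps the backward function and tags it so an autograd engine could refuse a second differentiation. This is an illustration only, not the library's actual implementation; the `_onceDifferentiable` flag and the engine-side check it implies are assumptions.

```typescript
type BackwardFn = (...args: any[]) => any;

// Hypothetical sketch of a once_differentiable-style method decorator.
function once_differentiable<T extends BackwardFn>(
  _target: any,
  _propertyKey: string,
  descriptor: TypedPropertyDescriptor<T>
): TypedPropertyDescriptor<T> {
  const original = descriptor.value!;
  const wrapped = function (this: any, ...args: any[]) {
    // A real autograd engine would detach the outputs produced here,
    // so attempting a second backward raises an error instead of
    // silently producing wrong gradients.
    return original.apply(this, args);
  } as unknown as T;
  // Marker flag an (assumed) autograd engine would inspect before
  // building a graph through this backward.
  (wrapped as any)._onceDifferentiable = true;
  descriptor.value = wrapped;
  return descriptor;
}
```

Because a decorator is an ordinary function, it can also be applied manually to a property descriptor, which is how the sketch above can be exercised without decorator syntax.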