torch.autograd.allow_mutation_on_saved_tensors
function allow_mutation_on_saved_tensors(): boolean
Check if in-place mutations on saved tensors are safe during backward.
Returns whether the autograd engine safely allows in-place modifications on saved intermediate tensors. This function exists primarily for compatibility with PyTorch's autograd API. In torch.js, intermediate tensors are cloned during the backward computation, making in-place mutations safe.
Background: During the backward pass, tensors saved from the forward pass are used to compute gradients. In some frameworks, modifying these tensors in-place can corrupt the gradient computation. This function reports whether the implementation handles such mutations safely.
torch.js implementation: torch.js clones tensors rather than using them directly, so in-place mutations on saved tensors don't affect gradient computation. This function always returns true for this implementation.
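The effect of cloning can be illustrated without torch.js at all. The sketch below is plain TypeScript with hypothetical `refSave` and `cloneSave` helpers standing in for the two save strategies; it is not torch.js code, only a model of why a saved copy survives a later in-place mutation while a saved reference does not.

```typescript
type Vec = number[];

// Strategy 1: save a reference (unsafe if the caller later mutates it)
function refSave(x: Vec): Vec {
  return x; // same underlying storage
}

// Strategy 2: save a copy, analogous to torch.js cloning saved tensors
function cloneSave(x: Vec): Vec {
  return [...x]; // independent storage
}

const input: Vec = [1, 2, 3];
const savedByRef = refSave(input);
const savedByClone = cloneSave(input);

// An in-place mutation after the forward pass...
input[0] = 99;

// ...corrupts the reference-saved value but not the cloned one.
console.log(savedByRef[0]);   // 99 — gradient math would now be wrong
console.log(savedByClone[0]); // 1  — still the original forward value
```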
When this matters:
- Custom autograd functions that modify intermediate values
- Advanced gradient computation tricks
- Framework compatibility checks
- Generally not an issue for standard operations
Notes
- Always true in torch.js: the cloning strategy guarantees saved tensors are never shared, so this function returns the same value (true) every time
- PyTorch compatibility: provided for API parity with PyTorch's torch.autograd.graph.allow_mutation_on_saved_tensors
- Not typically needed: standard operations never mutate saved tensors, so most code can ignore this check
- Implementation detail: the return value reflects torch.js's internal cloning strategy
- Custom operations only: mainly relevant to framework developers implementing custom autograd functions or advanced gradient tricks
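For code meant to run against more than one backend, one portable pattern is to clone defensively unless the engine reports that mutation is safe. The helper below is a hypothetical sketch, not torch.js API; `EngineLike` and `prepareForMutation` are names introduced here for illustration.

```typescript
// Minimal shape of an engine exposing the compatibility check
interface EngineLike {
  allow_mutation_on_saved_tensors(): boolean;
}

// Return a value that is safe to mutate in-place: the saved value itself
// when the engine permits it (always, in torch.js), otherwise a copy.
function prepareForMutation<T extends { clone(): T }>(
  engine: EngineLike,
  saved: T,
): T {
  return engine.allow_mutation_on_saved_tensors() ? saved : saved.clone();
}
```

With this guard, the mutating code path is identical on every backend; only the storage it touches differs.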
Returns
boolean – Always true in torch.js, since cloning prevents mutation issues.
Examples
// Check if it is safe to mutate saved tensors
if (torch.autograd.graph.allow_mutation_on_saved_tensors()) {
  // Safe to perform in-place mutations on saved tensors
  savedTensor.add_(value);
}

// Custom autograd function with mutations
class CustomFunction {
  static saved: Tensor;

  static forward(x: Tensor): Tensor {
    // Save a copy of the input for backward
    this.saved = x.clone();
    return x.square();
  }

  static backward(grad: Tensor): Tensor {
    if (torch.autograd.graph.allow_mutation_on_saved_tensors()) {
      // Safe to modify the saved tensor in-place:
      // 2x is the derivative of x.square()
      this.saved.mul_(2);
    }
    return grad.mul(this.saved);
  }
}

See Also
- PyTorch torch.autograd.graph.allow_mutation_on_saved_tensors()
- saved_tensors_hooks - Register hooks for customizing tensor storage
- torch.autograd.graph - Autograd graph manipulation module
- torch.autograd - Autograd module