torch.Tensor.register_post_accumulate_grad_hook
Registers a backward hook that runs after gradients have been accumulated.
The hook fires during the backward pass, after the gradient has been accumulated into the tensor's .grad attribute. The hook receives the tensor itself (not the gradient), so read t.grad inside the hook.
Parameters
hook: (tensor: Tensor) => void – Function to call with the tensor after gradient accumulation
Returns
{ remove: () => void } – A handle that can be used to remove the hook via handle.remove()
Examples
const x = torch.randn(3, 3, { requires_grad: true });
const handle = x.register_post_accumulate_grad_hook((t) => {
  console.log('Final gradient:', t.grad);
});
// The hook only fires during a backward pass, once the gradient
// has been accumulated into x.grad.
x.sum().backward();
handle.remove(); // stop observing subsequent backward passes
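The handle-based removal pattern described under Returns can be sketched with a minimal, self-contained mock. Everything here (MockTensor, accumulateGrad, the hook set) is an illustrative stand-in, not the library's actual implementation; only the register_post_accumulate_grad_hook name and the { remove } handle shape come from this API:

```typescript
type Hook = (t: MockTensor) => void;

class MockTensor {
  grad: number[] | null = null;
  private hooks = new Set<Hook>();

  // Mirrors the documented API: register a hook, get back a removable handle.
  register_post_accumulate_grad_hook(hook: Hook): { remove: () => void } {
    this.hooks.add(hook);
    return { remove: () => this.hooks.delete(hook) };
  }

  // Stand-in for the backward pass: accumulate into .grad, then fire hooks.
  accumulateGrad(g: number[]): void {
    this.grad = this.grad ? this.grad.map((v, i) => v + g[i]) : [...g];
    for (const hook of this.hooks) hook(this); // post-accumulation callbacks
  }
}

const t = new MockTensor();
const seen: number[][] = [];
const handle = t.register_post_accumulate_grad_hook((x) => {
  if (x.grad) seen.push([...x.grad]);
});

t.accumulateGrad([1, 1]); // hook observes the accumulated grad
handle.remove();
t.accumulateGrad([1, 1]); // grad keeps accumulating, but the hook is gone
```

After handle.remove(), later accumulations still update .grad; only the callback stops firing.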