torch.Tensor.backward
Tensor.backward(options?: { gradient?: Tensor; retain_graph?: boolean; create_graph?: boolean }): void
Tensor.backward(gradient: Tensor, options?: { retain_graph?: boolean; create_graph?: boolean }): void
Computes the gradient of the current tensor w.r.t. graph leaves.
The graph is differentiated using the chain rule. If the tensor is non-scalar (i.e. its data has more than one element) and requires gradient, the function additionally requires specifying gradient: a tensor of matching shape that contains the gradient of the differentiated function w.r.t. this tensor.
Parameters
options: { gradient?: Tensor; retain_graph?: boolean; create_graph?: boolean } (optional) - Options for the backward computation:
gradient - gradient w.r.t. this tensor; required when the tensor is non-scalar.
retain_graph - if true, the graph used to compute the gradients is retained after the backward pass, allowing backward to be called again.
create_graph - if true, the graph of the derivative is constructed, allowing computation of higher-order derivatives.
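As an illustrative sketch only: the import path and the `torch.tensor` / `torch.ones` factories below are assumptions modeled on PyTorch's API, not confirmed by this page; only the `backward` signatures above are documented here.

```typescript
// Hypothetical import path for the TypeScript torch binding (assumption).
import { torch, Tensor } from 'torch';

// Scalar case: no gradient argument is needed.
const x: Tensor = torch.tensor([2.0], { requiresGrad: true }); // assumed factory option
const y = x.mul(x).sum();  // y = sum(x^2), a scalar
y.backward();              // populates x.grad with dy/dx = 2x

// Non-scalar case: a gradient of matching shape must be supplied.
const v = torch.tensor([1.0, 2.0], { requiresGrad: true });
const w = v.mul(v);                // element-wise square, non-scalar
w.backward(torch.ones([2]));      // vector-Jacobian product with a ones vector
```

Passing `ones` as the gradient for a non-scalar output is equivalent to summing the output and calling `backward()` on the resulting scalar.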