torch.nn.utils.clip_grad_norm_
function clip_grad_norm_(parameters: Iterable<Tensor>, max_norm: number): Tensor
function clip_grad_norm_(parameters: Iterable<Tensor>, max_norm: number, norm_type: number | 'inf', error_if_nonfinite: boolean, foreach: boolean | null, options?: ClipGradNormOptions): Tensor

Clip the gradient norm of an iterable of parameters.
The norm is computed over all gradients together, as if they were concatenated into a single vector. Gradients are modified in-place.
Parameters
parameters: Iterable<Tensor> - Iterable of Tensors that will have their gradients normalized
max_norm: number - Maximum norm of the gradients
norm_type: number | 'inf' - Type of the p-norm used to compute the total norm; 'inf' selects the infinity norm. Default: 2
error_if_nonfinite: boolean - If true, an error is raised when the total norm of the gradients is nan or inf. Default: false
foreach: boolean | null - If true, use the faster foreach-based implementation; if null, the implementation is chosen automatically per tensor type. Default: null
Returns
Tensor – Total norm of the parameter gradients (viewed as a single vector, before clipping)
Examples
// Clip gradients to a max L2 norm of 1.0, after backward() and before the optimizer step
loss.backward();
const total_norm = torch.nn.utils.clip_grad_norm_(model.parameters(), 1.0);
optimizer.step();
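To make the computation concrete, here is a minimal sketch of the clipping math over plain number arrays standing in for parameter gradients. It is illustrative only, not the library's implementation: the total p-norm is taken over all gradients as if concatenated into one vector, and every gradient is rescaled in place when that norm exceeds `max_norm`.

```typescript
// Sketch of gradient-norm clipping; `grads` stands in for the gradients
// of an iterable of parameters. Infinity plays the role of 'inf'.
function clipGradNormSketch(grads: number[][], maxNorm: number, normType: number = 2): number {
  let totalNorm: number;
  if (normType === Infinity) {
    // Infinity norm: largest absolute entry across all gradients.
    totalNorm = Math.max(...grads.map(g => Math.max(...g.map(Math.abs))));
  } else {
    // p-norm over all gradients, as if concatenated into one vector.
    const sum = grads.reduce(
      (acc, g) => acc + g.reduce((a, x) => a + Math.abs(x) ** normType, 0),
      0,
    );
    totalNorm = sum ** (1 / normType);
  }
  // Scale every gradient in place only when the norm exceeds maxNorm.
  const clipCoef = maxNorm / (totalNorm + 1e-6);
  if (clipCoef < 1) {
    for (const g of grads) {
      for (let i = 0; i < g.length; i++) g[i] *= clipCoef;
    }
  }
  // The norm returned is the one computed before clipping.
  return totalNorm;
}

const grads = [[3.0, 4.0], [0.0]]; // total L2 norm is 5
const norm = clipGradNormSketch(grads, 1.0);
```

After the call, `norm` is 5 and each entry of `grads` has been scaled by roughly 1/5 so the clipped total norm is (just under) 1.0.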