torch.nn.utils.clip_grads_with_norm_
function clip_grads_with_norm_(parameters: Iterable<Tensor>, total_norm: Tensor | number, max_norm: number, options?: ClipGradsWithNormOptions): void
function clip_grads_with_norm_(parameters: Iterable<Tensor>, total_norm: Tensor | number, max_norm: number, foreach: boolean | null, options?: ClipGradsWithNormOptions): void
Scale the gradients of an iterable of parameters given a pre-calculated total norm and a desired max norm.
This is useful when you want to compute the total norm once and then clip several parameter groups against the same norm.
Parameters
parameters: Iterable<Tensor> - Iterable of Tensors whose gradients will be scaled
total_norm: Tensor | number - Pre-calculated total norm of the gradients
max_norm: number - Maximum allowed norm of the gradients
options: ClipGradsWithNormOptions (optional) - Optional settings for clipping:
  foreach: boolean | null - Use the faster foreach implementation (default: null)
Examples
const total_norm = torch.nn.utils.get_total_norm(all_grads, { norm_type: 2.0 });
torch.nn.utils.clip_grads_with_norm_(params1, total_norm, 1.0);
torch.nn.utils.clip_grads_with_norm_(params2, total_norm, 1.0);
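The arithmetic behind the call is straightforward: every gradient is multiplied by the coefficient max_norm / (total_norm + eps), clamped at 1 so gradients already within the budget are untouched. A minimal sketch of that rule on plain number arrays, assuming the same formula PyTorch uses (the `clipGradsWithNorm` helper here is illustrative, not part of the library):

```typescript
// Sketch of the clipping arithmetic (assumed to mirror PyTorch's rule):
// scale every gradient by min(max_norm / (total_norm + 1e-6), 1.0).
function clipGradsWithNorm(grads: number[][], totalNorm: number, maxNorm: number): void {
  const clipCoef = maxNorm / (totalNorm + 1e-6);
  // Clamp at 1 so gradients already under the limit are left as-is.
  const coef = Math.min(clipCoef, 1.0);
  for (const g of grads) {
    for (let i = 0; i < g.length; i++) {
      g[i] *= coef; // in-place scaling, matching the trailing underscore convention
    }
  }
}

// A single gradient [3, 4] has L2 norm 5; with maxNorm 1 it is scaled
// by roughly 0.2, down to approximately [0.6, 0.8].
const grads = [[3, 4]];
clipGradsWithNorm(grads, 5, 1.0);
console.log(grads[0]);
```

Because the coefficient depends only on the pre-computed total norm, applying it to several parameter groups (as in the example above) scales them all consistently.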