torch.nn.utils.clip_grad_value_
function clip_grad_value_(parameters: Iterable<Tensor>, clip_value: number, options?: ClipGradValueOptions): void
function clip_grad_value_(parameters: Iterable<Tensor>, clip_value: number, foreach: boolean | null, options?: ClipGradValueOptions): void

Clip the gradients of an iterable of parameters at the specified value.
Gradients are modified in-place. Unlike clip_grad_norm_, this clips each gradient element independently to the range [-clip_value, clip_value].
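The elementwise semantics can be illustrated with a minimal standalone sketch (this is an illustration of the clamping behavior, not the library's actual implementation; the helper name `clipGradValue_` and the use of plain `Float32Array`s in place of Tensors are assumptions):

```typescript
// Sketch: clip each gradient element independently into [-clipValue, clipValue],
// modifying the arrays in place (mirroring the in-place semantics above).
function clipGradValue_(grads: Float32Array[], clipValue: number): void {
  for (const g of grads) {
    for (let i = 0; i < g.length; i++) {
      // Clamp one element at a time; unlike norm-based clipping,
      // each element is treated independently.
      g[i] = Math.max(-clipValue, Math.min(clipValue, g[i]));
    }
  }
}

const g = new Float32Array([-2, 0.25, 1.7]);
clipGradValue_([g], 0.5);
// g is now [-0.5, 0.25, 0.5]
```

Note that because every element is clamped independently, the direction of the overall gradient vector can change, whereas norm-based clipping preserves direction and only rescales magnitude.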
Parameters
parameters (Iterable<Tensor>) - iterable of Tensors whose gradients will be clipped
clip_value (number) - maximum allowed value of the gradients; each gradient element is clipped to the range [-clip_value, clip_value]
options (ClipGradValueOptions, optional) - optional settings for clipping:
  foreach (boolean | null) - use the faster foreach implementation when available (default: null)
Examples
// Clip each gradient element to [-0.5, 0.5]
torch.nn.utils.clip_grad_value_(model.parameters(), 0.5);