torch.relu_
function relu_<S extends Shape, D extends DType, Dev extends DeviceType>(input: Tensor<S, D, Dev>): Tensor<S, D, Dev>

In-place ReLU activation function.
Applies the ReLU activation (max(x, 0)) element-wise and modifies the input tensor in place. Note: on GPU backends, a true in-place modification may not be possible; in that case this function still provides API compatibility with PyTorch.
Parameters
input: Tensor<S, D, Dev> - Input tensor to modify in-place
Returns
Tensor<S, D, Dev> - The modified input tensor

See Also
- PyTorch torch.nn.functional.relu_
- relu - Out-of-place version
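To illustrate the in-place contract (the returned tensor is the same object as the input, with negatives clamped to zero), here is a minimal sketch of the semantics using a plain Float32Array. The reluInPlace helper is hypothetical and stands in for relu_; the actual library function operates on Tensor objects, not raw buffers.

```typescript
// Hypothetical stand-in demonstrating relu_ semantics on a raw buffer.
function reluInPlace(data: Float32Array): Float32Array {
  for (let i = 0; i < data.length; i++) {
    if (data[i] < 0) data[i] = 0; // clamp negative entries to zero, in place
  }
  return data; // returns the SAME buffer, as relu_ returns its input tensor
}

const buf = new Float32Array([-1.5, 0, 2.5]);
const out = reluInPlace(buf);
console.log(out === buf);        // true: no new buffer was allocated
console.log(Array.from(buf));    // negatives are now zero: [0, 0, 2.5]
```

Because the input is mutated, use the out-of-place relu instead whenever the original values are still needed (e.g. for autograd or reuse).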