torch.hardsigmoid
function hardsigmoid<S extends Shape, D extends DType = DType, Dev extends DeviceType = DeviceType>(input: Tensor<S, D, Dev>): Tensor<S, D, Dev>

Applies the Hardsigmoid function element-wise.
Hardsigmoid is a computationally efficient piecewise linear approximation of the sigmoid function, defined as hardsigmoid(x) = max(0, min(1, x/6 + 1/2)). It replaces the sigmoid's exponential with a simple clamp while preserving the property of mapping inputs into [0, 1], which makes it well suited to mobile and embedded devices. Widely used in MobileNetV3 and other efficient neural networks.
- Efficiency: No exponential computation, very fast on CPU and GPU
- Approximation: Closely tracks sigmoid in the practical range (-3, 3)
- Output range: Strictly in [0, 1], suitable for gating/probability outputs
- Use case: Mobile networks, embedded systems, real-time applications
- Derivative: 1/6 in the linear region (-3, 3), 0 in the saturated regions; gradients flow only through the linear region
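The piecewise definition behind these properties can be sketched on plain numbers (a reference sketch for illustration, not the library's actual kernel):

```typescript
// Reference sketch of hardsigmoid(x) = max(0, min(1, x / 6 + 1/2)).
function hardsigmoidScalar(x: number): number {
  if (x <= -3) return 0;   // saturated low: output 0, derivative 0
  if (x >= 3) return 1;    // saturated high: output 1, derivative 0
  return x / 6 + 0.5;      // linear region: derivative 1/6
}

const inputs = [-5, -3, -1, 0, 1, 3, 5];
const outputs = inputs.map(hardsigmoidScalar);
console.log(outputs); // ≈ [0, 0, 0.333, 0.5, 0.667, 1, 1]
```

The only operations are a comparison and a fused multiply-add, which is why the function is cheap on both CPU and GPU.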
Parameters
input: Tensor<S, D, Dev> – The input tensor
Returns
Tensor<S, D, Dev> – A new tensor with Hardsigmoid applied element-wise, values in [0, 1]

Examples
// Basic usage
const x = torch.tensor([-5, -3, -1, 0, 1, 3, 5]);
torch.hardsigmoid(x); // [0, 0, 0.333, 0.5, 0.667, 1, 1]
// Piecewise linear approximation comparison
const sigmoid_out = torch.sigmoid(x); // Smooth S-curve with exp()
const hardsigmoid_out = torch.hardsigmoid(x); // Linear approximation, no exp
// Hardsigmoid is very close to sigmoid in the -3 to 3 range
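How close the approximation is can be checked with a plain-number sketch (no tensor library involved; the scalar helpers below are illustrative stand-ins for the library functions):

```typescript
// Compare the exact sigmoid with its hardsigmoid approximation on [-6, 6].
const sigmoid = (x: number): number => 1 / (1 + Math.exp(-x));
const hardsigmoid = (x: number): number => Math.max(0, Math.min(1, x / 6 + 0.5));

let maxErr = 0;
for (let x = -6; x <= 6; x += 0.01) {
  maxErr = Math.max(maxErr, Math.abs(sigmoid(x) - hardsigmoid(x)));
}
console.log(maxErr.toFixed(3)); // worst-case gap ≈ 0.069, near x ≈ ±1.3
```

The two curves agree exactly at x = 0 and converge again in the saturated tails, so the deviation stays below 0.07 everywhere.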
// In MobileNetV3-style efficient network
const x = torch.randn(batch_size, channels, height, width);
const se = torch.nn.Linear(channels, channels);
const gated = torch.hardsigmoid(se(torch.mean(x, [2, 3]))); // Efficient gating
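The squeeze-and-excitation-style gating above can be sketched with plain arrays (a minimal illustration: a hypothetical 2-channel feature map stands in for the tensor, and the learned Linear layer is omitted):

```typescript
// Minimal sketch of hardsigmoid channel gating (no learned layer).
const hardsigmoid = (x: number): number => Math.max(0, Math.min(1, x / 6 + 0.5));

// One sample: channels x flattened spatial positions.
const features: number[][] = [
  [1, 2, 3, 2],     // channel 0: pooled mean 2 -> gate ~0.83
  [-4, -4, -4, -4], // channel 1: pooled mean -4 -> gate 0, channel suppressed
];

// "Squeeze": global average pool per channel, then gate via hardsigmoid.
const gates = features.map(ch => hardsigmoid(ch.reduce((a, b) => a + b, 0) / ch.length));
// "Excite": scale each channel by its gate.
const gated = features.map((ch, i) => ch.map(v => v * gates[i]));
console.log(gates);
```

Because the gate saturates exactly at 0 and 1, uninformative channels are zeroed out entirely rather than merely attenuated.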
// As learned gating parameter (hard sigmoid for efficiency)
const logits = torch.randn(batch, features);
const gates = torch.hardsigmoid(logits); // Fast, bounded in [0, 1]

See Also
- PyTorch torch.nn.functional.hardsigmoid()
- sigmoid - Smooth alternative with exponential computation
- hardswish - Combines hardsigmoid with multiplication for efficient gating
- silu - Another smooth gating alternative