torch.nn.functional.log_softmax
function log_softmax<S extends Shape, D extends DType = DType, Dev extends DeviceType = DeviceType>(input: Tensor<S, D, Dev>, options?: SoftmaxOptions): Tensor<S, D, Dev>
function log_softmax<S extends Shape, D extends DType = DType, Dev extends DeviceType = DeviceType>(input: Tensor<S, D, Dev>, dim: number, options?: SoftmaxOptions): Tensor<S, D, Dev>
Log-Softmax activation function: numerically stable log(softmax(x)).
Computes log(softmax(x)) = x - log(Σ exp(x)) in a single, numerically stable step. This is more efficient and more stable than computing softmax followed by log separately, which matters especially for:
- Classification loss functions (combined with negative log likelihood)
- Probability calculations requiring log-space (important for numerical stability)
- Deep networks where numerical precision is critical
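The stability comes from the log-sum-exp trick: subtracting the row maximum before exponentiating. A minimal sketch in plain TypeScript (operating on a plain number[], not the library's Tensor type) illustrates the identity above:

```typescript
// Log-sum-exp trick: log(Σ exp(x_i)) = m + log(Σ exp(x_i - m)), with m = max(x).
// Subtracting the max keeps every exp() argument <= 0, so nothing overflows.
function logSoftmax(x: number[]): number[] {
  const m = Math.max(...x);
  const sumExp = x.reduce((acc, v) => acc + Math.exp(v - m), 0);
  const logSumExp = m + Math.log(sumExp);
  return x.map(v => v - logSumExp); // log_softmax(x) = x - log(Σ exp(x))
}

// With logits around 1000, a naive exp() would overflow to Infinity;
// the shifted version stays finite.
logSoftmax([1000, 1000]); // each entry is -log(2)
```

Exponentiating the output recovers the softmax probabilities, which always sum to 1.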
When to use Log-Softmax:
- With NLLLoss (Negative Log Likelihood Loss)
- When you need log-probabilities instead of probabilities
- For numerical stability in probability calculations
- In likelihood-based models and Bayesian inference
Important: For simple classification with targets, use CrossEntropyLoss directly, which is more convenient and handles the softmax + log + loss combination automatically. Log-Softmax is useful when you need explicit log-probabilities for custom loss functions.
- Numerical stability: More stable than softmax + log for extreme values
- Log-space calculations: Output in log-space, not interpretable as probabilities
- Use with NLLLoss: Designed to work with Negative Log Likelihood Loss
- Simpler alternative: CrossEntropyLoss combines softmax + log + loss for classification
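The equivalence behind that last point can be checked with plain numbers (illustration only, not the library's Tensor API): cross-entropy on raw logits equals negative log likelihood applied to the log_softmax output.

```typescript
// For one sample: cross_entropy(logits, t) = -log_softmax(logits)[t]
//                                          = logsumexp(logits) - logits[t].
function logSumExp(x: number[]): number {
  const m = Math.max(...x);
  return m + Math.log(x.reduce((acc, v) => acc + Math.exp(v - m), 0));
}

const logits = [2.0, 0.5, -1.0];
const target = 0; // true class index

// Route 1: log_softmax, then negative log likelihood
const logProbs = logits.map(v => v - logSumExp(logits));
const nll = -logProbs[target];

// Route 2: cross-entropy computed directly from the logits
const crossEntropy = logSumExp(logits) - logits[target];
// nll === crossEntropy (up to floating-point noise)
```

This is why CrossEntropyLoss can fuse the three steps: it never needs to materialize probabilities at all.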
Parameters
- input: Tensor<S, D, Dev> - Input tensor of any shape (typically logits [batch, num_classes])
- dim: number, optional (second overload) - Dimension along which log_softmax is computed; may also be passed via options, e.g. { dim: -1 }
- options: SoftmaxOptions, optional - Optional settings:
  - dim: Dimension along which log_softmax is computed
  - dtype: Output data type
Returns
Tensor<S, D, Dev> – Tensor with log_softmax applied, same shape as input
Examples
// Classification with explicit log-probabilities
const logits = torch.randn([batch_size, num_classes]);
const log_probs = torch.nn.functional.log_softmax(logits, { dim: -1 }); // Log-probabilities
// Use with NLLLoss
const nll_loss = new torch.nn.NLLLoss();
const targets = torch.tensor([0, 2, 1, 0]); // Class indices
const loss = nll_loss.forward(log_probs, targets);

// Custom loss requiring log-probabilities
const logits = torch.randn([batch_size, num_classes]);
const log_probs = torch.nn.functional.log_softmax(logits, { dim: -1 });
const target_log_probs = torch.log(target_dist); // target_dist: a [batch, num_classes] probability distribution
// KL(p ‖ q) = Σ p · (log p - log q): weight the log-ratio by the target probabilities
// (sub/mul/sum method names assumed to follow the library's tensor conventions)
const kl_div = target_dist.mul(target_log_probs.sub(log_probs)).sum({ dim: -1 }).mean();
See Also
- PyTorch torch.nn.functional.log_softmax
- softmax - Regular softmax for probability outputs
- cross_entropy - One-step function for classification
- nll_loss - Used with log_softmax outputs