torch.Tensor.logaddexp
Tensor.logaddexp<O extends Shape>(other: Tensor<O>): Tensor<DynamicShape, D, Dev>
Logarithm of the sum of exponentials (log-sum-exp).
Computes log(exp(self) + exp(other)) element-wise, using a numerically stable identity to avoid overflow. Instead of computing exp() then log() directly (which overflows for large inputs), it uses: log(exp(a) + exp(b)) = max(a,b) + log(1 + exp(-|a-b|)).
Stability:
- Direct computation log(exp(a) + exp(b)) overflows for |a|, |b| > 700
- logaddexp remains accurate across entire floating-point range
- Critical for log-probability computations in machine learning
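The identity and the overflow threshold above can be sketched with plain scalar math; the helper below is illustrative only, not part of the tensor API:

```typescript
// Scalar sketch of the logaddexp identity:
// log(exp(a) + exp(b)) = max(a, b) + log1p(exp(-|a - b|)).
// The naive formula overflows once exp() exceeds Number.MAX_VALUE (inputs > ~709).
function logaddexpScalar(a: number, b: number): number {
  const max = Math.max(a, b);
  // exp(min - max) <= 1, so it never overflows; log1p stays accurate near 0.
  return max + Math.log1p(Math.exp(Math.min(a, b) - max));
}

const naive = Math.log(Math.exp(750) + Math.exp(749)); // Infinity: exp(750) overflows
const stable = logaddexpScalar(750, 749);              // ≈ 750.3133
```

The trick is that the large term is pulled out of the exponent, so only `exp` of a non-positive number is ever evaluated.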
Use Cases:
- Log-domain computations (avoiding underflow/overflow in probabilities)
- Computing log-sum-exp for softmax in log-space
- Combining log-probabilities from independent events
- Numerically stable maximum operation in log space
- Training with very small/large probability values
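As a sketch of the log-domain use cases above, pairwise log-add-exp can fold a list of log-probabilities without ever leaving log space. This uses a plain `number[]` and an illustrative scalar helper, not the tensor API; for many terms, a single logsumexp call is the more efficient equivalent:

```typescript
// Illustrative scalar helper (not the tensor API).
function logaddexpScalar(a: number, b: number): number {
  const max = Math.max(a, b);
  return max + Math.log1p(Math.exp(Math.min(a, b) - max));
}

// log P(A or B or C) for mutually exclusive events, staying in log space.
// Naively exponentiating these would underflow to 0 (and the log to -Infinity).
const logProbs = [-1200, -1150, -1100];
const totalLog = logProbs.reduce((acc, lp) => logaddexpScalar(acc, lp), -Infinity);
// totalLog ≈ -1100: the largest term dominates, plus a tiny correction
```

Starting the reduction at `-Infinity` works because logaddexp(-Infinity, x) = x, making it the identity element for this operation.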
Notes:
- Numerical stability: Safe for any input values; no overflow or underflow.
- Log-domain operation: Result is log of sum, not sum itself.
- Symmetric: logaddexp(a, b) = logaddexp(b, a).
- Broadcasting: Other broadcasts with self.
- ML critical: Essential for probability computations in machine learning.
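The symmetry and "soft maximum" properties noted above can be checked with the same scalar sketch (illustrative helper, not the tensor API):

```typescript
// Illustrative scalar helper (not the tensor API).
function logaddexpScalar(a: number, b: number): number {
  const max = Math.max(a, b);
  return max + Math.log1p(Math.exp(Math.min(a, b) - max));
}

// Symmetric: argument order does not matter.
logaddexpScalar(-3, -5) === logaddexpScalar(-5, -3); // true

// Soft maximum: when inputs are far apart the result is ≈ max(a, b);
// when they are equal it exceeds the max by exactly log(2).
logaddexpScalar(0, -100); // ≈ 0 (the larger input dominates)
logaddexpScalar(0, 0);    // = Math.log(2) ≈ 0.6931
```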
Parameters
other: Tensor<O> – Second operand (broadcastable with self)
Returns
Tensor<DynamicShape, D, Dev> – Tensor with log(exp(self) + exp(other)); shape is the broadcast of the two operands
Examples
// Stable log-probability combination
const log_p1 = torch.tensor([-1000, -100, 0]);
const log_p2 = torch.tensor([-1000, -100, 1]);
const combined = log_p1.logaddexp(log_p2); // Stable even with -1000!
// Log-domain softmax computation
const logits = torch.randn(32, 10);
const log_probs = logits.log_softmax(-1); // Log probabilities
// Combine independent log-probabilities
const p1_log = torch.tensor([-2, -3, -4]);
const p2_log = torch.tensor([-2.5, -2.8, -3.5]);
const combined = p1_log.logaddexp(p2_log); // log(P(A) + P(B)) — P(A or B) for mutually exclusive events
// Training with very small probabilities
const rare_event_log_prob = torch.tensor([-1000]); // probability e^-1000
const normal_log_prob = torch.tensor([-10]);
const total = rare_event_log_prob.logaddexp(normal_log_prob);
See Also
- PyTorch torch.logaddexp()
- logsumexp - Reduce many log values at once (more efficient than repeated logaddexp)
- log_softmax - Log softmax using logaddexp internally
- softmax - Exponential-based softmax (less stable)
- exp - Exponential (inverse operation)