torch.var_mean
function var_mean<D extends DType = DType, Dev extends DeviceType = DeviceType>(input: Tensor<Shape, D, Dev>, options?: StdVarMeanOptions): { var: Tensor<Shape, D, Dev>; mean: Tensor<Shape, D, Dev> }
function var_mean<D extends DType = DType, Dev extends DeviceType = DeviceType>(input: Tensor<Shape, D, Dev>, dim: number, correction: number, keepdim: boolean, options?: StdVarMeanOptions): { var: Tensor<Shape, D, Dev>; mean: Tensor<Shape, D, Dev> }
Computes both the variance and mean along a dimension.
Efficiently computes the variance and mean in a single pass, returning both results as an object. Similar to std_mean, but returns the variance (σ²) instead of the standard deviation (σ). Useful for:
- Statistical analysis: reporting mean and variance together
- Parameter estimation: both moments of a distribution in one call
- Numerical stability: avoiding sqrt by using variance directly
- Model diagnostics: checking data homogeneity via variance
- Expectation-maximization algorithms: computing sufficient statistics
- Efficient computation: single pass computation of both metrics
This is more efficient than calling var() and mean() separately: the mean is computed once and reused for the variance calculation, avoiding a redundant pass over the data.
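The single-pass idea can be sketched in plain TypeScript with Welford's online algorithm, which yields both statistics in one traversal. This is an illustration only: it uses a plain number[] instead of Tensor, and `singlePassVarMean` is a hypothetical helper, not part of the library API.

```typescript
// Single-pass mean and variance via Welford's online algorithm.
// Illustrative sketch only: plain number[] instead of Tensor, and
// the function name is hypothetical, not part of the library API.
function singlePassVarMean(
  data: number[],
  correction: number = 1,
): { var: number; mean: number } {
  let mean = 0;
  let m2 = 0; // running sum of squared deviations from the mean
  data.forEach((x, i) => {
    const delta = x - mean;
    mean += delta / (i + 1);
    m2 += delta * (x - mean); // uses the updated mean
  });
  return { var: m2 / (data.length - correction), mean };
}

const r = singlePassVarMean([1, 2, 3]);
console.log(r.mean, r.var); // → 2 1 (sample variance, correction = 1)
```

One traversal updates both statistics, which is exactly why a fused var_mean can beat separate var() and mean() calls.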
- Computational efficiency: Single pass computes both var and mean
- Variance vs std: var = std². Use var_mean for variance, std_mean for std
- Numerical stability: Variance avoids sqrt operation (can be more stable)
- Correction parameter: 0 for population, 1 for sample statistics
- Return format: Object with var, mean properties for destructuring
- Not the same as std_mean: var = std² (different scale)
- Efficiency comparison: var_mean() faster than var() + mean() separately
- Numerical precision: Very large variance values may overflow
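The var = std² relationship and the effect of the correction parameter noted above can be checked with plain numbers; `variance` here is a hypothetical standalone helper, not the library's API.

```typescript
// Population (correction = 0) vs sample (correction = 1) variance,
// and the var = std² relationship. Plain arrays, illustrative only;
// `variance` is a hypothetical helper, not a library function.
function variance(data: number[], correction: number): number {
  const mean = data.reduce((a, b) => a + b, 0) / data.length;
  const ss = data.reduce((a, b) => a + (b - mean) ** 2, 0);
  return ss / (data.length - correction);
}

const sample = [1, 2, 3];
const popVar = variance(sample, 0);    // 2/3 ≈ 0.667 (population)
const sampleVar = variance(sample, 1); // 1 (Bessel's correction)
const sampleStd = Math.sqrt(sampleVar);
console.log(popVar, sampleVar, sampleStd ** 2); // std² recovers the variance
```

Bessel's correction (correction = 1) divides by n − 1 instead of n, which is why sample and population variance differ on the same data.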
Parameters
input Tensor<Shape, D, Dev> – the input tensor
dim number – the dimension to reduce (second overload)
correction number – 0 for population, 1 for sample statistics (second overload)
keepdim boolean – whether to retain the reduced dimension (second overload)
options StdVarMeanOptions optional
Returns
{ var: Tensor<Shape, D, Dev>; mean: Tensor<Shape, D, Dev> } – Object with var and mean tensors computed along the dimension
Examples
// Statistical analysis: reporting distribution parameters
const x = torch.tensor([[1, 2, 3], [4, 5, 6]]);
const { var: variance, mean } = torch.var_mean(x, 1);
// var: [0.667, 0.667], mean: [2, 5]
// Gaussian parameter fitting
const data = torch.randn(10000);
const { mean: mu, var: sigma2 } = torch.var_mean(data);
// sigma2 ≈ 1 for standard normal
// Variance-based anomaly detection
const batch = torch.randn(32, 100);
const { mean: batch_means, var: batch_vars } = torch.var_mean(batch, 1);
const stable = batch_vars.lt(2); // Samples with low variance are more stable
// Distributional test: checking homoscedasticity
const group1 = torch.randn(1000);
const group2 = torch.randn(1000).mul(2); // Higher variance
const { var: var1 } = torch.var_mean(group1);
const { var: var2 } = torch.var_mean(group2);
// Numerical stability: avoiding sqrt
const features = torch.randn(1000, 50);
const { mean: mu, var: sigma2 } = torch.var_mean(features, 0);
// Use sigma2 directly instead of sqrt(sigma2) if only the variance is needed
See Also
- PyTorch torch.var_mean()
- var - Compute variance alone
- mean - Compute mean alone
- std_mean - Get standard deviation and mean (std = sqrt(var))
- normalize - Normalize using variance