torch.distributions.StudentT
class StudentT extends Distribution
new StudentT(df: number | Tensor, options?: StudentTOptions)
- readonly df (Tensor) – Degrees of freedom.
- readonly loc (Tensor) – Location parameter.
- readonly scale (Tensor) – Scale parameter.
- readonly arg_constraints (unknown)
- readonly support (unknown)
- readonly has_rsample (unknown)
- readonly mean (Tensor)
- readonly mode (Tensor)
- readonly variance (Tensor)
Student's t-distribution: continuous distribution for heavy-tailed normal-like data.
Parameterized by degrees of freedom (df), location (loc), and scale. Generalizes normal distribution with heavier tails for modeling outliers and robustness. Essential for:
- Statistical inference with small samples (t-tests)
- Robust modeling (heavier tails than normal)
- Modeling data with outliers and extreme values
- Bayesian modeling with student-t priors (robust)
- Approximating normal with uncertainty
- Heavy-tailed phenomena (financial returns, measurement errors)
- Mixture models with normal components and heavy tails
Ratio distribution: StudentT(ν, μ, σ) = μ + σ · Z / √(V/ν), where Z ~ Normal(0, 1) and V ~ Chi2(ν) are independent
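The ratio construction above can be sketched without the library, using Box-Muller for Z ~ Normal(0, 1) and a sum of df squared normals for V ~ Chi2(df) (which only works for integer df; the function names here are illustrative, not part of the API):

```typescript
// Z ~ Normal(0, 1) via the Box-Muller transform
function randNormal(): number {
  const u1 = Math.random(); // in [0, 1), so 1 - u1 is in (0, 1]
  const u2 = Math.random();
  return Math.sqrt(-2 * Math.log(1 - u1)) * Math.cos(2 * Math.PI * u2);
}

// t = loc + scale * Z / sqrt(V / df), valid for integer df >= 1
function sampleStudentT(df: number, loc = 0, scale = 1): number {
  const z = randNormal();
  let v = 0; // V ~ Chi2(df) as a sum of df independent squared normals
  for (let i = 0; i < df; i++) {
    const n = randNormal();
    v += n * n;
  }
  return loc + (scale * z) / Math.sqrt(v / df);
}

const draws = Array.from({ length: 1000 }, () => sampleStudentT(3));
```

The library's sampler is not restricted to integer df (Chi2 is a special case of Gamma), but the construction is the same.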
- Heavy tails: Probability of extreme values higher than normal
- Robust inference: Better for data with outliers than normal
- Limiting case: converges to the normal distribution as df → ∞
- t-tests: Student's t-test uses this distribution for inference
- Moment existence: Mean exists only for df > 1, variance only for df > 2
- Symmetric: Always symmetric around location parameter μ
- Mean undefined: For df ≤ 1, mean doesn't exist
- Variance undefined: For df ≤ 2, variance is infinite
- Small df: df ≤ 1 has undefined mean; df = 1 is the Cauchy distribution (undefined mean and variance)
- Extreme values: Heavy tails mean occasional very large deviations
- Numerical stability: Very small df can cause numerical issues
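The moment-existence rules above have simple closed forms, sketched here as a standalone helper (not a library function): the mean is loc for df > 1, and the variance is scale² · df / (df − 2) for df > 2, infinite for 1 < df ≤ 2, and undefined otherwise.

```typescript
// Closed-form mean/variance of StudentT(df, loc, scale), per the rules above
function studentTMoments(df: number, loc = 0, scale = 1) {
  const mean = df > 1 ? loc : NaN; // undefined for df <= 1
  let variance: number;
  if (df > 2) variance = (scale * scale * df) / (df - 2);
  else if (df > 1) variance = Infinity; // exists but infinite
  else variance = NaN; // undefined
  return { mean, variance };
}

studentTMoments(5);        // mean 0, variance 5/3
studentTMoments(1);        // mean NaN (Cauchy), variance NaN
studentTMoments(3, 10, 2); // mean 10, variance 12
```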
Examples
// Standard t-distribution: df=3, mean=0, std=1
const t_dist = new torch.distributions.StudentT(3);
t_dist.sample(); // Heavy-tailed, robust to outliers
// T-test with small sample: df = n - 1
const sample_size = 20;
const df = sample_size - 1; // 19 degrees of freedom
const t_critical = torch.tensor([1.729]); // For α=0.1, two-tailed
const test_dist = new torch.distributions.StudentT(df);
// Robust regression: use t-distribution instead of normal
// Student-t likelihood is more robust to outliers than normal
const df = 2; // Low df for heavy tails
const loc = torch.tensor([0]); // Model prediction
const scale = torch.tensor([1]); // Model uncertainty
const likelihood = new torch.distributions.StudentT(df, { loc, scale });
// Different degrees of freedom: varying tail heaviness
const dfs = torch.tensor([1, 2, 5, 10, 30]); // Low to high
const dist = new torch.distributions.StudentT(dfs);
const samples = dist.sample([1000]); // Compare tail behavior
// Bayesian robust regression: hierarchical model
// Data might have outliers, use heavy-tailed likelihood
const data = torch.tensor([...]); // Measurements with potential outliers
const mu_pred = torch.tensor([...]); // Model predictions
const sigma = torch.tensor([...]); // Measurement uncertainty
const df = 4; // Moderate tail heaviness
const likelihood = new torch.distributions.StudentT(df, { loc: mu_pred, scale: sigma });
const log_likelihood = likelihood.log_prob(data).sum();
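For reference, the log-density that log_prob evaluates can be written out directly. This is a self-contained sketch of the standard Student-t formula using a Lanczos log-gamma approximation, not the library's implementation:

```typescript
// Lanczos approximation coefficients (g = 7)
const LANCZOS = [
  676.5203681218851, -1259.1392167224028, 771.32342877765313,
  -176.61502916214059, 12.507343278686905, -0.13857109526572012,
  9.9843695780195716e-6, 1.5056327351493116e-7,
];

// log Gamma(x) via the Lanczos approximation
function lgamma(x: number): number {
  if (x < 0.5) {
    // Reflection formula for small arguments
    return Math.log(Math.PI / Math.sin(Math.PI * x)) - lgamma(1 - x);
  }
  x -= 1;
  let a = 0.99999999999980993;
  const t = x + 7.5;
  for (let i = 0; i < LANCZOS.length; i++) a += LANCZOS[i] / (x + i + 1);
  return 0.5 * Math.log(2 * Math.PI) + (x + 0.5) * Math.log(t) - t + Math.log(a);
}

// log p(x) = lgamma((df+1)/2) - lgamma(df/2) - 0.5*log(df*pi)
//            - log(scale) - ((df+1)/2) * log(1 + ((x-loc)/scale)^2 / df)
function studentTLogProb(x: number, df: number, loc = 0, scale = 1): number {
  const z = (x - loc) / scale;
  return (
    lgamma((df + 1) / 2) -
    lgamma(df / 2) -
    0.5 * Math.log(df * Math.PI) -
    Math.log(scale) -
    ((df + 1) / 2) * Math.log(1 + (z * z) / df)
  );
}

// Sanity check: df = 1 is Cauchy, whose density at x = 0 is 1/pi
Math.exp(studentTLogProb(0, 1)); // ≈ 0.3183
```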
// Location-scale variant: shifted and scaled t-distribution
const location = 10;
const scale = 2;
const df = 5;
const shifted_t = new torch.distributions.StudentT(df, { loc: location, scale: scale });
// Mean = 10, heavier tails than Normal(10, 2)
// Comparison: normal vs t-distribution
const df_values = [1, 2, 5, 30]; // Cauchy-like to near-normal
const dists = df_values.map(df => new torch.distributions.StudentT(df));
// Lower df -> extreme values are more probable