torch.distributions.Uniform
class Uniform extends Distribution

new Uniform(low: number | Tensor, high: number | Tensor, options?: DistributionOptions)
- readonly low (Tensor) – Lower bound of the uniform distribution (inclusive).
- readonly high (Tensor) – Upper bound of the uniform distribution (exclusive).
- readonly arg_constraints (unknown)
- readonly support (Constraint)
- readonly has_rsample (unknown)
- readonly mean (Tensor)
- readonly mode (Tensor)
- readonly variance (Tensor)
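The mean and variance properties above have simple closed forms for a uniform distribution. A minimal sketch in plain TypeScript, independent of the library (the helper names are illustrative, not part of the API):

```typescript
// Closed-form moments of Uniform(low, high):
//   mean     = (low + high) / 2
//   variance = (high - low)^2 / 12
function uniformMean(low: number, high: number): number {
  return (low + high) / 2;
}

function uniformVariance(low: number, high: number): number {
  return (high - low) ** 2 / 12;
}

console.log(uniformMean(0, 1));     // 0.5
console.log(uniformVariance(0, 1)); // ≈ 0.0833 (i.e. 1/12)
```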
Uniform distribution: constant probability density over the interval [low, high).
Parameterized by lower and upper bounds. Assigns equal probability to all values in the interval. Essential for:
- Random initialization of model parameters (when bounds matter)
- Prior distributions in Bayesian models (non-informative prior)
- Data augmentation and sampling
- Baseline comparisons (maximum entropy for bounded domain)
- Scenario testing and robustness
The probability density function: f(x) = 1/(high - low) for x in [low, high), 0 otherwise
- Interval notation: [low, high) is half-open (includes low, excludes high)
- Maximum entropy: Among all distributions supported on a bounded interval [low, high), the uniform has the highest entropy
- Constant density: All values in the interval share the same density, 1/(high - low), which integrates to 1 over the support
- CDF simplicity: CDF is linear: F(x) = (x - low) / (high - low)
- Reparameterizable: rsample() = low + (high - low) * U[0,1)
- Bounded domain: Finite support, unlike the normal distribution; a crucial advantage when values must stay within known bounds
- Bounds order: Must have low < high; otherwise an error is thrown
- Floating-point bounds: high is excluded from the support; take care with comparisons at the boundary
- Not smooth: PDF is discontinuous at boundaries
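The reparameterization noted above (rsample() = low + (high - low) * U[0,1)) can be sketched without the library. This standalone snippet mirrors the transform on plain numbers; the real rsample() operates on tensors, and the function name here is illustrative, not part of the API:

```typescript
// Sketch of the reparameterization behind rsample(): draw base
// noise u ~ U[0, 1), then shift and scale it into [low, high).
// Gradients can flow through low and high because the transform
// is a differentiable affine map of the noise.
function rsampleSketch(low: number, high: number): number {
  const u = Math.random(); // base noise in [0, 1)
  return low + (high - low) * u;
}

// Every draw lands in the half-open interval [low, high).
for (let i = 0; i < 1000; i++) {
  const x = rsampleSketch(-1, 1);
  if (x < -1 || x >= 1) throw new Error("sample out of bounds");
}
```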
Examples
// Standard uniform on [0, 1)
const uniform = new torch.distributions.Uniform(0, 1);
const samples = uniform.sample([1000]);
const log_probs = uniform.log_prob(samples);
// Uniform on custom range [-1, 1)
const custom = new torch.distributions.Uniform(-1, 1);
const sample = custom.sample(); // value in [-1, 1)
// Batched uniform distributions
const lows = torch.tensor([0, -5, 10]);
const highs = torch.tensor([1, 5, 20]);
const dist = new torch.distributions.Uniform(lows, highs);
const batch_samples = dist.sample(); // [3] shaped samples
// Parameter initialization: uniform instead of normal
const layer_size = 512;
const input_size = 256;
const limit = Math.sqrt(6.0 / (input_size + layer_size)); // Glorot/Xavier uniform bound
const init_dist = new torch.distributions.Uniform(-limit, limit);
const weights = init_dist.sample([layer_size, input_size]);
// Prior distribution: model uncertainty uniformly
const param_dist = new torch.distributions.Uniform(0, 1); // no prior knowledge
const prior_samples = param_dist.sample([10]); // [10] shaped draws from the flat prior
// Probability of value in specific range: P(a < X < b)
const range_dist = new torch.distributions.Uniform(0, 10);
const p_1_to_3 = range_dist.cdf(3) - range_dist.cdf(1); // P(1 < X < 3) = 0.2
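Because the CDF is linear, range probabilities like the one above can be checked by hand. A self-contained sketch of the formula F(x) = (x - low) / (high - low), not using the library API (the function name is illustrative):

```typescript
// Linear CDF of Uniform(low, high), clamped to [0, 1]
// outside the support.
function uniformCdf(x: number, low: number, high: number): number {
  return Math.min(1, Math.max(0, (x - low) / (high - low)));
}

// P(1 < X < 3) for X ~ Uniform(0, 10): 3/10 - 1/10 = 0.2
const p = uniformCdf(3, 0, 10) - uniformCdf(1, 0, 10);
console.log(p); // ≈ 0.2 (up to floating-point rounding)
```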