torch.distributions.broadcast_all
Broadcasts multiple tensors and scalars to a common compatible shape.
Takes any number of tensors and scalars and broadcasts them all to a single compatible shape. This is essential for probability distribution operations, where parameters may have different shapes but must be broadcastable together. Useful for:
- Distribution parameterization: Broadcasting shape parameters (loc, scale) with batch shapes
- Batch operations: Ensuring all parameters can be broadcast together for efficient computation
- Flexible API: Allowing scalars or tensors as distribution parameters
- Shape alignment: Computing the event shape from multiple parameter shapes
Automatically infers the device from any tensor arguments (preferring WebGPU if present) and converts scalar numbers to tensors on that device. Returns all inputs as tensors with identical broadcast shapes.
- Device inference: Detects device from tensor arguments (WebGPU preferred)
- Scalar conversion: Numbers are converted to tensors on the inferred device
- Empty inputs: Returns empty array if no arguments provided
- Single input: Returns array with single tensor if only one argument
- Broadcasting rules: Follows standard NumPy/PyTorch broadcasting rules
- Incompatible shapes: Throws error if shapes cannot be broadcast together
- Device consistency: All tensors are moved to the inferred device
- Memory: Creates new expanded tensors; doesn't modify originals
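The broadcasting rules listed above can be illustrated with a small standalone helper that computes the common broadcast shape of several input shapes. This is a hypothetical sketch of the standard NumPy/PyTorch rule (align trailing dimensions; each pair must be equal or one must be 1), not part of the library's public API:

```javascript
// Compute the common broadcast shape of several shapes using standard
// NumPy/PyTorch rules. Hypothetical helper for illustration only.
function broadcastShapes(...shapes) {
  const ndim = Math.max(...shapes.map((s) => s.length));
  const out = new Array(ndim).fill(1);
  for (const shape of shapes) {
    const offset = ndim - shape.length; // left-pad with size-1 dims
    for (let i = 0; i < shape.length; i++) {
      const dim = shape[i];
      const j = offset + i;
      // Dimensions are compatible if equal, or if either is 1.
      if (dim !== 1 && out[j] !== 1 && dim !== out[j]) {
        throw new Error(`shapes are not broadcastable at dim ${j}`);
      }
      out[j] = Math.max(out[j], dim);
    }
  }
  return out;
}

console.log(broadcastShapes([2, 1], [3])); // [2, 3]
```

With incompatible shapes such as [2] and [3], the helper throws, mirroring the "Incompatible shapes" behavior described above.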
Parameters
values (number | Tensor)[] – Variadic arguments: any number of tensors and/or scalars (numbers)
Returns
Tensor[] – Array of tensors, all broadcast to the same shape
Examples
// Broadcast different shaped tensors
const t1 = torch.tensor([1, 2, 3]); // shape [3]
const t2 = torch.tensor([[4], [5]]); // shape [2, 1]
const [b1, b2] = torch.distributions.broadcast_all(t1, t2);
// Both have shape [2, 3]

// Mix scalars and tensors
const loc = torch.randn([10, 1]); // shape [10, 1]
const scale = 2.0; // scalar
const [bc_loc, bc_scale] = torch.distributions.broadcast_all(loc, scale);
// Both have shape [10, 1]

// Distribution parameter broadcasting
const loc = torch.tensor([[0], [1]]); // shape [2, 1]
const scale = torch.tensor([1, 2, 3]); // shape [3]
const [bc_loc, bc_scale] = torch.distributions.broadcast_all(loc, scale);
// Both have shape [2, 3] (standard broadcasting rules)
See Also
- PyTorch torch.broadcast_tensors() (conceptually similar)
- torch.broadcast_to - Broadcast single tensor to shape
- torch.expand - Expand tensor dimensions