torch.broadcast_to
function broadcast_to(input: Tensor, shape: number[]): Tensor
Broadcasts input to a new shape.
Expands a tensor to a new shape by repeating elements along singleton dimensions. The input shape must be broadcastable to the target shape: each input dimension must either match the target size, be 1 (it will be expanded), or not exist (it will be prepended as a leading dimension). Useful for:
- Element-wise operations: matching shapes for arithmetic between different-sized tensors
- Batching: expanding batch operations to match batch size
- Memory efficiency: avoiding copies where possible (uses views when feasible)
- Feature broadcasting: matching feature dimensions across different structures
- No copy for views: broadcasting reuses the input's storage when possible (memory efficient)
- Dimension ordering: shapes align from the right (trailing dimensions match first)
- Singleton dims only: a dimension can be expanded only if it has size 1 or already matches the target size
- Non-broadcastable shapes: an error is thrown if any dimension pair is incompatible
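The alignment rule in the notes above can be sketched as a plain TypeScript helper. This is a standalone illustration of the shape check only (`broadcastShape` is a hypothetical name, not part of the torch API):

```typescript
// Hypothetical helper illustrating the broadcast rule: align shapes from
// the right; each aligned pair must match exactly or the source dim must be 1.
function broadcastShape(from: number[], to: number[]): number[] {
  if (to.length < from.length) {
    throw new Error(`target rank ${to.length} is lower than input rank ${from.length}`);
  }
  // Missing leading dimensions of `from` are treated as prepended.
  const offset = to.length - from.length;
  for (let i = 0; i < from.length; i++) {
    const src = from[i];
    const dst = to[offset + i];
    if (src !== dst && src !== 1) {
      throw new Error(`cannot broadcast dimension of size ${src} to ${dst}`);
    }
  }
  return to.slice();
}

console.log(broadcastShape([1, 3, 1], [2, 3, 4])); // [ 2, 3, 4 ]
console.log(broadcastShape([3], [2, 3]));          // [ 2, 3 ]
```

Calling it with incompatible shapes, e.g. `broadcastShape([2], [3])`, throws, mirroring the error `broadcast_to` raises for non-broadcastable inputs.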
Parameters
input: Tensor - The input tensor
shape: number[] - The target shape to broadcast to
Returns
A tensor with the broadcast shape, sharing storage with input where possible
Examples
// Basic broadcasting: expand singleton dimension
const x = torch.randn(1, 3, 1);
const result = torch.broadcast_to(x, [2, 3, 4]); // [2, 3, 4]
// Broadcasting with prepended dimensions
const y = torch.randn(3);
const result2 = torch.broadcast_to(y, [2, 3]); // [2, 3]
// Batch operation broadcasting
const single_weight = torch.randn(1, 64);
const batched = torch.broadcast_to(single_weight, [32, 64]); // [32, 64]
// Multi-dimensional expansion for batching
const kernel = torch.randn(1, 1, 3, 3);
const expanded = torch.broadcast_to(kernel, [32, 64, 3, 3]); // [32, 64, 3, 3]
See Also
- PyTorch torch.broadcast_to()
- broadcast_shapes - Compute broadcast shape from multiple shapes
- broadcast_tensors - Broadcast multiple tensors to common shape
- expand - Alternative expansion method