torch.Tensor.index_put
Tensor.index_put(indices: (Tensor | null)[], values: Tensor, options?: IndexPutOptions): Tensor<DynamicShape, D, Dev>
Tensor.index_put(indices: (Tensor | null)[], values: Tensor, accumulateOrOptions?: boolean | IndexPutOptions, options?: IndexPutOptions): Tensor<DynamicShape, D, Dev>
Creates a new tensor with values updated at specified indices (non-destructive).
Replaces or accumulates values at the specified index locations. When accumulation is enabled, values are added to the existing entries instead of replacing them, which makes this method useful for scatter-add operations. Supports advanced indexing with multiple index tensors. Returns a new tensor; the original is unchanged.
Use Cases:
- Scatter-add operations in loss computations
- Masked assignment in neural networks
- Accumulating gradients at specific indices
- Building sparse tensor operations
- One-hot encoding and related operations
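As an illustration of the one-hot use case above, here is a hedged plain-TypeScript sketch of what index_put-style scattering computes; the real call would be on a Tensor (roughly `zeros(n, numClasses).index_put([arange(n), tensor(labels)], ones(n))`), and `oneHot` is a hypothetical helper, not part of this API.

```typescript
// Hypothetical helper: builds one-hot rows the way an index_put scatter would.
// labels[i] is the class index for sample i.
function oneHot(labels: number[], numClasses: number): number[][] {
  // Start from an all-zeros matrix (one row per label)
  const out = labels.map(() => new Array<number>(numClasses).fill(0));
  // Scatter a 1 into position (i, labels[i]) for each sample
  labels.forEach((cls, i) => {
    out[i][cls] = 1;
  });
  return out;
}

const encoded = oneHot([2, 0, 1], 3);
// encoded → [[0, 0, 1], [1, 0, 0], [0, 1, 0]]
```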
Notes
- Index count: Up to one index tensor per dimension; trailing dimensions without an index tensor are updated as whole slices (see the embedding example below)
- Broadcast: Values must broadcast with the indexed positions
- Accumulate: Default replaces; set true to add values instead
- Non-destructive: Original tensor unchanged, returns a new tensor
- Batch indices: Can use a batch of indices for advanced indexing
- Index shape: Index tensors must have compatible/broadcastable shapes
- Out of bounds: Out-of-bounds indices may produce undefined behavior
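To make the replace-vs-accumulate semantics in the notes above concrete, here is an illustrative plain-array model for the 2-D case. This is a sketch of the semantics only, not the library's implementation; `indexPut2d` is a hypothetical stand-in for the Tensor method.

```typescript
// Illustrative model of index_put semantics on a 2-D "tensor" of plain arrays.
function indexPut2d(
  base: number[][],
  rows: number[],
  cols: number[],
  values: number[],
  accumulate = false
): number[][] {
  const out = base.map((row) => [...row]); // non-destructive: copy first
  for (let i = 0; i < rows.length; i++) {
    if (accumulate) {
      out[rows[i]][cols[i]] += values[i]; // scatter-add: duplicates sum up
    } else {
      out[rows[i]][cols[i]] = values[i]; // replace
    }
  }
  return out;
}

const zeros = [[0, 0, 0], [0, 0, 0], [0, 0, 0]];
const replaced = indexPut2d(zeros, [0, 2], [1, 2], [1, 2]);
// replaced[0][1] === 1, replaced[2][2] === 2; zeros is unchanged

const ones = [[1, 1, 1], [1, 1, 1], [1, 1, 1]];
const added = indexPut2d(ones, [0, 0], [1, 1], [2, 3], true);
// both updates hit (0, 1): added[0][1] === 1 + 2 + 3 === 6
```

Note that with accumulate enabled, duplicate index pairs add up rather than overwrite each other; without it, the last write wins.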
Parameters
indices (Tensor | null)[] - Array of index tensors (one per indexed dimension)
values Tensor - Values to put/accumulate at the specified indices
accumulateOrOptions boolean | IndexPutOptions optional - Pass true to accumulate (add) instead of replace, or an options object
options IndexPutOptions optional
Returns
Tensor<DynamicShape, D, Dev> - New tensor with updated values at the specified indices
Examples
// Basic replacement
const x = torch.zeros(3, 3);
const result = x.index_put([torch.tensor([0, 2]), torch.tensor([1, 2])], torch.tensor([1, 2]));
// result[0, 1] = 1, result[2, 2] = 2
// Accumulation (scatter-add)
const y = torch.ones(3, 3);
const accumulated = y.index_put(
[torch.tensor([0, 0]), torch.tensor([1, 1])],
torch.tensor([2, 3]),
true // accumulate
);
// Both index pairs refer to position (0, 1), so the updates add up:
// accumulated[0, 1] = 1 + 2 + 3 = 6
// Scatter-add for loss gradients
const embeddings = torch.randn(1000, 64);
const indices = torch.tensor([0, 1, 2, 1, 0]); // Which embeddings to update
const grads = torch.randn(5, 64); // Gradients for each sample
const grad_embeddings = torch.zeros_like(embeddings).index_put(
[indices], grads, true
); // Accumulate gradients
See Also
- PyTorch torch.index_put() (or tensor.index_put())
- index_put_ - In-place version (modifies original)
- scatter - Scatter operation (similar functionality)
- scatter_ - In-place scatter
- gather - Reverse operation (gather instead of scatter)