torch.randperm
function randperm(n: number, options: TensorOptions = {}): Tensor
Returns a random permutation of integers from 0 to n - 1.
Generates a 1-D tensor containing a random permutation of the integers in [0, n). Each permutation is equally likely (uniform distribution over all n! orderings), produced with the Fisher-Yates shuffle algorithm. Essential for:
- Data shuffling: Shuffling dataset indices for training
- Cross-validation: Randomly splitting data into folds
- Experimental design: Random assignment to control/treatment groups
- Sampling without replacement: Selecting random subset without duplicates
- Index randomization: Randomizing order for algorithms
- Batch creation: Creating random batch indices for mini-batch learning
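The sampling-without-replacement use case can be sketched without the tensor API. `sampleWithoutReplacement` below is a hypothetical helper (not part of this library) that takes the first k elements of a partially shuffled index array:

```typescript
// Hypothetical helper: pick k distinct indices from [0, n) by running
// the first k steps of a Fisher-Yates shuffle over [0, 1, ..., n-1].
function sampleWithoutReplacement(n: number, k: number): number[] {
  const idx = Array.from({ length: n }, (_, i) => i);
  for (let i = 0; i < k; i++) {
    // Pick a uniform position in the not-yet-fixed suffix [i, n)
    const j = i + Math.floor(Math.random() * (n - i));
    [idx[i], idx[j]] = [idx[j], idx[i]];
  }
  return idx.slice(0, k); // k distinct integers, no duplicates
}

const sample = sampleWithoutReplacement(10, 4); // e.g. 4 distinct values in [0, 10)
```

With the library itself, the equivalent is `torch.randperm(n).slice(0, 0, k)`, as shown in the batching example below.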
Implementation: Generates the array [0, 1, 2, ..., n-1], then shuffles it in place with the Fisher-Yates algorithm, giving O(n) time complexity and a uniform distribution over all permutations.
- Uniform distribution: All n! permutations are equally likely
- No replacement: Each integer appears exactly once
- Range [0, n): Includes 0, excludes n
- Fisher-Yates: O(n) algorithm guarantees uniform shuffling
- 1D output: Always returns 1-D tensor regardless of input
- Device support: Runs on CPU; GPU execution depends on whether the backend supports the requested device
- Memory efficient: Linear space complexity O(n)
- Seed dependence: Results depend on global RNG seed; use manual_seed for reproducibility
- No gradient flow: randperm is non-differentiable (sampling operation)
- Large n: Time complexity is O(n); very large n can be slow
- Random state: Each call generates different permutation unless seed reset
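The Fisher-Yates procedure and its seed dependence can be sketched in plain TypeScript. `mulberry32` is a stand-in PRNG for illustration, not the library's actual generator:

```typescript
// Small deterministic PRNG (mulberry32), standing in for the global RNG.
function mulberry32(seed: number): () => number {
  let a = seed >>> 0;
  return () => {
    a = (a + 0x6d2b79f5) >>> 0;
    let t = Math.imul(a ^ (a >>> 15), 1 | a);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296; // uniform in [0, 1)
  };
}

// Fisher-Yates shuffle of [0, 1, ..., n-1]: O(n) time, and each of the
// n! orderings is equally likely when rand() is uniform.
function randpermSketch(n: number, rand: () => number): number[] {
  const perm = Array.from({ length: n }, (_, i) => i);
  for (let i = n - 1; i > 0; i--) {
    const j = Math.floor(rand() * (i + 1)); // uniform index in [0, i]
    [perm[i], perm[j]] = [perm[j], perm[i]];
  }
  return perm;
}

// Same seed -> same permutation: this is why manual_seed gives reproducibility.
const a = randpermSketch(5, mulberry32(42));
const b = randpermSketch(5, mulberry32(42));
console.log(a.join() === b.join()); // true
```

Note the loop draws j from [0, i] inclusive; drawing from [0, i) or [0, n) instead would bias the result away from uniformity.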
Parameters
n: number - Size of the permutation (generates integers 0 to n-1)
options: TensorOptions, optional - Optional settings:
- dtype: Data type (default: 'float32')
- device: Compute device (default: global device)
Returns
Tensor - 1-D tensor of shape [n] containing a random permutation of [0, 1, ..., n-1]
Examples
// Generate random permutation
const perm = torch.randperm(5); // e.g., [2, 0, 4, 1, 3]
// Shuffle dataset indices for training
const dataset_size = 1000;
const indices = torch.randperm(dataset_size); // Random order: [234, 567, 12, ...]
const shuffled_data = data.index_select(0, indices); // Reorder data by random indices
// Create random batches without replacement
const all_indices = torch.randperm(1000); // Shuffle all indices
const batch_indices = all_indices.slice(0, 0, 32); // First 32 indices form first batch
const batch_data = data.index_select(0, batch_indices); // Get corresponding data
// Cross-validation split
const n_folds = 5;
const n_samples = 1000;
const cv_indices = torch.randperm(n_samples);
const fold_size = n_samples / n_folds; // 200
for (let fold = 0; fold < n_folds; fold++) {
  const val_idx = cv_indices.slice(0, fold * fold_size, (fold + 1) * fold_size);
  const train_idx = torch.cat([
    cv_indices.slice(0, 0, fold * fold_size),
    cv_indices.slice(0, (fold + 1) * fold_size, n_samples)
  ]);
}
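The fold arithmetic above can be checked with plain arrays, independent of the tensor API. `foldSplit` is a hypothetical helper mirroring the slice/cat calls:

```typescript
// Sketch of the k-fold index arithmetic with plain arrays: the fold-th
// window of foldSize indices is validation, everything else is training.
function foldSplit(indices: number[], fold: number, foldSize: number) {
  const val = indices.slice(fold * foldSize, (fold + 1) * foldSize);
  const train = [
    ...indices.slice(0, fold * foldSize),
    ...indices.slice((fold + 1) * foldSize),
  ];
  return { val, train }; // disjoint by construction
}

const indices = Array.from({ length: 1000 }, (_, i) => i); // stand-in for randperm(1000)
const { val, train } = foldSplit(indices, 2, 200);
console.log(val.length, train.length); // 200 800
```

Because the indices were shuffled once up front, every fold's validation set is a disjoint random 20% of the data.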
// Random assignment to control/treatment groups
const n_subjects = 100;
const order = torch.randperm(n_subjects);
const control_group = order.slice(0, 0, 50); // First 50 get control
const treatment_group = order.slice(0, 50, 100); // Last 50 get treatment
// Data augmentation: random row reordering
const data = torch.randn(1000, 128);
const random_order = torch.randperm(1000);
const augmented = data.index_select(0, random_order); // Shuffle rows randomly
// Generate with specific dtype
const perm_int32 = torch.randperm(100, { dtype: 'int32' });
See Also
- PyTorch torch.randperm()
- randint - Generate random integers (with replacement)
- manual_seed - Set random seed for reproducibility
- index_select - Select elements by indices (often used with randperm)
- argsort - Get sorting indices (deterministic alternative to random ordering)