torch.nn.FractionalMaxPool2d
class FractionalMaxPool2d extends Module
new FractionalMaxPool2d(kernel_size: number | [number, number], options?: FractionalMaxPoolOptions)
kernel_size(number | [number, number]) - readonly
output_size(number | [number, number] | null) - readonly
output_ratio(number | [number, number] | null) - readonly
return_indices(boolean)
2D fractional max pooling: stochastic max pooling over randomly placed pooling regions, used for regularization.
Applies max pooling over pooling regions whose placement is randomized to match the requested output_size or output_ratio. A new random pattern is drawn on each forward pass (stochastic). Useful for:
- Data augmentation during training (improves generalization)
- Regularization through stochastic downsampling
- Robustness to input variations (random pooling windows)
- Training neural networks with improved test accuracy
How it works: Unlike regular pooling, which slides a fixed grid of windows over the input, fractional pooling randomly places its pooling regions (subject to the requested output size), so each iteration produces a different downsampling pattern. This acts as an implicit regularizer; a sketch of one placement scheme follows.
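The sketch below is illustrative only, modeled on how PyTorch's backend generates random pooling intervals; randomPoolStarts is a hypothetical helper, not part of this library's API. It shows how a single random sample per dimension jitters where fixed-size windows land while the number of windows (and hence the output size) stays fixed.

// Hypothetical helper for illustration - not this library's API.
// For one spatial dimension: generate random start indices for outSize windows of
// width kernelSize over an input of length inSize (modeled on PyTorch's scheme).
function randomPoolStarts(inSize: number, kernelSize: number, outSize: number): number[] {
  const alpha = (inSize - kernelSize) / (outSize - 1); // average spacing between window starts
  const u = Math.random();                             // fresh uniform sample per forward pass
  const starts: number[] = [];
  for (let i = 0; i < outSize - 1; i++) {
    // The random offset u shifts every window; window widths stay kernelSize.
    starts.push(Math.floor((i + u) * alpha) - Math.floor(u * alpha));
  }
  starts.push(inSize - kernelSize);                     // last window is pinned to the input edge
  return starts;                                        // window i covers [starts[i], starts[i] + kernelSize)
}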
Parameters:
- kernel_size: size of the pooling window (e.g., 2 or [2, 2])
- output_size: desired output spatial dimensions (e.g., [7, 7])
- output_ratio: downsampling ratio relative to the input (e.g., [0.5, 0.5] = half resolution); the resulting output size is sketched below
One of output_size or output_ratio must be specified.
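For reference, PyTorch derives the output size from output_ratio as floor(input_size * ratio) per spatial dimension; the examples on this page assume the same rule for this library (an assumption, so verify against its typings).

// Assumed rule (matches PyTorch): outSize = floor(inSize * ratio) per spatial dimension.
const inH = 56, inW = 56;
const ratio: [number, number] = [0.75, 0.75];
const outH = Math.floor(inH * ratio[0]); // 42
const outW = Math.floor(inW * ratio[1]); // 42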
- Stochastic regularization: randomness acts as a form of data augmentation
- Improves generalization: random pooling patterns tend to improve test accuracy
- Different each iteration: each forward pass can use a different downsampling pattern
- Information loss: like any max pooling, non-max values are discarded; here the pooling pattern is also random
- Stochastic: output is non-deterministic unless the RNG is seeded
- Parameter requirement: must specify either output_size or output_ratio
- Training vs. inference: the stochasticity is most useful during training; for deterministic inference, fix the random pattern (e.g., by seeding, as sketched below)
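A minimal reproducibility sketch, assuming the library exposes a global torch.manual_seed that controls the pooling RNG as PyTorch does; check the library's RNG utilities for the actual name.

// Assumes torch.manual_seed exists and controls the pooling RNG (as in PyTorch).
const pool = new torch.nn.FractionalMaxPool2d(2, { output_size: [7, 7] });
const x = torch.randn([8, 32, 28, 28]);

torch.manual_seed(0);
const a = pool.forward(x); // [8, 32, 7, 7]
torch.manual_seed(0);
const b = pool.forward(x); // same pooling pattern as `a` when the RNG state matches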
Examples
// Stochastic pooling with fixed output size
const pool = new torch.nn.FractionalMaxPool2d(2, { output_size: [7, 7] }); // 2x2 windows, 7x7 output
const x = torch.randn([32, 64, 224, 224]);
const y = pool.forward(x); // [32, 64, 7, 7] - random pooling pattern

// Data augmentation with fractional pooling
const pool2 = new torch.nn.FractionalMaxPool2d(3, { output_ratio: [0.75, 0.75] }); // 3x3 windows, 75% of input size
const batch = torch.randn([16, 128, 56, 56]);
// Each forward pass draws a different random pooling pattern:
const y1 = pool2.forward(batch); // [16, 128, 42, 42]
const y2 = pool2.forward(batch); // [16, 128, 42, 42] - same shape, different pattern
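One more sketch, covering the return_indices property listed above. It assumes FractionalMaxPoolOptions accepts a return_indices flag and that forward() then returns an [output, indices] pair, mirroring PyTorch; verify against the library's typings.

// Assumption: return_indices is accepted in the options and forward() returns [output, indices].
const idxPool = new torch.nn.FractionalMaxPool2d(2, {
  output_size: [7, 7],
  return_indices: true, // also return the positions of the selected maxima
});
const input = torch.randn([4, 16, 28, 28]);
const [out, indices] = idxPool.forward(input); // out: [4, 16, 7, 7], indices: same shape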