torch.memory.clearPool
`function clearPool(): void`

Clears the buffer pool, releasing all pooled (unused) buffers.
The buffer pool holds onto GPU/CPU memory from destroyed tensors for reuse, reducing allocation overhead. clearPool() frees this memory back to the system while keeping active tensors intact. Useful for long-running applications or when switching between memory-intensive tasks.
Behavior:
- Releases memory from destroyed tensors held in the pool
- Active tensors are NOT affected
- Fragmentation is reduced
- Future allocations may require new GPU/CPU allocation
- Synchronous operation (blocks until complete)
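The pool mechanics described above can be sketched in plain TypeScript. This is a simplified model of the behavior, not the actual torch.js allocator; the `BufferPool` class and its methods are illustrative only:

```typescript
// Simplified model of a buffer pool (hypothetical; not the real torch.js allocator).
class BufferPool {
  private pooled: ArrayBuffer[] = [];      // buffers from destroyed tensors, kept for reuse
  private active = new Set<ArrayBuffer>(); // buffers backing live tensors

  alloc(bytes: number): ArrayBuffer {
    // Reuse a pooled buffer of the right size if one exists, else allocate fresh.
    const i = this.pooled.findIndex((b) => b.byteLength === bytes);
    const buf = i >= 0 ? this.pooled.splice(i, 1)[0] : new ArrayBuffer(bytes);
    this.active.add(buf);
    return buf;
  }

  free(buf: ArrayBuffer): void {
    // Destroying a tensor returns its buffer to the pool, not to the system.
    this.active.delete(buf);
    this.pooled.push(buf);
  }

  clearPool(): void {
    // Drop only pooled buffers; buffers still in `active` are untouched.
    this.pooled = [];
  }

  pooledBytes(): number {
    return this.pooled.reduce((n, b) => n + b.byteLength, 0);
  }
}
```

In this model, freeing a buffer leaves its memory pooled (`pooledBytes()` stays nonzero) until `clearPool()` drops it, while active buffers keep their memory throughout, mirroring the behavior list above.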
Use Cases:
- Free memory between major tasks or phases
- Reduce memory footprint when idle
- Switch between memory-intensive operations
- Manual memory management for constrained devices
- Clean up after batch processing
Notes:
- Only pooled memory: active tensors are unaffected
- Synchronous: blocks until buffers are freed
- Performance cost: subsequent allocations must request fresh memory instead of reusing pooled buffers
- System-dependent: freed memory may not return to the OS immediately
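Because clearing trades future allocation speed for a smaller footprint, one hedged pattern is to clear only when pooled memory crosses a budget. A sketch (the `maybeClearPool` helper and the threshold are illustrative, not part of the torch.js API; the stats value and clear function are injected so the logic stands alone):

```typescript
// Hypothetical helper: clear the pool only when pooled memory exceeds a budget.
// `pooledBytes` would come from torch.memory.stats().pooledBytes, and `clear`
// would be torch.memory.clearPool; both are passed in so this logic is standalone.
function maybeClearPool(
  pooledBytes: number,
  budgetBytes: number,
  clear: () => void,
): boolean {
  if (pooledBytes <= budgetBytes) return false; // under budget: keep the pool warm
  clear(); // over budget: pay the synchronous cost once
  return true;
}
```

In application code this might be called as `maybeClearPool(torch.memory.stats().pooledBytes, budget, torch.memory.clearPool)` between tasks.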
Returns
void
Examples
```ts
// Free memory between tasks
const before = torch.memory.stats();
console.log(`Before: ${torch.memory.formatBytes(before.pooledBytes)}`);

torch.scope(() => {
  const big = torch.randn([10000, 10000]);
  // ...process...
}); // big is destroyed, memory goes to pool

const mid = torch.memory.stats();
console.log(`Pooled: ${torch.memory.formatBytes(mid.pooledBytes)}`);

torch.memory.clearPool(); // Free the pooled memory

const after = torch.memory.stats();
console.log(`After: ${torch.memory.formatBytes(after.pooledBytes)}`);
```
```ts
// Switching between workloads
// Task 1: Large model training
trainLargeModel();
torch.memory.clearPool(); // Free intermediate buffers

// Task 2: Small inference
runInference(); // Starts fresh without fragmentation
```
```ts
// Memory-constrained device
function processWithLimitedMemory(datasets: Data[]) {
  for (const data of datasets) {
    torch.scope(() => {
      process(data);
    });
    torch.memory.clearPool(); // Keep memory usage low
  }
}
```

See Also
- stats - Check pooled memory amount
- resetPeak - Reset peak memory counter
- scope - Automatic memory management