torch.memory.resetPeak
function resetPeak(): void
Resets the peak memory usage counter.
Clears the recorded maximum memory usage, allowing you to track peak usage for a specific phase or operation. Useful for profiling different parts of your application separately.
Behavior:
- Resets peak counter to current memory usage
- Active tensors are unaffected
- Pooled memory is unaffected
- Use before profiling sections to measure peak for that section
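The behavior above can be illustrated with a minimal stand-in for the counter (a hypothetical `PeakTracker` class for illustration only, not the torch.js internals): resetting snaps the peak back to current usage without freeing anything.

```javascript
// Hypothetical model of the peak-counter semantics; not the real implementation.
class PeakTracker {
  constructor() {
    this.currentBytes = 0;  // memory currently in use
    this.peakBytes = 0;     // high-water mark since last reset
  }
  allocate(bytes) {
    this.currentBytes += bytes;
    this.peakBytes = Math.max(this.peakBytes, this.currentBytes);
  }
  free(bytes) {
    this.currentBytes -= bytes;
  }
  resetPeak() {
    // Peak resets to *current* usage, not to zero; no memory is freed.
    this.peakBytes = this.currentBytes;
  }
}

const tracker = new PeakTracker();
tracker.allocate(1000);
tracker.allocate(500);
tracker.free(500);
console.log(tracker.peakBytes); // 1500 (high-water mark)
tracker.resetPeak();
console.log(tracker.peakBytes); // 1000 (current usage, not zero)
```

Note that because live allocations still count, the peak after a reset equals the memory held by active tensors at that moment.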
Use Cases:
- Profile memory usage per training epoch
- Measure peak for different models/algorithms
- Monitor memory between different phases
- Isolate peak usage for specific operations
Notes:
- Fast operation: O(1) time complexity
- Counter only: doesn't free any memory
- Immediate effect: peak is reset right away
- Affects stats(): the next stats() call reports the new peak baseline
Returns
void
Examples
// Profile memory for each epoch
for (let epoch = 0; epoch < 10; epoch++) {
  torch.memory.resetPeak();
  torch.scope(() => {
    for (let batch = 0; batch < 100; batch++) {
      const data = torch.randn([32, 256]);
      const output = model.forward(data);
      // ... loss and backward ...
    }
  });
  const epochStats = torch.memory.stats();
  console.log(`Epoch ${epoch} peak: ${torch.memory.formatBytes(epochStats.peakBytes)}`);
}
// Compare different algorithms
torch.memory.resetPeak();
const result1 = algorithmA(input);
const peakA = torch.memory.stats().peakBytes;
torch.memory.resetPeak();
const result2 = algorithmB(input);
const peakB = torch.memory.stats().peakBytes;
console.log(`Algorithm A peak: ${torch.memory.formatBytes(peakA)}`);
console.log(`Algorithm B peak: ${torch.memory.formatBytes(peakB)}`);
// Verify memory efficiency
torch.memory.resetPeak();
const data = loadLargeDataset();
const peak = torch.memory.stats().peakBytes;
console.log(`Loaded dataset using ${torch.memory.formatBytes(peak)}`);
See Also
- PyTorch equivalent: none (torch.js specific)
- stats - Get current memory statistics
- clearPool - Free pooled memory