torch.nn.is_uninitialized_buffer
function is_uninitialized_buffer(value: unknown): value is UninitializedBuffer

Checks if a value is an UninitializedBuffer.
Returns true if the value is an instance of UninitializedBuffer, which represents a buffer that has not yet been materialized with a concrete shape. Uninitialized buffers are used in lazy modules to defer buffer creation until the first forward pass, when the input shape is known. Unlike parameters, buffers are not trainable, but they may be registered on modules for state management (e.g., running statistics in normalization layers). Useful for:
- Type narrowing: Determine if a buffer is lazy or materialized
- Lazy module implementation: Check buffers during module initialization
- Debugging: Verify buffer initialization state
- State management: Handle lazy buffers in model state_dict operations
- Shape inference: Defer buffer creation until input dimensions are known
Notes
- Type guard: acts as a TypeScript type guard with proper narrowing
- Instance check: uses instanceof for runtime type checking
- Non-trainable: buffers do not participate in gradient computation
- Lazy initialization: part of the lazy module pattern for unknown input shapes
- Persistence: buffers can be configured as persistent (saved in state_dict) or not
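The type-guard and instanceof behavior described above can be illustrated with a minimal, self-contained sketch. The classes below are simplified stand-ins for illustration only, not the library's actual Buffer/UninitializedBuffer implementations:

```typescript
// Simplified stand-ins for the real classes (illustration only).
class Buffer {
  constructor(public shape: number[], public persistent: boolean = true) {}
}

class UninitializedBuffer {
  constructor(public dtype: string = 'float32') {}
  // Materialization swaps the lazy placeholder for a concrete Buffer.
  materialize(shape: number[]): Buffer {
    return new Buffer(shape);
  }
}

// The `value is UninitializedBuffer` return type lets the compiler
// narrow `value` inside an `if` block; the runtime check is instanceof.
function is_uninitialized_buffer(value: unknown): value is UninitializedBuffer {
  return value instanceof UninitializedBuffer;
}

const lazy = new UninitializedBuffer();
console.log(is_uninitialized_buffer(lazy)); // true
console.log(is_uninitialized_buffer(lazy.materialize([10, 20]))); // false
console.log(is_uninitialized_buffer(42)); // false
```

Because the guard narrows the type, code in the true branch can call UninitializedBuffer-only members (such as materialize) without a cast.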
Parameters
value: unknown – The value to check
Returns
value is UninitializedBuffer – true if value is an UninitializedBuffer instance, false otherwise

Examples
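A common state-management task is skipping lazy buffers when collecting model state, since they have no concrete shape to serialize yet. Below is a minimal self-contained sketch of that pattern; the Buffer and UninitializedBuffer classes here are simplified stand-ins, not the library's actual implementations:

```typescript
// Simplified stand-ins for the real classes (illustration only).
class Buffer {
  constructor(public shape: number[]) {}
}

class UninitializedBuffer {
  constructor(public dtype: string = 'float32') {}
}

function is_uninitialized_buffer(value: unknown): value is UninitializedBuffer {
  return value instanceof UninitializedBuffer;
}

// Collect only materialized buffers into a state-dict-like map,
// skipping lazy ones that cannot be serialized yet.
function collectState(
  buffers: Map<string, Buffer | UninitializedBuffer>
): Map<string, number[]> {
  const state = new Map<string, number[]>();
  for (const [name, buf] of buffers) {
    if (is_uninitialized_buffer(buf)) continue; // lazy: skip until materialized
    state.set(name, buf.shape); // buf is narrowed to Buffer here
  }
  return state;
}

const buffers = new Map<string, Buffer | UninitializedBuffer>([
  ['running_mean', new UninitializedBuffer()],
  ['weight_cache', new Buffer([4, 4])],
]);
console.log(collectState(buffers)); // only 'weight_cache' survives
```

The guard's narrowing also works in the negative direction: after the `continue`, the compiler knows buf must be a materialized Buffer, so `.shape` is accessible without a cast.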
// Check if a value is an uninitialized buffer
const uninit_buf = new torch.nn.UninitializedBuffer({ dtype: 'float32' });
torch.nn.is_uninitialized_buffer(uninit_buf); // true
// After materialization, it becomes a Buffer
const buf = uninit_buf.materialize([10, 20]);
torch.nn.is_uninitialized_buffer(buf); // false

// Type guard in module initialization
function initializeBuffers(buffers: unknown[]) {
  for (const b of buffers) {
    if (torch.nn.is_uninitialized_buffer(b)) {
      // Lazy buffer: shape is determined at the first forward pass
      console.log('Found lazy buffer, shape will be determined at first forward pass');
    } else {
      // Regular materialized buffer; b is still `unknown` here, so assert the type
      console.log('Buffer is already initialized with shape:', (b as torch.nn.Buffer).shape);
    }
  }
}

// Lazy BatchNorm with uninitialized running statistics
const lazy_bn = new torch.nn.LazyBatchNorm1d();
// Internal running_mean and running_var buffers are uninitialized
for (const [name, buf] of lazy_bn.named_buffers()) {
  if (torch.nn.is_uninitialized_buffer(buf)) {
    console.log(`${name} will be materialized on first forward pass`);
  }
}

See Also
- [PyTorch torch.nn.UninitializedBuffer (type check)](https://pytorch.org/docs/stable/generated/torch.nn.UninitializedBuffer.html)
- UninitializedBuffer - The uninitialized buffer class
- is_uninitialized_parameter - Check for uninitialized parameters
- Buffer - Materialized buffer class