torch.nn.LazyInstanceNorm1d
class LazyInstanceNorm1d extends _LazyInstanceNorm
Lazy Instance Normalization 1D: automatically infers the number of channels for instance normalization.
Extends InstanceNorm1d with lazy initialization: the number of channels is inferred from the first input. Instance normalization normalizes each channel independently per sample (not across the batch). Essential for:
- Style transfer (removes instance-specific contrast while preserving content)
- Generative models (SPADE, normalizing each instance independently)
- When you don't want batch statistics (online learning, single-sample inference)
InstanceNorm is simpler than BatchNorm: it normalizes each sample's channels independently, without batch statistics. The lazy variant automatically determines num_features from input.shape[1].
- Instance-level: each sample is normalized independently, with no statistics shared across the batch
- No batch statistics: useful for single-sample inference and online learning
- Style transfer: commonly used in style transfer networks
- Generative models: common in generative adversarial networks
- Input must be 2D or 3D (batch + channels [+ sequence length])
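To make the "each channel independently per sample" rule concrete, here is a minimal library-free sketch of the instance-norm computation on nested arrays shaped [batch, channels, length]; the function name `instanceNorm1d` and the array representation are illustrative, not the library's API:

```javascript
// Instance normalization for a [batch, channels, length] nested array.
// Each (sample, channel) slice gets its own mean and variance —
// no statistics are shared across the batch.
function instanceNorm1d(x, eps = 1e-5) {
  return x.map(sample =>          // iterate over the batch
    sample.map(channel => {       // iterate over channels
      const n = channel.length;
      const mean = channel.reduce((s, v) => s + v, 0) / n;
      const variance =
        channel.reduce((s, v) => s + (v - mean) ** 2, 0) / n;
      return channel.map(v => (v - mean) / Math.sqrt(variance + eps));
    })
  );
}

const out = instanceNorm1d([[[1, 2, 3, 4]]]);
// out[0][0] now has (approximately) zero mean and unit variance
```

Note that the statistics are computed over the length dimension only; a second sample in the batch would be normalized with entirely separate means and variances.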
Examples
// Lazy InstanceNorm for sequences
const lazy_in = new torch.nn.LazyInstanceNorm1d();
const x = torch.randn([32, 128, 50]); // [batch, channels, length]
const normalized = lazy_in.forward(x); // First call infers num_features = 128
// Each of the 32 samples is normalized per channel over its 50 time steps,
// using only that sample's own statistics
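The lazy-initialization idea itself can be illustrated without the library. The class below (`LazyInstanceNormSketch`, a hypothetical name) defers allocating its per-channel affine parameters until the first forward call, when the channel count is read from index 1 of the input shape — a sketch of the technique under that assumption, not the library's actual implementation:

```javascript
// Hypothetical sketch: parameters are created lazily on first forward,
// once the channel count is known. Input: [batch, channels, length]
// nested arrays.
class LazyInstanceNormSketch {
  constructor(eps = 1e-5) {
    this.eps = eps;
    this.numFeatures = null; // unknown until the first forward call
    this.weight = null;      // per-channel scale, allocated lazily
    this.bias = null;        // per-channel shift, allocated lazily
  }

  forward(x) {
    if (this.numFeatures === null) {
      this.numFeatures = x[0].length; // infer channels from shape[1]
      this.weight = new Array(this.numFeatures).fill(1);
      this.bias = new Array(this.numFeatures).fill(0);
    }
    return x.map(sample =>
      sample.map((channel, c) => {
        const n = channel.length;
        const mean = channel.reduce((s, v) => s + v, 0) / n;
        const variance =
          channel.reduce((s, v) => s + (v - mean) ** 2, 0) / n;
        return channel.map(
          v =>
            this.weight[c] * ((v - mean) / Math.sqrt(variance + this.eps)) +
            this.bias[c]
        );
      })
    );
  }
}

const m = new LazyInstanceNormSketch();
m.forward([[[1, 2], [3, 4]]]); // first call fixes numFeatures at 2
```

After the first call the module behaves like an ordinary instance norm; subsequent inputs must have the same channel count.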