torch.serialization.serialize_to_zip
Serialize a tensor or state dict to PyTorch ZIP format (.pt/.pth files).
Creates a ZIP archive compatible with PyTorch's torch.load() for loading in Python. The format preserves tensor metadata (dtype and shape) for full compatibility. Useful for:
- Exporting models trained in torch.js to PyTorch
- Sharing models with Python ecosystems
- Creating checkpoints for cross-framework workflows
- Backward compatibility with existing PyTorch pipelines
Note: For saving in torch.js environments, use safetensors format instead (faster, safer, more portable). This format is primarily for PyTorch compatibility.
- PyTorch compatibility: Creates .pt/.pth compatible with torch.load()
- Metadata preserved: Shape, dtype, and other metadata maintained
- State dict format: Objects with tensor values are serialized as OrderedDict
- Input types: Accepts either a single tensor or a state dict
- Format choice: For torch.js, prefer safetensors (faster, safer)
- Large models: May use significant memory during serialization
- Python only: Output is designed for PyTorch/Python loading
Parameters
- The tensor or state dict (object with string keys and tensor values) to serialize
Returns
Promise<Blob> – Promise resolving to the serialized ZIP file as a Blob
Examples
// Serialize a single tensor
const weights = torch.randn(100, 100);
const blob = await torch.serialize_to_zip(weights);
// Save blob to file or download
// Serialize a state dict (recommended for models)
const state_dict = {
  'layer1.weight': torch.randn(64, 32),
  'layer1.bias': torch.randn(64),
  'layer2.weight': torch.randn(10, 64),
  'layer2.bias': torch.randn(10)
};
const state_blob = await torch.serialize_to_zip(state_dict);
// Export model for PyTorch
const model_blob = await torch.serialize_to_zip(model.state_dict());
// Download or send to Python for torch.load()
See Also
- PyTorch torch.save()
- deserialize_from_zip - Load PyTorch .pt/.pth files
- serialize_to_safetensors - Serialize to safer safetensors format