Safetensors
Safetensors is a new, simple format for storing tensors safely (as opposed to pickle) while remaining fast (zero-copy). Safetensors is really fast 🚀.
Installation
with pip:
pip install safetensors
with conda:
conda install -c huggingface safetensors
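To verify the installation, you can import the package from Python; assuming the package exposes a __version__ attribute, this prints the installed version:

import safetensors

# Confirms the package is importable and shows which version was installed.
print(safetensors.__version__)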
Usage
Load tensors
from safetensors import safe_open

tensors = {}
# Open the file and copy every tensor onto device 0 (framework="pt" returns torch tensors).
with safe_open("model.safetensors", framework="pt", device=0) as f:
    for k in f.keys():
        tensors[k] = f.get_tensor(k)
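If you just want every tensor loaded onto a single device, the convenience helper in safetensors.torch does the same thing in one call. A minimal sketch, assuming the PyTorch bindings are available:

from safetensors.torch import load_file

# Returns a dict mapping each tensor name to a torch.Tensor on the chosen device.
state_dict = load_file("model.safetensors", device="cpu")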
Loading only part of the tensors (interesting when running on multiple GPUs)
from safetensors import safe_open

tensors = {}
with safe_open("model.safetensors", framework="pt", device=0) as f:
    # get_slice gives lazy access: only the indexed part is read from disk.
    tensor_slice = f.get_slice("embedding")
    vocab_size, hidden_dim = tensor_slice.get_shape()
    tensor = tensor_slice[:, :hidden_dim]
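To make the multi-GPU case concrete, the sketch below splits the "embedding" tensor row-wise across two processes so that each one reads only its own shard from disk. The rank and world_size values are placeholders for this sketch; in practice you would take them from your distributed setup (for example torch.distributed), and device=rank assumes one GPU per process.

from safetensors import safe_open

rank, world_size = 0, 2  # placeholder values for this sketch

with safe_open("model.safetensors", framework="pt", device=rank) as f:
    tensor_slice = f.get_slice("embedding")
    vocab_size, hidden_dim = tensor_slice.get_shape()
    rows_per_rank = vocab_size // world_size
    start = rank * rows_per_rank
    end = vocab_size if rank == world_size - 1 else start + rows_per_rank
    shard = tensor_slice[start:end]  # only these rows are read from disk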
Save tensors
import torch
from safetensors.torch import save_file

# Tensor names map to the tensors to serialize.
tensors = {
    "embedding": torch.zeros((2, 2)),
    "attention": torch.zeros((2, 3))
}
save_file(tensors, "model.safetensors")
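The same function works for a real model's state dict. A minimal sketch, using a stand-in nn.Linear and illustrative metadata (metadata keys and values must both be strings; the key used here is just an example):

import torch
from safetensors.torch import save_file

model = torch.nn.Linear(4, 2)
# save_file expects contiguous tensors, so make each entry contiguous before saving.
state_dict = {k: v.contiguous() for k, v in model.state_dict().items()}
save_file(state_dict, "linear.safetensors", metadata={"format": "pt"})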
Format
Let’s say you have a safetensors file named model.safetensors; it will have the following internal format:
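In broad terms, the file starts with an 8-byte unsigned little-endian integer N, followed by N bytes of UTF-8 JSON header, followed by a flat byte buffer holding the tensor data. The JSON header maps each tensor name to its dtype, shape, and data_offsets (begin and end byte positions inside the buffer), and may also contain an optional __metadata__ entry with free-form string-to-string metadata. The sketch below reads the header by hand to illustrate the layout; it only inspects the metadata and does not reconstruct the tensors:

import json
import struct

with open("model.safetensors", "rb") as f:
    # First 8 bytes: length of the JSON header, as an unsigned little-endian 64-bit integer.
    (header_size,) = struct.unpack("<Q", f.read(8))
    # Next header_size bytes: JSON describing every tensor in the file.
    header = json.loads(f.read(header_size))

for name, info in header.items():
    if name == "__metadata__":  # optional string-to-string metadata block
        continue
    print(name, info["dtype"], info["shape"], info["data_offsets"])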
Featured Projects
Safetensors is widely used at leading AI enterprises such as Hugging Face, EleutherAI, and StabilityAI. Here is a non-exhaustive list of projects that use safetensors: