Serialization
- metatensor.save(file: str | Path | BinaryIO, data: TensorMap | TensorBlock | Labels, use_numpy=False)

  Save the given data (one of TensorMap, TensorBlock, or Labels) to the given file. TensorMap objects are serialized using numpy's .npz format, i.e. a ZIP file without compression (storage method is STORED), where each file inside the archive is stored as a .npy array. See the C API documentation for more information on the format.

  - Parameters:
    - file (str | Path | BinaryIO) – where to save the data. This can be a string, a pathlib.Path containing the path to the file to write, or a file-like object that should be opened in binary mode.
    - data (TensorMap | TensorBlock | Labels) – data to serialize and save
    - use_numpy – should we use numpy or the native serializer implementation? Numpy should be able to process more dtypes than the native implementation, which is limited to float64, but the native implementation is usually faster than going through numpy. This is ignored when saving Labels.
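As a quick illustration, here is a minimal sketch of saving data both to a path and to an already-open file. The TensorMap construction (the Labels, TensorBlock, and TensorMap constructors) and the file names are assumptions made for the example, not part of this reference.

```python
import numpy as np
import metatensor
from metatensor import Labels, TensorBlock, TensorMap

# Build a minimal TensorMap with a single block of shape (3 samples, 2 properties).
# The constructors used here are assumed for the sake of the example.
block = TensorBlock(
    values=np.zeros((3, 2)),
    samples=Labels(["system"], np.array([[0], [1], [2]], dtype=np.int32)),
    components=[],
    properties=Labels(["channel"], np.array([[0], [1]], dtype=np.int32)),
)
tensor = TensorMap(Labels(["key"], np.array([[0]], dtype=np.int32)), [block])

# Save to a path given as a string (a pathlib.Path works the same way)
metatensor.save("example.npz", tensor)

# Or save to a file-like object opened in binary mode
with open("example.npz", "wb") as file:
    metatensor.save(file, tensor, use_numpy=False)
```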
- metatensor.load(file: str | Path | BinaryIO, use_numpy=False) → TensorMap

  Load a previously saved TensorMap from the given file. TensorMap objects are serialized using numpy's .npz format, i.e. a ZIP file without compression (storage method is STORED), where each file inside the archive is stored as a .npy array. See the C API documentation for more information on the format.

  - Parameters:
    - file (str | Path | BinaryIO) – file to load: this can be a string, a pathlib.Path containing the path to the file to load, or a file-like object that should be opened in binary mode.
    - use_numpy – should we use numpy or the native implementation? Numpy should be able to process more dtypes than the native implementation, which is limited to float64, but the native implementation is usually faster than going through numpy.
  - Return type: TensorMap
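Reading the data back mirrors save(); a short sketch, assuming the example.npz file from the previous example exists:

```python
import metatensor

# Load the serialized TensorMap back. use_numpy=False (the default) uses the
# native reader; use_numpy=True goes through numpy and supports more dtypes.
tensor = metatensor.load("example.npz")
print(tensor.keys)

# File-like objects opened in binary mode work as well
with open("example.npz", "rb") as file:
    tensor = metatensor.load(file, use_numpy=True)
```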
- metatensor.load_block(file: str | Path | BinaryIO, use_numpy=False) → TensorBlock

  Load a previously saved TensorBlock from the given file.

  - Parameters:
    - file (str | Path | BinaryIO) – file to load: this can be a string, a pathlib.Path containing the path to the file to load, or a file-like object that should be opened in binary mode.
    - use_numpy – should we use numpy or the native implementation? Numpy should be able to process more dtypes than the native implementation, which is limited to float64, but the native implementation is usually faster than going through numpy.
  - Return type: TensorBlock
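A single TensorBlock follows the same pattern; in this sketch the block construction and the file name are assumptions made for the example:

```python
import numpy as np
import metatensor
from metatensor import Labels, TensorBlock

# A standalone TensorBlock can be saved and loaded without wrapping it in a TensorMap
block = TensorBlock(
    values=np.ones((2, 1)),
    samples=Labels(["system"], np.array([[0], [1]], dtype=np.int32)),
    components=[],
    properties=Labels(["channel"], np.array([[0]], dtype=np.int32)),
)

metatensor.save("block.npz", block)
loaded = metatensor.load_block("block.npz")
print(loaded.values.shape)  # (2, 1)
```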
- metatensor.load_labels(file: str | Path | BinaryIO) → Labels

  Load previously saved Labels from the given file.

  - Parameters:
    - file (str | Path | BinaryIO) – file to load: this can be a string, a pathlib.Path containing the path to the file to load, or a file-like object opened in binary mode.
  - Return type: Labels
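Labels can also be serialized on their own; a minimal sketch (the Labels constructor and the file name are assumed for the example):

```python
import numpy as np
import metatensor
from metatensor import Labels

# use_numpy is ignored when saving Labels, so there is no extra argument here
labels = Labels(["system", "atom"], np.array([[0, 0], [0, 1], [1, 0]], dtype=np.int32))

metatensor.save("labels.npz", labels)
loaded = metatensor.load_labels("labels.npz")
print(loaded.names)  # ['system', 'atom']
```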
- metatensor.io.save_buffer(data: TensorMap | TensorBlock | Labels, use_numpy=False) → memoryview

  Save the given data (one of TensorMap, TensorBlock, or Labels) to an in-memory buffer.

  - Parameters:
    - data (TensorMap | TensorBlock | Labels) – data to serialize and save
    - use_numpy – should we use numpy or the native serializer implementation?
  - Return type: memoryview
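The buffer-based functions avoid the filesystem entirely; a small sketch serializing a Labels object to memory (the Labels constructor is an assumption made for the example):

```python
import numpy as np
import metatensor.io
from metatensor import Labels

# Any TensorMap, TensorBlock, or Labels can be serialized to an in-memory buffer
labels = Labels(["system"], np.array([[0], [1]], dtype=np.int32))

buffer = metatensor.io.save_buffer(labels)  # returns a memoryview
raw = bytes(buffer)                         # copy into bytes to send or store elsewhere
print(type(buffer).__name__, len(raw))
```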
- metatensor.io.load_buffer(buffer: bytes | bytearray | memoryview, use_numpy=False) → TensorMap

  Load a previously saved TensorMap from an in-memory buffer.

  - Parameters:
    - buffer (bytes | bytearray | memoryview) – in-memory buffer containing a serialized TensorMap
    - use_numpy – should we use numpy or the native implementation?
  - Return type: TensorMap
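A round trip through memory looks like this; the sketch assumes example.npz was written by an earlier save() call:

```python
import metatensor
import metatensor.io

tensor = metatensor.load("example.npz")

# Serialize to a memoryview, then rebuild an identical TensorMap from it;
# bytes or bytearray objects are accepted as well.
buffer = metatensor.io.save_buffer(tensor)
restored = metatensor.io.load_buffer(buffer)

assert restored.keys == tensor.keys
```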
- metatensor.io.load_custom_array(path: str | Path, create_array: CreateArrayCallback) → TensorMap

  Load a previously saved TensorMap from the given path using a custom array creation callback.

  This is an advanced functionality which most users should not need.

  This function lets you specify the kind of array to use when loading the data through the create_array callback. This callback should take three arguments: a pointer to the shape, the number of elements in the shape, and a pointer to the mts_array_t to be filled. metatensor.io.create_numpy_array() and metatensor.io.create_torch_array() can be used to load data into numpy and torch arrays, respectively.
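For instance, passing the numpy callback explicitly reproduces what metatensor.load() does by default; the file name here is an assumption made for the example:

```python
import metatensor.io

# Request numpy arrays for the values of every block in the file
tensor = metatensor.io.load_custom_array(
    "example.npz", metatensor.io.create_numpy_array()
)
print(type(tensor.block(0).values))  # <class 'numpy.ndarray'>
```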
- metatensor.io.load_buffer_custom_array(buffer: bytes | bytearray | memoryview, create_array: CreateArrayCallback) → TensorMap

  Load a previously saved TensorMap from the given buffer using a custom array creation callback.

  This is an advanced functionality which most users should not need.

  This function lets you specify the kind of array to use when loading the data through the create_array callback. This callback should take three arguments: a pointer to the shape, the number of elements in the shape, and a pointer to the mts_array_t to be filled. metatensor.io.create_numpy_array() and metatensor.io.create_torch_array() can be used to load data into numpy and torch arrays, respectively.

  - Parameters:
    - buffer (bytes | bytearray | memoryview) – in-memory buffer containing a saved TensorMap
    - create_array (CreateArrayCallback) – callback used to create arrays as needed
  - Return type: TensorMap
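The same callback mechanism works from a buffer; the sketch below loads the data into torch tensors and assumes both that torch is installed and that example.npz exists:

```python
import metatensor.io

# Read the raw serialized bytes, then deserialize straight into torch tensors
with open("example.npz", "rb") as file:
    buffer = file.read()

tensor = metatensor.io.load_buffer_custom_array(
    buffer, metatensor.io.create_torch_array()
)
print(tensor.block(0).values.dtype)  # torch.float64, stored on CPU
```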
- metatensor.io.load_block_buffer(buffer: bytes | bytearray | memoryview, use_numpy=False) → TensorBlock

  Load a previously saved TensorBlock from an in-memory buffer.

  - Parameters:
    - buffer (bytes | bytearray | memoryview) – in-memory buffer containing a serialized TensorBlock
    - use_numpy – should we use numpy or the native implementation?
  - Return type: TensorBlock
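The TensorBlock variant mirrors load_buffer(); this sketch assumes block.npz was written by an earlier save() call:

```python
import metatensor
import metatensor.io

block = metatensor.load_block("block.npz")

# Serialize the block to memory and rebuild it from the buffer
buffer = metatensor.io.save_buffer(block)
restored = metatensor.io.load_block_buffer(buffer)

assert restored.values.shape == block.values.shape
```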
- metatensor.io.load_block_custom_array(path: str | Path, create_array: CreateArrayCallback) → TensorBlock

  Load a previously saved TensorBlock from the given path using a custom array creation callback.

  This is an advanced functionality which most users should not need.

  This function lets you specify the kind of array to use when loading the data through the create_array callback. This callback should take three arguments: a pointer to the shape, the number of elements in the shape, and a pointer to the mts_array_t to be filled. metatensor.io.create_numpy_array() and metatensor.io.create_torch_array() can be used to load data into numpy and torch arrays, respectively.

  - Parameters:
    - path (str | Path) – path of the file to load
    - create_array (CreateArrayCallback) – callback used to create arrays as needed
  - Return type: TensorBlock
- metatensor.io.load_block_buffer_custom_array(buffer: bytes | bytearray | memoryview, create_array: CreateArrayCallback) → TensorBlock

  Load a previously saved TensorBlock from the given buffer using a custom array creation callback.

  This is an advanced functionality which most users should not need.

  This function lets you specify the kind of array to use when loading the data through the create_array callback. This callback should take three arguments: a pointer to the shape, the number of elements in the shape, and a pointer to the mts_array_t to be filled. metatensor.io.create_numpy_array() and metatensor.io.create_torch_array() can be used to load data into numpy and torch arrays, respectively.

  - Parameters:
    - buffer (bytes | bytearray | memoryview) – in-memory buffer containing a saved TensorBlock
    - create_array (CreateArrayCallback) – callback used to create arrays as needed
  - Return type: TensorBlock
- metatensor.io.create_numpy_array()

  Callback function that can be used with metatensor.io.load_custom_array() to load data in numpy arrays.
- metatensor.io.create_torch_array()

  Callback function that can be used with metatensor.io.load_custom_array() to load data in torch tensors. The resulting tensors are stored on CPU, and their dtype is torch.float64.
- metatensor.io.load_labels_buffer(buffer: bytes | bytearray | memoryview) → Labels

  Load previously saved Labels from an in-memory buffer.

  - Parameters:
    - buffer (bytes | bytearray | memoryview) – in-memory buffer containing saved Labels
  - Return type: Labels
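Finally, Labels round-trip through memory in the same way; the Labels constructor here is an assumption made for the example:

```python
import numpy as np
import metatensor.io
from metatensor import Labels

labels = Labels(["pair"], np.array([[0], [1], [2]], dtype=np.int32))

# Serialize to a memoryview and rebuild an equal Labels object from it
buffer = metatensor.io.save_buffer(labels)
restored = metatensor.io.load_labels_buffer(buffer)

assert restored == labels
```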