Serialization
- metatensor.load(file: str | Path | BinaryIO, use_numpy=False) → TensorMap

  Load a previously saved TensorMap from the given file.

  TensorMap objects are serialized using numpy's .npz format, i.e. a ZIP file without compression (storage method is STORED), where each entry is stored as a .npy array. See the C API documentation for more information on the format.

  - Parameters:
    - file (str | Path | BinaryIO) – file to load: this can be a string, a pathlib.Path containing the path to the file to load, or a file-like object opened in binary mode.
    - use_numpy – should we use numpy or the native implementation? Numpy can process more dtypes than the native implementation, which is limited to float64, but the native implementation is usually faster than going through numpy.
  - Return type: TensorMap
- metatensor.save(file: str | Path | BinaryIO, data: TensorMap | Labels, use_numpy=False)

  Save the given data (either TensorMap or Labels) to the given file.

  TensorMap objects are serialized using numpy's .npz format, i.e. a ZIP file without compression (storage method is STORED), where each entry is stored as a .npy array. See the C API documentation for more information on the format.

  - Parameters:
    - file (str | Path | BinaryIO) – where to save the data. This can be a string, a pathlib.Path containing the path to the file where the data will be saved, or a file-like object opened in binary mode.
    - use_numpy – should we use numpy or the native serializer implementation? Numpy can process more dtypes than the native implementation, which is limited to float64, but the native implementation is usually faster than going through numpy. This is ignored when saving Labels.
- metatensor.load_labels(file: str | Path | BinaryIO) → Labels

  Load previously saved Labels from the given file.

  - Parameters:
    - file (str | Path | BinaryIO) – file to load: this can be a string, a pathlib.Path containing the path to the file to load, or a file-like object opened in binary mode.
  - Return type: Labels
- metatensor.io.load_buffer(buffer: bytes | bytearray | memoryview, use_numpy=False) → TensorMap

  Load a previously saved TensorMap from an in-memory buffer.

  - Parameters:
    - buffer (bytes | bytearray | memoryview) – in-memory buffer containing a serialized TensorMap
    - use_numpy – should we use numpy or the native implementation?
  - Return type: TensorMap
- metatensor.io.load_labels_buffer(buffer: bytes | bytearray | memoryview) → Labels

  Load previously saved Labels from an in-memory buffer.

  - Parameters:
    - buffer (bytes | bytearray | memoryview) – in-memory buffer containing saved Labels
  - Return type: Labels
- metatensor.io.save_buffer(data: TensorMap | Labels, use_numpy=False) → memoryview

  Save the given data (either TensorMap or Labels) to an in-memory buffer.

  - Parameters:
    - data (TensorMap | Labels) – data to serialize
    - use_numpy – should we use numpy or the native serializer implementation? This is ignored when saving Labels.
  - Return type: memoryview
- metatensor.io.load_custom_array(path: str | Path, create_array: Callable[[LP_c_ulong, c_ulong, LP_mts_array_t], None]) → TensorMap

  Load a previously saved TensorMap from the given path, using a custom array creation callback.

  This is an advanced functionality which should not be needed by most users.

  This function allows you to specify the kind of array to use when loading the data, through the create_array callback. This callback should take three arguments: a pointer to the shape, the number of elements in the shape, and a pointer to the mts_array_t to be filled. metatensor.io.create_numpy_array() and metatensor.io.create_torch_array() can be used to load data into numpy arrays and torch tensors respectively.
- metatensor.io.load_buffer_custom_array(buffer: bytes | bytearray | memoryview, create_array: Callable[[LP_c_ulong, c_ulong, LP_mts_array_t], None]) → TensorMap

  Load a previously saved TensorMap from the given buffer, using a custom array creation callback.

  This is an advanced functionality which should not be needed by most users.

  This function allows you to specify the kind of array to use when loading the data, through the create_array callback. This callback should take three arguments: a pointer to the shape, the number of elements in the shape, and a pointer to the mts_array_t to be filled. metatensor.io.create_numpy_array() and metatensor.io.create_torch_array() can be used to load data into numpy arrays and torch tensors respectively.
- metatensor.io.create_numpy_array()

  Callback function that can be used with metatensor.io.load_custom_array() to load data into numpy arrays.
- metatensor.io.create_torch_array()

  Callback function that can be used with metatensor.io.load_custom_array() to load data into torch tensors. The resulting tensors are stored on CPU, and their dtype is torch.float64.