Serialization#

metatensor.load(file: str | Path | BinaryIO, use_numpy=False) TensorMap[source]#

Load a previously saved TensorMap from the given file.

TensorMap objects are serialized using numpy’s .npz format, i.e. a ZIP archive without compression (storage method STORED), in which each entry is a .npy array. See the C API documentation for more information on the format.

Parameters:
  • file (str | Path | BinaryIO) – file to load: this can be a string or a pathlib.Path containing the path to the file, or a file-like object opened in binary mode.

  • use_numpy – should we use numpy or the native implementation? Numpy should be able to process more dtypes than the native implementation, which is limited to float64, but the native implementation is usually faster than going through numpy.

Return type:

TensorMap

metatensor.save(file: str | Path | BinaryIO, data: TensorMap | Labels, use_numpy=False)[source]#

Save the given data (either TensorMap or Labels) to the given file.

TensorMap objects are serialized using numpy’s .npz format, i.e. a ZIP archive without compression (storage method STORED), in which each entry is a .npy array. See the C API documentation for more information on the format.

Parameters:
  • file (str | Path | BinaryIO) – where to save the data. This can be a string or a pathlib.Path containing the path to the file to write, or a file-like object opened in binary mode.

  • data (TensorMap | Labels) – data to serialize and save

  • use_numpy – should we use numpy or the native serializer implementation? Numpy should be able to process more dtypes than the native implementation, which is limited to float64, but the native implementation is usually faster than going through numpy. This is ignored when saving Labels.

metatensor.load_labels(file: str | Path | BinaryIO) Labels[source]#

Load previously saved Labels from the given file.

Parameters:

file (str | Path | BinaryIO) – file to load: this can be a string, a pathlib.Path containing the path to the file to load, or a file-like object opened in binary mode.

Return type:

Labels


metatensor.io.load_buffer(buffer: bytes | bytearray | memoryview, use_numpy=False) TensorMap[source]#

Load a previously saved TensorMap from an in-memory buffer.

Parameters:
  • buffer (bytes | bytearray | memoryview) – In-memory buffer containing a serialized TensorMap

  • use_numpy – should we use numpy or the native implementation?

Return type:

TensorMap

metatensor.io.load_labels_buffer(buffer: bytes | bytearray | memoryview) Labels[source]#

Load previously saved Labels from an in-memory buffer.

Parameters:

buffer (bytes | bytearray | memoryview) – in-memory buffer containing saved Labels

Return type:

Labels

metatensor.io.save_buffer(data: TensorMap | Labels, use_numpy=False) memoryview[source]#

Save the given data (either TensorMap or Labels) to an in-memory buffer.

Parameters:
  • data (TensorMap | Labels) – data to serialize and save

  • use_numpy – should we use numpy or the native serializer implementation?

Return type:

memoryview

metatensor.io.load_custom_array(path: str | Path, create_array: Callable[[LP_c_ulong, c_ulong, LP_mts_array_t], None]) TensorMap[source]#

Load a previously saved TensorMap from the given path using a custom array creation callback.

This is an advanced functionality, which should not be needed by most users.

This function allows specifying the kind of array to use when loading the data, through the create_array callback. This callback takes three arguments: a pointer to the shape, the number of elements in the shape, and a pointer to the mts_array_t to be filled.

metatensor.io.create_numpy_array() and metatensor.io.create_torch_array() can be used to load data into numpy arrays and torch tensors, respectively.

Parameters:
  • path (str | Path) – path of the file to load

  • create_array (Callable[[LP_c_ulong, c_ulong, LP_mts_array_t], None]) – callback used to create arrays as needed

Return type:

TensorMap

metatensor.io.load_buffer_custom_array(buffer: bytes | bytearray | memoryview, create_array: Callable[[LP_c_ulong, c_ulong, LP_mts_array_t], None]) TensorMap[source]#

Load a previously saved TensorMap from the given buffer using a custom array creation callback.

This is an advanced functionality, which should not be needed by most users.

This function allows specifying the kind of array to use when loading the data, through the create_array callback. This callback takes three arguments: a pointer to the shape, the number of elements in the shape, and a pointer to the mts_array_t to be filled.

metatensor.io.create_numpy_array() and metatensor.io.create_torch_array() can be used to load data into numpy arrays and torch tensors, respectively.

Parameters:
  • buffer (bytes | bytearray | memoryview) – in-memory buffer containing a saved TensorMap

  • create_array (Callable[[LP_c_ulong, c_ulong, LP_mts_array_t], None]) – callback used to create arrays as needed

Return type:

TensorMap

metatensor.io.create_numpy_array()[source]#

Callback function that can be used with metatensor.io.load_custom_array() to load data in numpy arrays.

metatensor.io.create_torch_array()[source]#

Callback function that can be used with metatensor.io.load_custom_array() to load data in torch tensors. The resulting tensors are stored on CPU, and their dtype is torch.float64.