Code organization

The code is organized in multiple modules, each in a separate directory:

  • metatensor-core/ contains the core library, implemented in Rust and exposed to the outside world through a C API. This is also where the C++ API lives, implemented as a header-only library in metatensor.hpp.

  • metatensor-torch/ contains a TorchScript extension, written in C++, using the C++ API of metatensor; as well as the corresponding tests and examples.

  • metatensor/ contains the Rust interface to metatensor, using the C API defined in metatensor-core, as well as the corresponding tests and examples.

  • python/metatensor-core/ contains the Python interface to the core metatensor types, and the corresponding tests.

  • python/metatensor-operations/ contains a set of pure Python functions to manipulate data in the metatensor format, and the corresponding tests.

  • python/metatensor-learn/ contains pure Python helpers to define machine learning models, with an API inspired by scikit-learn and PyTorch.

  • python/metatensor-torch/ contains the Python interface for the TorchScript version of metatensor, and the corresponding tests.

  • python/metatensor/ contains a small Python package re-exporting everything from metatensor-core and metatensor-operations. This is the main package users should interact with.

  • julia/ contains the Julia bindings to metatensor, which are currently at a very early alpha stage.

Finally, docs/ contains the documentation for everything related to metatensor.


Logical organization of the modules in the metatensor repository. Blue boxes are packages that users should interact with, and purple boxes are artifacts used by the different languages. Arrows indicate dependencies and data flow from one module to another.


metatensor-core

This sub-project is built with cmake. Building it produces a shared library (libmetatensor.so on Linux, libmetatensor.dylib on macOS, metatensor.dll on Windows) and the metatensor.h header, for consumption by other modules and by end users.

metatensor.h is automatically generated by cbindgen when building the code. Every function marked #[no_mangle] pub extern fn in the Rust code is automatically translated into the corresponding C function declaration.

The C++ API in metatensor.hpp is manually written as a header-only library, exposing the functions from metatensor.h with a cleaner C++11 interface.


metatensor-torch

This sub-project is a typical C++ project, built with cmake. It contains the TorchScript version of all the metatensor core types, built using the C++ API. It depends on both the C++ API of metatensor-core and on libtorch (the C++ part of PyTorch). All the code in this sub-project is manually written.

metatensor Rust crate

This crate is built by cargo, like a normal Rust project, and re-exports a native Rust interface built on top of the C API. It is kept separate from metatensor-core so that, in complex projects where multiple crates depend on metatensor, there can be a single authoritative version of the metatensor C API.

When publishing, metatensor-core is included as a tarball of the metatensor-core/ folder (using the .crate file created by cargo package). This file should be generated before publishing using ./scripts/ The Rust declarations corresponding to the C API (in ./metatensor/src/) are also automatically generated using bindgen, and should be updated with ./scripts/

Python packages

The Python API for metatensor is split into different distributions, which when installed correspond to different sub-modules of metatensor:

  • the metatensor distribution does not install any module of its own, but depends on metatensor-core and metatensor-operations. It exists mainly for user convenience, providing a single install command;

  • the metatensor-core distribution contains the metatensor module;

  • the metatensor-operations distribution contains the metatensor.operations module, which depends on metatensor-core;

  • the metatensor-learn distribution contains the metatensor.learn module, which depends on metatensor-operations;

  • the metatensor-torch distribution contains the metatensor.torch module. This module re-exports metatensor-operations and metatensor-learn as metatensor.torch.operations and metatensor.torch.learn respectively.

All the Python sub-projects are built with setuptools, and are fully compatible with pip and other standard Python tools.


metatensor-core Python package

This Python package re-exports a native Python interface built on top of the C API. The C API is accessed using the standard Python ctypes module. The function declarations in python/metatensor-core/metatensor/core/ are generated from the metatensor.h header when running ./scripts/
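The ctypes pattern described above can be sketched as follows. This is an illustration, not the actual metatensor bindings: libc's strlen stands in for a function from metatensor.h, since libmetatensor may not be installed.

```python
import ctypes
import ctypes.util

# Load a shared library; metatensor-core loads libmetatensor the same way.
# Here we load the C standard library as a stand-in (Unix only).
libc = ctypes.CDLL(ctypes.util.find_library("c") or None)

# Declare the C signature for each function, as the generated declarations
# do for every function in metatensor.h: size_t strlen(const char*)
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"metatensor"))
```

Declaring argtypes and restype up front lets ctypes check argument types and convert return values, instead of silently defaulting everything to int.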


metatensor-operations

This Python package contains the code for the operations acting on TensorMap, and provides building blocks for machine learning models on top of the metatensor data structures.

By default, the operations use the types from metatensor-core, and can act on either numpy or torch data. Dispatch code in this package calls the right function depending on the type of the arrays stored by metatensor.
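A minimal sketch of this dispatch pattern, with an illustrative function name (not the actual metatensor dispatch code):

```python
import numpy as np

def _dispatch_concatenate(arrays, axis):
    # Hypothetical dispatch helper: pick the right backend depending on
    # the type of the arrays, so the same operation works on both numpy
    # arrays and torch tensors.
    if isinstance(arrays[0], np.ndarray):
        return np.concatenate(arrays, axis=axis)
    else:
        # torch tensors are handled through the torch API instead
        import torch
        return torch.cat(arrays, dim=axis)

result = _dispatch_concatenate([np.array([1, 2]), np.array([3, 4])], axis=0)
```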

At the same time, this code is also used from metatensor-torch, using the metatensor types exposed in that module and operating only on torch data. This is achieved by re-importing the code from metatensor-operations in a new module, metatensor.torch.operations. See the comments in python/metatensor-torch/metatensor/torch/ for more information.
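The general technique of exposing an existing module under a new dotted name can be sketched with the standard import machinery. This is a hedged illustration, similar in spirit to metatensor.torch.operations: the stdlib json module and the "mypkg" name are purely illustrative.

```python
import importlib
import sys
import types

# Create a parent package and register it, so "mypkg.json" can resolve.
parent = types.ModuleType("mypkg")  # "mypkg" is an illustrative name
sys.modules["mypkg"] = parent

# Re-import existing code (stdlib json, standing in for the re-exported
# operations) and register it under the new dotted name.
reexported = importlib.import_module("json")
sys.modules["mypkg.json"] = reexported
parent.json = reexported  # make attribute access on the parent work too

import mypkg.json  # now resolves to the re-registered module
print(mypkg.json.loads("[1, 2, 3]"))
```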


metatensor-learn

This Python package contains machine learning helper tools and other facilities to define new models based on the metatensor data format.

Similarly to metatensor-operations, it uses types and functionality from metatensor-core by default, and it is re-exported in metatensor-torch using the types and functions from there instead.


metatensor-torch Python package

This Python package exposes to Python the types defined in the C++ metatensor-torch sub-project. It should be used to define models that are then exported with TorchScript and run without a Python interpreter.

As mentioned above, this package also re-exports the code from metatensor-operations and metatensor-learn in a way compatible with TorchScript.

Finally this package also contains facilities to define atomistic machine learning models. Refer to the corresponding documentation for more information.


metatensor Python package

This is a small wrapper package for user convenience, re-exporting all types from metatensor-core and all functions from metatensor-operations.
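A wrapper package of this kind essentially does `from <package> import *` for each of its dependencies. The sketch below builds such a wrapper dynamically, with stdlib math and statistics standing in for metatensor-core and metatensor-operations; the "wrapper" name is illustrative.

```python
import sys
import types
import math
import statistics

# Build a module that re-exports the public names of two source modules,
# the way the metatensor package re-exports its component distributions.
wrapper = types.ModuleType("wrapper")  # "wrapper" is an illustrative name
for source in (math, statistics):
    # respect __all__ when present, otherwise take all non-underscore names
    public = getattr(source, "__all__", None) or [
        name for name in dir(source) if not name.startswith("_")
    ]
    for name in public:
        setattr(wrapper, name, getattr(source, name))
sys.modules["wrapper"] = wrapper

import wrapper  # resolves to the module registered above
print(wrapper.sqrt(4.0), wrapper.mean([1, 2, 3]))
```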