# Changelog
All notable changes to `metatensor-learn` are documented here, following the Keep a Changelog format. This project adheres to Semantic Versioning.
## Unreleased

### Added
- A custom class `metatensor.learn.nn.Module` that should be used instead of `torch.nn.Module` when the module contains metatensor data (`Labels`, `TensorBlock`, `TensorMap`) as attributes. This class properly handles moving this data to the correct dtype and device when calling `module.to()` and related functions. It also handles putting this data in the module's `state_dict()` and loading it back with `load_state_dict()`.
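The behavior this class fixes can be seen with a plain `torch.nn.Module`, whose `.to()` only converts registered parameters and buffers. A minimal sketch, using a plain tensor attribute as a stand-in for metatensor data (the `Plain` class and `extra` attribute are illustrative, not part of the library):

```python
import torch

class Plain(torch.nn.Module):
    """A module holding non-parameter data as a plain attribute."""

    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(2, 2)
        # plain attribute: NOT a parameter or buffer, so .to() ignores it
        self.extra = torch.zeros(2)

model = Plain().to(torch.float64)
print(model.linear.weight.dtype)  # torch.float64: parameters are converted
print(model.extra.dtype)          # torch.float32: plain attributes are left behind
```

The same issue applies to metatensor data stored as attributes, which is why the dedicated base class is needed.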
## Version 0.3.2 - 2025-04-25

- Make the code compatible with metatensor-torch v0.7.6
## Version 0.3.1 - 2025-02-03

### Fixed

- Indexing inside a `Dataset` is now O(1) instead of O(N) (#790)
- Fixed a bug with the default `invariant_keys` in `metatensor.learn.nn` modules (#785)
## Version 0.3.0 - 2024-10-30

### Added

- Added `metatensor.learn.nn.EquivariantTransformation` to apply any `torch.nn.Module` to invariants computed from the norm over components of covariant blocks. The transformed invariants are then multiplied elementwise back into the covariant blocks. For invariant blocks, the `torch.nn.Module` is applied as-is (#744)
### Changed

- `metatensor.learn.nn` modules `InvariantTanh`, `InvariantSiLU`, `InvariantReLU`, `InvariantLayerNorm`, and `EquivariantLinear` have had the `invariant_key_idxs` parameter removed and replaced by `invariant_keys`, a `Labels` object that selects the invariant blocks.
- `metatensor.learn.nn` modules `LayerNorm`, `InvariantLayerNorm`, `Linear`, and `EquivariantLinear` have changed the accepted types for certain parameters. The parameters `eps`, `elementwise_affine`, `bias`, and `mean` for the layer norm modules, and `bias` for the linear modules, are affected: previously these could be passed as a list, but now they can only be passed as a single value. For greater control over the modules applied to individual blocks, users are encouraged to use the `ModuleMap` module from `metatensor.learn.nn`.
## Version 0.2.3 - 2024-08-28

### Changed

- We now require Python >= 3.9
- `Dataset` and `DataLoader` can now handle fields with a name which is not a valid Python identifier.
## Version 0.2.2 - 2024-05-16

### Added

- Added torch-style activation function module maps to `metatensor.learn.nn`: `ReLU`, `InvariantReLU`, `SiLU`, and `InvariantSiLU` (#597)
- Added torch-style neural network module maps to `metatensor.learn.nn`: `LayerNorm`, `InvariantLayerNorm`, `EquivariantLinear`, `Sequential`, `Tanh`, and `InvariantTanh` (#513)
### Fixed

- `metatensor.learn.nn` modules `LayerNorm` and `InvariantLayerNorm` now apply sample-independent transformations to input tensors.
- Set the correct device for the output when the torch default device is different from the input device (#595)
## Version 0.2.1 - 2024-03-01

### Changed

- `metatensor-learn` is no longer re-exported from `metatensor` and `metatensor.torch`; all functions are still available inside `metatensor.learn` and `metatensor.torch.learn`.
### Fixed

- Make sure the `Dataset` class is iterable (#500)
## Version 0.2.0 - 2024-02-07

### Changed

- Pluralization removed for the special kwarg `sample_ids` of `IndexedDataset` -> `sample_id`, and the provided collate functions `group` and `group_and_join` updated accordingly.
### Fixed

- Removed the use of `Labels.range` in `nn` modules to support `torch.jit.save` (#410)
## Version 0.1.0 - 2024-01-26

### Added

- `ModuleMap` and `Linear` modules, following `torch.nn.ModuleDict` and `torch.nn.Linear` in PyTorch but adapted for `TensorMap`s (#427)
- `Dataset` and `DataLoader` facilities, following the corresponding classes in PyTorch (#428)