torchani.models

Provides access to all published ANI models.

Provided models are subclasses of torchani.arch.ANI. Some models have been published in previous articles, and some in TorchANI 2. If you use any of these models in your work, please cite the corresponding article(s).

If for a given model you discover a bug, a performance problem, or incorrect behavior in some region of chemical space, please open an issue on GitHub. The TorchANI developers will attempt to address and document such issues.

Note that the parameters of the ANI models are automatically downloaded and cached the first time a model is instantiated. If this is an issue for your application, we recommend pre-downloading the parameters by instantiating the models once before use.
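
For example, instantiating a model once (e.g. in a setup or warm-up step) is enough to populate the cache:

import torchani

torchani.models.ANI2x()  # downloads and caches the ANI-2x parameters on first use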

The models can be used directly once they are instantiated. Alternatively, they can be converted to an ASE calculator by calling ANI.ase.
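
For example, a minimal sketch of the ASE route (the water geometry below is only illustrative):

import ase
import torchani

model = torchani.models.ANI2x()
calc = model.ase()  # wrap the model as an ASE calculator

atoms = ase.Atoms("OH2", positions=[[0.00, 0.00, 0.00],   # approximate geometry, Angstrom
                                    [0.00, 0.00, 0.96],
                                    [0.93, 0.00, -0.24]])
atoms.calc = calc
energy = atoms.get_potential_energy()  # ASE reports energies in eV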

Some models have an internal set of neural networks (torchani.nn.Ensemble) and output the averaged values of its members. Individual members of these ensembles can be accessed by indexing, and len() returns the number of networks in the ensemble.
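
For instance, with an ensemble model such as ANI-2x:

import torchani

model = torchani.models.ANI2x()
print(len(model))    # number of networks in the ensemble (8 for ANI-2x)
submodel = model[0]  # first member, itself a usable ANI model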

The models also have extra entry points for more specific use cases, among them atomic_energies and energies_qbcs.

All entry points expect a tuple of tensors (species, coords) as input, together with two optional tensors, cell and pbc. coords and cell should be in units of Angstroms, and the output energies are always in Hartrees.

For more details, consult the examples documentation.

import torch
import torchani

model = torchani.models.ANI2x()

# Batch of molecules
# shape is (molecules, atoms) for atomic_nums and (molecules, atoms, 3) for coords
atomic_nums = torch.tensor([[8, 1, 1]])
coords = torch.tensor([[[...], [...], [...]]])

# Average energies over the ensemble, for all molecules
# Output shape is (molecules,)
energies = model((atomic_nums, coords)).energies

# Average atomic energies over the ensemble for the batch
# Output shape is (molecules, atoms)
atomic_energies = model.atomic_energies((atomic_nums, coords)).energies

# Individual energies of the members of the ensemble
# Output shape is (ensemble-size, molecules)
energies = model((atomic_nums, coords), ensemble_values=True).energies

# QBC factors are used for active learning, shape is (molecules,)
result = model.energies_qbcs((atomic_nums, coords))
energies = result.energies
qbcs = result.qbcs

# Individual submodels of the ensemble can be obtained by indexing; they are also
# subclasses of ``ANI``, with the same functionality
submodel = model[0]
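
The optional cell and pbc tensors mentioned above are passed as keyword arguments. A minimal sketch, continuing the example above with an illustrative 10 Angstrom cubic box:

# cell is a (3, 3) tensor of lattice vectors in Angstrom, pbc a (3,) boolean tensor
cell = 10.0 * torch.eye(3)
pbc = torch.tensor([True, True, True])
energies = model((atomic_nums, coords), cell=cell, pbc=pbc).energies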

Functions

ANI1ccx: The ANI-1ccx model as in ani-1ccx_8x on GitHub and Transfer Learning Paper.
ANI1x: The ANI-1x model as in ani-1x_8x on GitHub and Active Learning Paper.
ANI2dr: Improved ANI model trained on the 2x dataset.
ANI2x: The ANI-2x model as in ANI2x Paper and ANI2x Results on GitHub.
ANI2xr: Improved ANI model trained on the 2x dataset.
ANImbis: Experimental ANI-2x model with MBIS charges.
ANIr2s
ANIr2s_ch3cn
ANIr2s_chcl3
ANIr2s_water
SnnANI2xr: Improved ANI model trained on the 2x dataset.

torchani.models.ANI1x(model_index=None, neighborlist='all_pairs', strategy='pyaev', periodic_table_index=True, device=None, dtype=None)

The ANI-1x model as in ani-1x_8x on GitHub and Active Learning Paper.

The ANI-1x model is an ensemble of 8 networks that was trained using active learning on the ANI-1x dataset; the target level of theory is wB97X/6-31G(d). It predicts energies for HCNO elements exclusively and should not be used with other atom types.
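
The model_index argument shown in the signature can be used to load a single member of the 8-network ensemble instead of the full ensemble; for example:

import torchani

ensemble = torchani.models.ANI1x()             # all 8 networks, predictions are averaged
single = torchani.models.ANI1x(model_index=0)  # only the first network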

torchani.models.ANI1ccx(model_index=None, neighborlist='all_pairs', strategy='pyaev', periodic_table_index=True, device=None, dtype=None)

The ANI-1ccx model as in ani-1ccx_8x on GitHub and Transfer Learning Paper.

The ANI-1ccx model is an ensemble of 8 networks that was trained on the ANI-1ccx dataset using transfer learning. The target accuracy is CCSD(T)*/CBS (CCSD(T) using the DLPNO-CCSD(T) method). It predicts energies for HCNO elements exclusively and should not be used with other atom types.

torchani.models.ANI2x(model_index=None, neighborlist='all_pairs', strategy='pyaev', periodic_table_index=True, device=None, dtype=None)

The ANI-2x model as in ANI2x Paper and ANI2x Results on GitHub.

The ANI-2x model is an ensemble of 8 networks that was trained on the ANI-2x dataset. The target level of theory is wB97X/6-31G(d). It predicts energies for HCNOFSCl elements exclusively and should not be used with other atom types.
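
Since ANI-2x also covers F, S and Cl, molecules such as hydrogen fluoride are in scope; a small sketch (the bond length below is only approximate):

import torch
import torchani

model = torchani.models.ANI2x()
species = torch.tensor([[9, 1]])                              # F, H as atomic numbers
coords = torch.tensor([[[0.0, 0.0, 0.0], [0.0, 0.0, 0.92]]])  # Angstrom
energy = model((species, coords)).energies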

torchani.models.ANImbis(model_index=None, neighborlist='all_pairs', strategy='pyaev', periodic_table_index=True, device=None, dtype=None)

Experimental ANI-2x model with MBIS charges.

torchani.models.ANI2xr(model_index=None, neighborlist='all_pairs', strategy='pyaev', periodic_table_index=True, device=None, dtype=None)

Improved ANI model trained on the 2x dataset.

Trained to the wB97X level of theory with an added repulsion potential and a smoother PES.

torchani.models.ANI2dr(model_index=None, neighborlist='all_pairs', strategy='pyaev', periodic_table_index=True, device=None, dtype=None)

Improved ANI model trained on the 2x dataset.

Trained to the B97-3c level of theory with added repulsion and dispersion potentials and a smoother PES.

torchani.models.SnnANI2xr(model_index=None, neighborlist='all_pairs', strategy='pyaev', periodic_table_index=True, device=None, dtype=None)

Improved ANI model trained on the 2x dataset.

Trained to the wB97X level of theory with an added repulsion potential and a smoother PES.