Model definition
For demonstration purposes, let us revisit an earlier example model: linear isotropic elasticity.
[Models]
  [model]
    type = LinearIsotropicElasticity
    strain = 'forces/E'
    stress = 'state/S'
    coefficient_types = 'BULK_MODULUS SHEAR_MODULUS'
    coefficients = '1.4e5 7.8e4'
  []
[]
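As a reminder, linear isotropic elasticity parametrized by the bulk modulus \(K\) and the shear modulus \(G\) maps strain to stress via \(\sigma = 3K \, \epsilon_\mathrm{vol} + 2G \, \epsilon_\mathrm{dev}\). A minimal NumPy sketch of this relation follows; note it uses a plain 3x3 matrix layout rather than NEML2's symmetric second-order tensor storage, and the sample strain components are chosen arbitrarily.

```python
import numpy as np

K, G = 1.4e5, 7.8e4  # coefficients from the input file above

def linear_isotropic_elasticity(strain):
    """Map a 3x3 strain tensor to stress: sigma = 3K*vol(eps) + 2G*dev(eps)."""
    vol = np.trace(strain) / 3.0 * np.eye(3)  # volumetric part
    dev = strain - vol                        # deviatoric part
    return 3.0 * K * vol + 2.0 * G * dev

# A sample strain state (symmetric, components chosen arbitrarily)
strain = np.array([[0.10, 0.03, 0.06],
                   [0.03, 0.05, 0.02],
                   [0.06, 0.02, -0.03]])
stress = linear_isotropic_elasticity(strain)
```

Because the volumetric and deviatoric responses decouple, \(K\) and \(G\) enter the stress independently, which is what makes them natural candidates for trainable parameters.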
Recall that both the bulk modulus and the shear modulus are treated as trainable parameters, which can be verified by inspecting the model summary:
import neml2

# Load the model from the input file; the file name "input.i" is assumed here
model = neml2.load_model("input.i", "model")
print(model)
Output: @list-output:ex1
Automatic differentiation
NEML2 tensors can be used interchangeably with PyTorch tensors in a seamless fashion. The function graph traced through NEML2 tensor operations can be back-propagated using PyTorch's autograd engine.
For example, the following Python script demonstrates the calculation of \(\pdv{l}{p}\), the derivative of an arbitrary scalar-valued loss function \(l\) with respect to the model parameters \(p\), where \(l\) is obtained from a combination of NEML2 and PyTorch operations.
import torch
import neml2
from neml2.tensors import SR2

torch.set_default_dtype(torch.double)

# Load the model; the file name "input.i" is assumed here
model = neml2.load_model("input.i", "model")

# Mark the bulk and shear moduli as trainable
model.K.requires_grad_()
model.G.requires_grad_()

# Evaluate the model at a prescribed strain
strain = SR2.fill(0.1, 0.05, -0.03, 0.02, 0.06, 0.03)
stress = model.value({"forces/E": strain})["state/S"]

# An arbitrary scalar-valued loss combining NEML2 and PyTorch operations
A = stress.torch() ** 2 + 3.5 - 1
B = stress.torch() * strain.torch()
l = torch.linalg.norm(A + B)

# Back-propagate to obtain dl/dK and dl/dG
l.backward()
print("dl/dK =", model.K.grad.item())
print("dl/dG =", model.G.grad.item())
Output: @list-output:ex2
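Since gradients with respect to model parameters come for free, they can drive gradient-based parameter calibration. The sketch below uses a plain PyTorch stand-in for the elasticity model (not the NEML2 API) and recovers assumed "true" moduli from a synthetic target stress by minimizing a loss with torch.optim.Adam; the learning rate and iteration count are illustrative choices.

```python
import torch

torch.set_default_dtype(torch.double)

def stress_of(K, G, strain):
    # sigma = 3K * vol(eps) + 2G * dev(eps), a stand-in for the NEML2 model
    vol = torch.trace(strain) / 3.0 * torch.eye(3)
    dev = strain - vol
    return 3.0 * K * vol + 2.0 * G * dev

# A sample strain state (symmetric, components chosen arbitrarily)
strain = torch.tensor([[0.10, 0.03, 0.06],
                       [0.03, 0.05, 0.02],
                       [0.06, 0.02, -0.03]])

# Synthetic target generated with assumed "true" moduli
target = stress_of(torch.tensor(1.4e5), torch.tensor(7.8e4), strain)

# Trainable parameters, deliberately initialized away from the truth
K = torch.tensor(1.0e5, requires_grad=True)
G = torch.tensor(5.0e4, requires_grad=True)

opt = torch.optim.Adam([K, G], lr=1e3)
loss0 = None
for _ in range(2000):
    opt.zero_grad()
    loss = torch.linalg.norm(stress_of(K, G, strain) - target)
    if loss0 is None:
        loss0 = loss.item()
    loss.backward()  # populates K.grad and G.grad via autograd
    opt.step()
```

In a real calibration, stress_of would be replaced by the NEML2 model evaluation shown above, with model.K and model.G handed to the optimizer.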