Refactor property fitting interface #4471

Merged: 97 commits, Dec 25, 2024

The diff below shows changes from 10 of the 97 commits.

Commits
4a69e22
change property.npy to any name
Chengqian-Zhang Dec 8, 2024
157f70c
Init branch
Chengqian-Zhang Dec 13, 2024
6172dc4
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 13, 2024
0608e17
Merge branch 'devel' into refactor_property
Chengqian-Zhang Dec 13, 2024
4c2033f
change | to Union
Chengqian-Zhang Dec 13, 2024
ac5fa5c
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 13, 2024
69803d6
change sub_var_name default to []
Chengqian-Zhang Dec 13, 2024
1fcf82c
Solve pre-commit
Chengqian-Zhang Dec 13, 2024
5be3457
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 13, 2024
9637b6a
solve GitHub code scanning
Chengqian-Zhang Dec 13, 2024
5ce6d31
fix UT
Chengqian-Zhang Dec 13, 2024
5a3bf94
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 13, 2024
13e1911
delete useless file
Chengqian-Zhang Dec 13, 2024
ff4650e
Solve some UT
Chengqian-Zhang Dec 13, 2024
9637390
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 13, 2024
4edf26f
Solve precommit
Chengqian-Zhang Dec 13, 2024
4d35df2
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 13, 2024
7f09038
solve pre-commit
Chengqian-Zhang Dec 13, 2024
32e6deb
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 13, 2024
8fec403
Solve dptest UT, dpatomicmodel UT, code scanning
Chengqian-Zhang Dec 14, 2024
ba54bcc
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 14, 2024
b52065c
delete param and
Chengqian-Zhang Dec 15, 2024
c0f09e6
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 15, 2024
2566b2e
Solve UT fail caused by task_dim and property_name
Chengqian-Zhang Dec 15, 2024
3af4970
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 15, 2024
b1e834a
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 15, 2024
e33d6e6
Fix UT
Chengqian-Zhang Dec 15, 2024
5e1e892
Solve conflict
Chengqian-Zhang Dec 15, 2024
cdd4e18
Fix UT
Chengqian-Zhang Dec 15, 2024
86f6744
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 15, 2024
d75ea7a
Fix UT
Chengqian-Zhang Dec 15, 2024
0dbd2b4
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 15, 2024
1cc5c37
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 15, 2024
44fa4d6
Fix permutation error
Chengqian-Zhang Dec 15, 2024
3249891
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 15, 2024
8e9bbc5
Add property bias UT
Chengqian-Zhang Dec 15, 2024
1b8d92f
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 15, 2024
c83d91e
recover rcond doc
Chengqian-Zhang Dec 15, 2024
63cec86
recover blank
Chengqian-Zhang Dec 15, 2024
99df857
Change code according to coderabbitai
Chengqian-Zhang Dec 15, 2024
4155294
solve pre-commit
Chengqian-Zhang Dec 15, 2024
a1a6583
Fix UT
Chengqian-Zhang Dec 15, 2024
2859f02
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 15, 2024
5d3c50b
change apply_bias doc
Chengqian-Zhang Dec 15, 2024
1094032
update the version compatibility
Chengqian-Zhang Dec 16, 2024
c9b7ab4
Merge branch 'devel' into refactor_property
Chengqian-Zhang Dec 18, 2024
15eb6d0
delete sub_var_name
Chengqian-Zhang Dec 18, 2024
dbf394c
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 18, 2024
fd42d53
recover to property key
Chengqian-Zhang Dec 18, 2024
e48eb8b
Solve conflict
Chengqian-Zhang Dec 18, 2024
5036545
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 18, 2024
a125d38
Fix conflict
Chengqian-Zhang Dec 18, 2024
bff3378
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 18, 2024
85f9166
Merge branch 'devel' into refactor_property
Chengqian-Zhang Dec 18, 2024
d384f62
Fix UT
Chengqian-Zhang Dec 20, 2024
d43da75
Add document of property fitting
Chengqian-Zhang Dec 20, 2024
f4aeeaf
Delete checkpoint
Chengqian-Zhang Dec 20, 2024
4613425
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 20, 2024
f07f1b2
Add get_property_name to DeepEvalBackend
Chengqian-Zhang Dec 20, 2024
120d9d8
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 20, 2024
2dc6486
Merge branch 'devel' into refactor_property
Chengqian-Zhang Dec 22, 2024
1d2d866
change doc to py
Chengqian-Zhang Dec 22, 2024
a50d363
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 22, 2024
2e18007
Add out_bias out_std doc
Chengqian-Zhang Dec 22, 2024
4ff2a23
Merge branch 'devel' into refactor_property
Chengqian-Zhang Dec 23, 2024
eff7eb1
change bias method to compute_stats_do_not_distinguish_types
Chengqian-Zhang Dec 23, 2024
c9e47a3
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 23, 2024
28ef352
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 23, 2024
e74a6ac
change var_name to property_name
Chengqian-Zhang Dec 23, 2024
6615837
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 23, 2024
a88a655
change logic of extensive bias
Chengqian-Zhang Dec 23, 2024
d10a4ba
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 23, 2024
9219ee1
add doc for newly added parameter
Chengqian-Zhang Dec 23, 2024
35bb6a7
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 23, 2024
ec3369d
change doc for compute_stats_do_not_distinguish_types
Chengqian-Zhang Dec 23, 2024
ed59869
try to fix dptest
Chengqian-Zhang Dec 23, 2024
15846fd
change all property to property_name
Chengqian-Zhang Dec 23, 2024
34cc7ce
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 23, 2024
a1d2608
Fix UT
Chengqian-Zhang Dec 23, 2024
c70a887
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 23, 2024
c3f82aa
Delete key 'property' completely
Chengqian-Zhang Dec 23, 2024
a392718
Solve conflict
Chengqian-Zhang Dec 23, 2024
9bf75cb
Fix UT
Chengqian-Zhang Dec 23, 2024
6e428a7
Fix dptest UT
Chengqian-Zhang Dec 23, 2024
14a9a89
Merge branch 'devel' into refactor_property
Chengqian-Zhang Dec 24, 2024
f963383
Merge branch 'devel' into refactor_property
Chengqian-Zhang Dec 24, 2024
9e09892
Delete attribute
Chengqian-Zhang Dec 24, 2024
d097bfa
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 24, 2024
f7cf9e1
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 24, 2024
f9d1b55
Solve comment
Chengqian-Zhang Dec 24, 2024
9ae4dfd
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 24, 2024
ce24784
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 24, 2024
211ec00
Solve error
Chengqian-Zhang Dec 24, 2024
42cfe99
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 24, 2024
b473b6a
delete property_name in serialize
Chengqian-Zhang Dec 24, 2024
af8789f
Update train-fitting-property.md
Chengqian-Zhang Dec 25, 2024
6638ba6
Update train-fitting-property.md
Chengqian-Zhang Dec 25, 2024
2 changes: 2 additions & 0 deletions deepmd/dpmodel/output_def.py
@@ -197,6 +197,7 @@ def __init__(
r_hessian: bool = False,
magnetic: bool = False,
intensive: bool = False,
sub_var_name: list[str] = [],
) -> None:
self.name = name
self.shape = list(shape)
@@ -220,6 +221,7 @@ def __init__(
self.r_hessian = r_hessian
self.magnetic = magnetic
self.intensive = intensive
self.sub_var_name = sub_var_name
if self.r_hessian:
if not self.reducible:
raise ValueError("only reducible variable can calculate hessian")
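For orientation, a minimal sketch (not from this PR) of how the new `sub_var_name` field might be filled in for a multi-property output; the property names are hypothetical, and the remaining arguments follow the signature visible in this hunk:

```python
# Hypothetical sketch: an output variable whose flat vector is the
# concatenation of two named properties (names are illustrative).
from deepmd.dpmodel.output_def import OutputVariableDef

prop = OutputVariableDef(
    name="property",
    shape=[2],                          # total width of the flat property vector
    reducible=True,                     # per-atom values reduce to a frame value
    r_differentiable=False,
    c_differentiable=False,
    intensive=True,
    sub_var_name=["band_gap", "humo"],  # names the slices of the flat vector
)
```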
65 changes: 48 additions & 17 deletions deepmd/pt/loss/property.py
@@ -1,5 +1,8 @@
# SPDX-License-Identifier: LGPL-3.0-or-later
import logging
from typing import (
Union,
)

import torch
import torch.nn.functional as F
@@ -24,6 +27,8 @@ def __init__(
loss_func: str = "smooth_mae",
metric: list = ["mae"],
beta: float = 1.00,
property_name: Union[str, list] = "property",
property_dim: Union[int, list] = 1,
**kwargs,
) -> None:
r"""Construct a layer to compute loss on property.
@@ -44,6 +49,13 @@ def __init__(
self.loss_func = loss_func
self.metric = metric
self.beta = beta
if isinstance(property_name, str):
property_name = [property_name]
if isinstance(property_dim, int):
property_dim = [property_dim]
self.property_name = property_name
assert self.task_dim == sum(property_dim)
self.property_name_dim_mapping = dict(zip(property_name, property_dim))

def forward(self, input_dict, model, label, natoms, learning_rate=0.0, mae=False):
"""Return loss on properties .
@@ -69,34 +81,52 @@ def forward(self, input_dict, model, label, natoms, learning_rate=0.0, mae=False
Other losses for display.
"""
model_pred = model(**input_dict)
assert label["property"].shape[-1] == self.task_dim
assert model_pred["property"].shape[-1] == self.task_dim
nbz = model_pred["property"].shape[0]
assert model_pred["property"].shape == (nbz, self.task_dim)

concat_property = []
for property_name in self.property_name:
assert label[property_name].shape == (
nbz,
self.property_name_dim_mapping[property_name],
)
concat_property.append(label[property_name])
label["property"] = torch.cat(concat_property, dim=1)
assert label["property"].shape == (nbz, self.task_dim)

out_std = model.atomic_model.out_std[0][0]
out_bias = model.atomic_model.out_bias[0][0]
assert len(out_std.shape) == 1
assert out_std.shape[0] == self.task_dim

loss = torch.zeros(1, dtype=env.GLOBAL_PT_FLOAT_PRECISION, device=env.DEVICE)[0]
more_loss = {}

# loss
if self.loss_func == "smooth_mae":
loss += F.smooth_l1_loss(
label["property"],
model_pred["property"],
(label["property"] - out_bias) / out_std,
(model_pred["property"] - out_bias) / out_std,
reduction="sum",
beta=self.beta,
)
elif self.loss_func == "mae":
loss += F.l1_loss(
label["property"], model_pred["property"], reduction="sum"
(label["property"] - out_bias) / out_std,
(model_pred["property"] - out_bias) / out_std,
reduction="sum",
)
elif self.loss_func == "mse":
loss += F.mse_loss(
label["property"],
model_pred["property"],
(label["property"] - out_bias) / out_std,
(model_pred["property"] - out_bias) / out_std,
reduction="sum",
)
elif self.loss_func == "rmse":
loss += torch.sqrt(
F.mse_loss(
label["property"],
model_pred["property"],
(label["property"] - out_bias) / out_std,
(model_pred["property"] - out_bias) / out_std,
reduction="mean",
)
)
@@ -138,13 +168,14 @@ def forward(self, input_dict, model, label, natoms, learning_rate=0.0, mae=False
def label_requirement(self) -> list[DataRequirementItem]:
"""Return data label requirements needed for this loss calculation."""
label_requirement = []
label_requirement.append(
DataRequirementItem(
"property",
ndof=self.task_dim,
atomic=False,
must=False,
high_prec=True,
for property_name in self.property_name:
label_requirement.append(
DataRequirementItem(
property_name,
ndof=self.property_name_dim_mapping[property_name],
atomic=False,
must=True,
high_prec=True,
)
)
)
return label_requirement
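Together with the argcheck additions further down, a hypothetical loss section of an input script using the new arguments might look as follows; the property names and dims are illustrative, and `property_dim` must sum to the fitting net's `task_dim` so the labels can be concatenated as in `forward` above:

```python
# Sketch of a loss configuration under the new interface (illustrative values).
loss = {
    "type": "property",
    "loss_func": "smooth_mae",
    "metric": ["mae"],
    "beta": 1.00,
    "property_name": ["band_gap", "humo"],  # label keys expected in the dataset
    "property_dim": [1, 1],                 # sums to task_dim = 2
}
```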
16 changes: 4 additions & 12 deletions deepmd/pt/model/atomic_model/property_atomic_model.py
@@ -35,15 +35,7 @@ def apply_out_stat(
The atom types. nf x nloc

"""
if self.fitting_net.get_bias_method() == "normal":
out_bias, out_std = self._fetch_out_stat(self.bias_keys)
for kk in self.bias_keys:
# nf x nloc x odims, out_bias: ntypes x odims
ret[kk] = ret[kk] + out_bias[kk][atype]
return ret
elif self.fitting_net.get_bias_method() == "no_bias":
return ret
else:
raise NotImplementedError(
"Only 'normal' and 'no_bias' is supported for parameter 'bias_method'."
)
out_bias, out_std = self._fetch_out_stat(self.bias_keys)
for kk in self.bias_keys:
ret[kk] = ret[kk] * out_std[kk][0] + out_bias[kk][0]
return ret
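The refactor removes the 'normal'/'no_bias' switch: predictions are now always de-standardized with the stored statistics, mirroring the loss above, which trains in the standardized space. A toy numeric check of the new line, with made-up values:

```python
# Minimal sketch of the new apply_out_stat arithmetic (values are made up).
import numpy as np

out_std, out_bias = np.array([2.0]), np.array([0.5])
ret = np.array([[[-0.25], [1.0]]])   # nf x nloc x odim, standardized output
restored = ret * out_std + out_bias  # mirrors ret[kk] * out_std[kk][0] + out_bias[kk][0]
print(restored)                      # [[[0.] [2.5]]]
```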
4 changes: 4 additions & 0 deletions deepmd/pt/model/task/property.py
@@ -2,6 +2,7 @@
import logging
from typing import (
Optional,
Union,
)

import torch
@@ -82,6 +83,7 @@ def __init__(
bias_atom_p: Optional[torch.Tensor] = None,
intensive: bool = False,
bias_method: str = "normal",
property_name: Union[str, list] = "property",
resnet_dt: bool = True,
numb_fparam: int = 0,
numb_aparam: int = 0,
@@ -95,6 +97,7 @@
self.task_dim = task_dim
self.intensive = intensive
self.bias_method = bias_method
self.property_name = property_name
super().__init__(
var_name="property",
ntypes=ntypes,
@@ -126,6 +129,7 @@ def output_def(self) -> FittingOutputDef:
r_differentiable=False,
c_differentiable=False,
intensive=self.intensive,
sub_var_name=self.property_name,
),
]
)
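For completeness, a hypothetical fitting_net section consistent with the new `property_name` argument (values are illustrative; see the argcheck hunk below for the accepted types):

```python
# Sketch of a property fitting configuration (illustrative values).
fitting_net = {
    "type": "property",
    "task_dim": 2,                          # total output width
    "intensive": True,
    "property_name": ["band_gap", "humo"],  # forwarded to sub_var_name in output_def
    "neuron": [240, 240, 240],
    "resnet_dt": True,
}
```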
61 changes: 48 additions & 13 deletions deepmd/pt/utils/stat.py
@@ -29,6 +29,7 @@
from deepmd.utils.out_stat import (
compute_stats_from_atomic,
compute_stats_from_redu,
compute_stats_property,
)
from deepmd.utils.path import (
DPPath,
@@ -290,6 +291,12 @@
# remove the keys that are not in the sample
keys = [keys] if isinstance(keys, str) else keys
assert isinstance(keys, list)
if "property" in atomic_output.var_defs:
sub_keys = []
for key in keys:
if atomic_output.var_defs[key].sub_var_name is not None:
sub_keys.extend(atomic_output.var_defs[key].sub_var_name)
keys.extend(sub_keys)
new_keys = [
ii
for ii in keys
@@ -373,6 +380,16 @@

# merge global/atomic bias
bias_atom_e, std_atom_e = {}, {}
keys = (
["property"]
if (
"property" in atomic_output.var_defs
and (
ii in keys for ii in atomic_output.var_defs["property"].sub_var_name
)
)
else keys
)
for kk in keys:
# use atomic bias whenever available
if kk in bias_atom_a:
@@ -476,26 +493,44 @@
std_atom_e = {}
for kk in keys:
if kk in stats_input:
if atomic_output is not None and atomic_output.get_data()[kk].intensive:
task_dim = stats_input[kk].shape[1]
assert merged_natoms[kk].shape == (nf[kk], ntypes)
stats_input[kk] = (
merged_natoms[kk].sum(axis=1).reshape(-1, 1) * stats_input[kk]
if "property" in atomic_output.var_defs:
bias_atom_e[kk], std_atom_e[kk] = compute_stats_property(
stats_input[kk],
merged_natoms[kk],
assigned_bias=assigned_atom_ener[kk],
)
else:
bias_atom_e[kk], std_atom_e[kk] = compute_stats_from_redu(
stats_input[kk],
merged_natoms[kk],
assigned_bias=assigned_atom_ener[kk],
rcond=rcond,
)
assert stats_input[kk].shape == (nf[kk], task_dim)
bias_atom_e[kk], std_atom_e[kk] = compute_stats_from_redu(
stats_input[kk],
merged_natoms[kk],
assigned_bias=assigned_atom_ener[kk],
rcond=rcond,
)
else:
# this key does not have global labels, skip it.
continue
if "property" in atomic_output.var_defs:
concat_bias = []
concat_std = []
for ii in atomic_output.var_defs["property"].sub_var_name:
assert ii in bias_atom_e.keys()
assert ii in std_atom_e.keys()
concat_bias.append(bias_atom_e[ii])
concat_std.append(std_atom_e[ii])
del bias_atom_e, std_atom_e
bias_atom_e = {}
std_atom_e = {}
bias_atom_e["property"] = np.concatenate(concat_bias, axis=-1)
std_atom_e["property"] = np.concatenate(concat_std, axis=-1)
std_atom_e["property"] = np.tile(
std_atom_e["property"], (bias_atom_e["property"].shape[0], 1)
)

return bias_atom_e, std_atom_e

bias_atom_e, std_atom_e = _post_process_stat(bias_atom_e, std_atom_e)

# unbias_e is only used for print rmse

if model_pred is None:
unbias_e = {
kk: merged_natoms[kk] @ bias_atom_e[kk].reshape(ntypes, -1)
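To make the concatenation step above concrete, a small numpy sketch with made-up numbers: per-property biases of shape (ntypes, odim) are joined along the last axis into a single "property" entry, and the concatenated std row is tiled to the same number of rows:

```python
# Sketch of the bias/std concatenation for sub-properties (made-up numbers).
import numpy as np

ntypes = 2
bias_atom_e = {"band_gap": np.full((ntypes, 1), 1.5), "humo": np.zeros((ntypes, 1))}
std_atom_e = {"band_gap": np.array([0.3]), "humo": np.array([0.7])}

concat_bias = np.concatenate([bias_atom_e["band_gap"], bias_atom_e["humo"]], axis=-1)
concat_std = np.tile(
    np.concatenate([std_atom_e["band_gap"], std_atom_e["humo"]], axis=-1),
    (concat_bias.shape[0], 1),
)
print(concat_bias.shape, concat_std.shape)  # (2, 2) (2, 2)
```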
24 changes: 24 additions & 0 deletions deepmd/utils/argcheck.py
@@ -1581,6 +1581,7 @@ def fitting_property():
doc_task_dim = "The dimension of outputs of fitting net"
doc_intensive = "Whether the fitting property is intensive"
doc_bias_method = "The method of applying the bias to each atomic output, user can select 'normal' or 'no_bias'. If 'no_bias' is used, no bias will be added to the atomic output."
doc_property_name = "TODO"
return [
Argument("numb_fparam", int, optional=True, default=0, doc=doc_numb_fparam),
Argument("numb_aparam", int, optional=True, default=0, doc=doc_numb_aparam),
@@ -1614,6 +1615,13 @@ def fitting_property():
Argument(
"bias_method", str, optional=True, default="normal", doc=doc_bias_method
),
Argument(
"property_name",
[str, list],
optional=True,
default="property",
doc=doc_property_name,
),
]


@@ -2481,6 +2489,8 @@ def loss_property():
doc_loss_func = "The loss function to minimize, such as 'mae','smooth_mae'."
doc_metric = "The metric for display. This list can include 'smooth_mae', 'mae', 'mse' and 'rmse'."
doc_beta = "The 'beta' parameter in 'smooth_mae' loss."
doc_property_name = "The names of fitting properties, which should be consistent with the property names in the dataset."
doc_property_dim = "The dimensions of fitting properties, which should be consistent with the property dimensions in the dataset."
return [
Argument(
"loss_func",
@@ -2503,6 +2513,20 @@
default=1.00,
doc=doc_beta,
),
Argument(
"property_name",
[str, list],
optional=True,
default="property",
doc=doc_property_name,
),
Argument(
"property_dim",
[int, list],
optional=True,
default=1,
doc=doc_property_dim,
),
]


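These arguments expose what the first commit in this PR ("change property.npy to any name") started: property labels no longer have to live in a file named property.npy. A hypothetical dataset layout matching property_name = ["band_gap", "humo"] (paths are illustrative):

```python
# Hypothetical dataset layout; one label file per entry in property_name.
# data/set.000/coord.npy      -> (nframes, natoms * 3)
# data/set.000/box.npy        -> (nframes, 9)
# data/set.000/band_gap.npy   -> (nframes, 1)
# data/set.000/humo.npy       -> (nframes, 1)
```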
51 changes: 49 additions & 2 deletions deepmd/utils/out_stat.py
@@ -33,8 +33,6 @@ def compute_stats_from_redu(
The assigned output bias, shape is [ntypes, *(odim0, odim1, ...)].
Set to a tensor of shape (odim0, odim1, ...) filled with nan if the bias
of the type is not assigned.
rcond
Cut-off ratio for small singular values of a.

Returns
-------
@@ -130,3 +128,52 @@
output[mask].std(axis=0) if output[mask].size > 0 else np.nan
)
return output_bias, output_std


def compute_stats_property(
output_redu: np.ndarray,
natoms: np.ndarray,
assigned_bias: Optional[np.ndarray] = None,
) -> tuple[np.ndarray, np.ndarray]:
"""Compute the output statistics.

Given the per-frame reduced output values and the per-frame atom counts,
compute the mean as the atomic output bias and the standard deviation as the std.

Parameters
----------
output_redu
The reduced output value, shape is [nframes, *(odim0, odim1, ...)].
natoms
The number of atoms of each type for each frame, shape is [nframes, ntypes].
assigned_bias
The assigned output bias, shape is [ntypes, *(odim0, odim1, ...)].
Set to a tensor of shape (odim0, odim1, ...) filled with nan if the bias
of the type is not assigned.

Returns
-------
np.ndarray
The computed output bias, shape is [ntypes, *(odim0, odim1, ...)].
np.ndarray
The computed output std, shape is [*(odim0, odim1, ...)].
"""
natoms = np.array(natoms) # [nf, ntypes]
nf, ntypes = natoms.shape
output_redu = np.array(output_redu)
var_shape = list(output_redu.shape[1:])
output_redu = output_redu.reshape(nf, -1)
# check shape
assert output_redu.ndim == 2
assert natoms.ndim == 2
assert output_redu.shape[0] == natoms.shape[0] # [nf,1]

computed_output_bias = np.repeat(
np.mean(output_redu, axis=0)[np.newaxis, :], ntypes, axis=0
)
output_std = np.std(output_redu, axis=0)

computed_output_bias = computed_output_bias.reshape([natoms.shape[1]] + var_shape) # noqa: RUF005
output_std = output_std.reshape(var_shape)

return computed_output_bias, output_std
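A toy check of compute_stats_property as added above, assuming it is importable from deepmd.utils.out_stat: the bias is the per-frame mean repeated once per atom type, and the std is the per-frame standard deviation.

```python
# Sketch: verify the mean/std behaviour of compute_stats_property (toy data).
import numpy as np

from deepmd.utils.out_stat import compute_stats_property

output_redu = np.array([[1.0], [3.0]])  # nframes = 2, odim = 1
natoms = np.array([[2, 1], [2, 1]])     # nframes x ntypes, ntypes = 2
bias, std = compute_stats_property(output_redu, natoms)
assert bias.shape == (2, 1) and np.allclose(bias, 2.0)  # mean of [1, 3]
assert std.shape == (1,) and np.allclose(std, 1.0)      # np.std([1, 3])
```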