Refactor property fitting interface #4471

Merged: 97 commits, Dec 25, 2024
Changes from 9 commits
Commits
4a69e22
change property.npy to any name
Chengqian-Zhang Dec 8, 2024
157f70c
Init branch
Chengqian-Zhang Dec 13, 2024
6172dc4
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 13, 2024
0608e17
Merge branch 'devel' into refactor_property
Chengqian-Zhang Dec 13, 2024
4c2033f
change | to Union
Chengqian-Zhang Dec 13, 2024
ac5fa5c
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 13, 2024
69803d6
change sub_var_name default to []
Chengqian-Zhang Dec 13, 2024
1fcf82c
Solve pre-commit
Chengqian-Zhang Dec 13, 2024
5be3457
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 13, 2024
9637b6a
solve scanning github
Chengqian-Zhang Dec 13, 2024
5ce6d31
fix UT
Chengqian-Zhang Dec 13, 2024
5a3bf94
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 13, 2024
13e1911
delete useless file
Chengqian-Zhang Dec 13, 2024
ff4650e
Solve some UT
Chengqian-Zhang Dec 13, 2024
9637390
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 13, 2024
4edf26f
Solve precommit
Chengqian-Zhang Dec 13, 2024
4d35df2
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 13, 2024
7f09038
slove pre
Chengqian-Zhang Dec 13, 2024
32e6deb
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 13, 2024
8fec403
Solve dptest UT, dpatomicmodel UT, code scannisang
Chengqian-Zhang Dec 14, 2024
ba54bcc
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 14, 2024
b52065c
delete param and
Chengqian-Zhang Dec 15, 2024
c0f09e6
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 15, 2024
2566b2e
Solve UT fail caused by task_dim and property_name
Chengqian-Zhang Dec 15, 2024
3af4970
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 15, 2024
b1e834a
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 15, 2024
e33d6e6
Fix UT
Chengqian-Zhang Dec 15, 2024
5e1e892
Solve conflict
Chengqian-Zhang Dec 15, 2024
cdd4e18
Fix UT
Chengqian-Zhang Dec 15, 2024
86f6744
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 15, 2024
d75ea7a
Fix UT
Chengqian-Zhang Dec 15, 2024
0dbd2b4
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 15, 2024
1cc5c37
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 15, 2024
44fa4d6
Fix permutation error
Chengqian-Zhang Dec 15, 2024
3249891
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 15, 2024
8e9bbc5
Add property bias UT
Chengqian-Zhang Dec 15, 2024
1b8d92f
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 15, 2024
c83d91e
recover rcond doc
Chengqian-Zhang Dec 15, 2024
63cec86
recover blank
Chengqian-Zhang Dec 15, 2024
99df857
Change code according according to coderabbitai
Chengqian-Zhang Dec 15, 2024
4155294
solve pre-commit
Chengqian-Zhang Dec 15, 2024
a1a6583
Fix UT
Chengqian-Zhang Dec 15, 2024
2859f02
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 15, 2024
5d3c50b
change apply_bias doc
Chengqian-Zhang Dec 15, 2024
1094032
update the version compatibility
Chengqian-Zhang Dec 16, 2024
c9b7ab4
Merge branch 'devel' into refactor_property
Chengqian-Zhang Dec 18, 2024
15eb6d0
delete sub_var_name
Chengqian-Zhang Dec 18, 2024
dbf394c
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 18, 2024
fd42d53
recover to property key
Chengqian-Zhang Dec 18, 2024
e48eb8b
Solve conflict
Chengqian-Zhang Dec 18, 2024
5036545
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 18, 2024
a125d38
Fix conflict
Chengqian-Zhang Dec 18, 2024
bff3378
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 18, 2024
85f9166
Merge branch 'devel' into refactor_property
Chengqian-Zhang Dec 18, 2024
d384f62
Fix UT
Chengqian-Zhang Dec 20, 2024
d43da75
Add document of property fitting
Chengqian-Zhang Dec 20, 2024
f4aeeaf
Delete checkpoint
Chengqian-Zhang Dec 20, 2024
4613425
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 20, 2024
f07f1b2
Add get_property_name to DeepEvalBackend
Chengqian-Zhang Dec 20, 2024
120d9d8
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 20, 2024
2dc6486
Merge branch 'devel' into refactor_property
Chengqian-Zhang Dec 22, 2024
1d2d866
change doc to py
Chengqian-Zhang Dec 22, 2024
a50d363
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 22, 2024
2e18007
Add out_bias out_std doc
Chengqian-Zhang Dec 22, 2024
4ff2a23
Merge branch 'devel' into refactor_property
Chengqian-Zhang Dec 23, 2024
eff7eb1
change bias method to compute_stats_do_not_distinguish_types
Chengqian-Zhang Dec 23, 2024
c9e47a3
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 23, 2024
28ef352
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 23, 2024
e74a6ac
change var_name to property_name
Chengqian-Zhang Dec 23, 2024
6615837
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 23, 2024
a88a655
change logic of extensive bias
Chengqian-Zhang Dec 23, 2024
d10a4ba
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 23, 2024
9219ee1
add doc for neww added parameter
Chengqian-Zhang Dec 23, 2024
35bb6a7
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 23, 2024
ec3369d
change doc for compute_stats_do_not_distinguish_types
Chengqian-Zhang Dec 23, 2024
ed59869
try to fix dptest
Chengqian-Zhang Dec 23, 2024
15846fd
change all property to property_name
Chengqian-Zhang Dec 23, 2024
34cc7ce
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 23, 2024
a1d2608
Fix UT
Chengqian-Zhang Dec 23, 2024
c70a887
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 23, 2024
c3f82aa
Delete key 'property' completely
Chengqian-Zhang Dec 23, 2024
a392718
Solve conflict
Chengqian-Zhang Dec 23, 2024
9bf75cb
Fix UT
Chengqian-Zhang Dec 23, 2024
6e428a7
Fix dptest UT
Chengqian-Zhang Dec 23, 2024
14a9a89
Merge branch 'devel' into refactor_property
Chengqian-Zhang Dec 24, 2024
f963383
Merge branch 'devel' into refactor_property
Chengqian-Zhang Dec 24, 2024
9e09892
Delete attribute
Chengqian-Zhang Dec 24, 2024
d097bfa
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 24, 2024
f7cf9e1
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 24, 2024
f9d1b55
Solve comment
Chengqian-Zhang Dec 24, 2024
9ae4dfd
Merge branch 'refactor_property' of github.com:Chengqian-Zhang/deepmd…
Chengqian-Zhang Dec 24, 2024
ce24784
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 24, 2024
211ec00
Solve error
Chengqian-Zhang Dec 24, 2024
42cfe99
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 24, 2024
b473b6a
delete property_name in serialize
Chengqian-Zhang Dec 24, 2024
af8789f
Update train-fitting-property.md
Chengqian-Zhang Dec 25, 2024
6638ba6
Update train-fitting-property.md
Chengqian-Zhang Dec 25, 2024
5 changes: 2 additions & 3 deletions deepmd/dpmodel/fitting/property_fitting.py
@@ -88,9 +88,8 @@ def __init__(
) -> None:
self.task_dim = task_dim
self.intensive = intensive
self.property_name = property_name
super().__init__(
var_name=self.property_name,
var_name=property_name,
ntypes=ntypes,
dim_descrpt=dim_descrpt,
dim_out=task_dim,
@@ -131,7 +130,7 @@ def serialize(self) -> dict:
"type": "property",
"task_dim": self.task_dim,
"intensive": self.intensive,
"property_name": self.property_name,
"property_name": self.var_name,
}
dd["@version"] = 4

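The hunks above follow one pattern: the fitting subclass stops keeping its own `property_name` attribute and instead passes the name to the base class as `var_name`, which `serialize()` then reads back. A minimal sketch of that pattern, with hypothetical, heavily simplified stand-ins for the real classes:

```python
# Hypothetical, simplified stand-ins for GeneralFitting / PropertyFitting;
# only the attribute-ownership pattern from the diff is reproduced here.

class GeneralFitting:
    def __init__(self, var_name: str):
        self.var_name = var_name  # single source of truth for the output name


class PropertyFitting(GeneralFitting):
    def __init__(self, property_name: str, task_dim: int = 1, intensive: bool = False):
        self.task_dim = task_dim
        self.intensive = intensive
        # no `self.property_name = property_name` anymore:
        # the base class stores the name as `var_name`
        super().__init__(var_name=property_name)

    def serialize(self) -> dict:
        return {
            "type": "property",
            "task_dim": self.task_dim,
            "intensive": self.intensive,
            "property_name": self.var_name,  # read back from the base attribute
            "@version": 4,
        }


fit = PropertyFitting("band_gap")
print(fit.serialize()["property_name"])  # -> band_gap
```

Keeping the name only on the base class avoids two attributes drifting apart, while the serialized key stays `property_name` for on-disk compatibility.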
16 changes: 8 additions & 8 deletions deepmd/entrypoints/test.py
@@ -779,12 +779,12 @@ def test_property(
tuple[list[np.ndarray], list[int]]
arrays with results and their shapes
"""
property_name = dp.get_property_name()
assert isinstance(property_name, str)
data.add(property_name, dp.task_dim, atomic=False, must=True, high_prec=True)
var_name = dp.get_var_name()
assert isinstance(var_name, str)
data.add(var_name, dp.task_dim, atomic=False, must=True, high_prec=True)
if has_atom_property:
data.add(
f"atom_{property_name}",
f"atom_{var_name}",
dp.task_dim,
atomic=True,
must=False,
@@ -840,12 +840,12 @@ def test_property(
aproperty = ret[1]
aproperty = aproperty.reshape([numb_test, natoms * dp.task_dim])

diff_property = property - test_data[property_name][:numb_test]
diff_property = property - test_data[var_name][:numb_test]
mae_property = mae(diff_property)
rmse_property = rmse(diff_property)

if has_atom_property:
diff_aproperty = aproperty - test_data[f"atom_{property_name}"][:numb_test]
diff_aproperty = aproperty - test_data[f"atom_{var_name}"][:numb_test]
mae_aproperty = mae(diff_aproperty)
rmse_aproperty = rmse(diff_aproperty)

@@ -862,7 +862,7 @@ def test_property(
detail_path = Path(detail_file)

for ii in range(numb_test):
test_out = test_data[property_name][ii].reshape(-1, 1)
test_out = test_data[var_name][ii].reshape(-1, 1)
pred_out = property[ii].reshape(-1, 1)

frame_output = np.hstack((test_out, pred_out))
@@ -876,7 +876,7 @@

if has_atom_property:
for ii in range(numb_test):
test_out = test_data[f"atom_{property_name}"][ii].reshape(-1, 1)
test_out = test_data[f"atom_{var_name}"][ii].reshape(-1, 1)
pred_out = aproperty[ii].reshape(-1, 1)

frame_output = np.hstack((test_out, pred_out))
4 changes: 2 additions & 2 deletions deepmd/infer/deep_eval.py
@@ -274,9 +274,9 @@ def get_has_spin(self) -> bool:
"""Check if the model has spin atom types."""
return False

def get_property_name(self) -> str:
def get_var_name(self) -> str:
"""Get the name of the fitting property."""
return NotImplementedError
raise NotImplementedError

@abstractmethod
def get_ntypes_spin(self) -> int:
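Besides the rename, this hunk fixes a real bug: `return NotImplementedError` hands the caller the exception *class* as a value instead of signalling "unimplemented". A short sketch (hypothetical minimal classes) of why `raise` is the correct form:

```python
# Hypothetical minimal classes contrasting the buggy and fixed base methods.

class Buggy:
    def get_var_name(self) -> str:
        return NotImplementedError  # bug: returns the class object, no error


class Fixed:
    def get_var_name(self) -> str:
        raise NotImplementedError  # correct: callers get an exception


# The buggy version "succeeds" and yields a class, not a string:
assert Buggy().get_var_name() is NotImplementedError

# The fixed version fails loudly, as an abstract default should:
try:
    Fixed().get_var_name()
    print("not reached")
except NotImplementedError:
    print("raised as expected")
```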
18 changes: 9 additions & 9 deletions deepmd/infer/deep_property.py
@@ -56,7 +56,7 @@ def change_output_def(self) -> None:
FittingOutputDef(
[
OutputVariableDef(
self.get_property_name(),
self.get_var_name(),
shape=[self.get_task_dim()],
reducible=True,
atomic=True,
@@ -66,11 +66,11 @@ def change_output_def(self) -> None:
)
)
self.deep_eval.output_def = self.output_def
self.deep_eval._OUTDEF_DP2BACKEND[self.get_property_name()] = (
f"atom_{self.get_property_name()}"
self.deep_eval._OUTDEF_DP2BACKEND[self.get_var_name()] = (
f"atom_{self.get_var_name()}"
)
self.deep_eval._OUTDEF_DP2BACKEND[f"{self.get_property_name()}_redu"] = (
self.get_property_name()
self.deep_eval._OUTDEF_DP2BACKEND[f"{self.get_var_name()}_redu"] = (
self.get_var_name()
)

@property
@@ -136,10 +136,10 @@ def eval(
aparam=aparam,
**kwargs,
)
atomic_property = results[self.get_property_name()].reshape(
atomic_property = results[self.get_var_name()].reshape(
nframes, natoms, self.get_task_dim()
)
property = results[f"{self.get_property_name()}_redu"].reshape(
property = results[f"{self.get_var_name()}_redu"].reshape(
nframes, self.get_task_dim()
)

@@ -159,9 +159,9 @@ def get_intensive(self) -> bool:
"""Get whether the property is intensive."""
return self.deep_eval.get_intensive()

def get_property_name(self) -> str:
def get_var_name(self) -> str:
"""Get the name of the fitting property."""
return self.deep_eval.get_property_name()
return self.deep_eval.get_var_name()


__all__ = ["DeepProperty"]
11 changes: 8 additions & 3 deletions deepmd/pt/infer/deep_eval.py
@@ -184,9 +184,9 @@ def get_dim_aparam(self) -> int:
def get_intensive(self) -> bool:
return self.dp.model["Default"].get_intensive()

def get_property_name(self) -> str:
def get_var_name(self) -> str:
"""Get the name of the property."""
return self.dp.model["Default"].get_property_name()
if hasattr(self.dp.model["Default"], "get_var_name") and callable(
getattr(self.dp.model["Default"], "get_var_name")
):
return self.dp.model["Default"].get_var_name()
else:
raise NotImplementedError

@property
def model_type(self) -> type["DeepEvalWrapper"]:
@@ -204,7 +209,7 @@ def model_type(self) -> type["DeepEvalWrapper"]:
return DeepGlobalPolar
elif "wfc" in model_output_type:
return DeepWFC
elif self.dp.model["Default"].get_property_name() in model_output_type:
elif self.get_var_name() in model_output_type:
return DeepProperty
else:
raise RuntimeError("Unknown model type")
51 changes: 27 additions & 24 deletions deepmd/pt/loss/property.py
@@ -24,7 +24,7 @@ class PropertyLoss(TaskLoss):
def __init__(
self,
task_dim,
property_name: str,
var_name: str,
loss_func: str = "smooth_mae",
metric: list = ["mae"],
beta: float = 1.00,
@@ -39,6 +39,8 @@ def __init__(
----------
task_dim : float
The output dimension of property fitting net.
var_name : str
The atomic property to fit, 'energy', 'dipole', and 'polar'.
loss_func : str
The loss function, such as "smooth_mae", "mae", "rmse".
metric : list
@@ -59,10 +61,10 @@ def __init__(
self.loss_func = loss_func
self.metric = metric
self.beta = beta
self.property_name = property_name
self.out_bias = out_bias
self.out_std = out_std
self.intensive = intensive
self.var_name = var_name

def forward(self, input_dict, model, label, natoms, learning_rate=0.0, mae=False):
"""Return loss on properties .
@@ -88,12 +90,13 @@ def forward(self, input_dict, model, label, natoms, learning_rate=0.0, mae=False
Other losses for display.
"""
model_pred = model(**input_dict)
nbz = model_pred[self.property_name].shape[0]
assert model_pred[self.property_name].shape == (nbz, self.task_dim)
assert label[self.property_name].shape == (nbz, self.task_dim)
var_name = self.var_name
nbz = model_pred[var_name].shape[0]
assert model_pred[var_name].shape == (nbz, self.task_dim)
assert label[var_name].shape == (nbz, self.task_dim)
if not self.intensive:
model_pred[self.property_name] = model_pred[self.property_name] / natoms
label[self.property_name] = label[self.property_name] / natoms
model_pred[var_name] = model_pred[var_name] / natoms
label[var_name] = label[var_name] / natoms

if self.out_std is None:
out_std = model.atomic_model.out_std[0][0]
@@ -123,28 +126,28 @@ def forward(self, input_dict, model, label, natoms, learning_rate=0.0, mae=False
# loss
if self.loss_func == "smooth_mae":
loss += F.smooth_l1_loss(
(label[self.property_name] - out_bias) / out_std,
(model_pred[self.property_name] - out_bias) / out_std,
(label[var_name] - out_bias) / out_std,
(model_pred[var_name] - out_bias) / out_std,
reduction="sum",
beta=self.beta,
)
elif self.loss_func == "mae":
loss += F.l1_loss(
(label[self.property_name] - out_bias) / out_std,
(model_pred[self.property_name] - out_bias) / out_std,
(label[var_name] - out_bias) / out_std,
(model_pred[var_name] - out_bias) / out_std,
reduction="sum",
)
elif self.loss_func == "mse":
loss += F.mse_loss(
(label[self.property_name] - out_bias) / out_std,
(model_pred[self.property_name] - out_bias) / out_std,
(label[var_name] - out_bias) / out_std,
(model_pred[var_name] - out_bias) / out_std,
reduction="sum",
)
elif self.loss_func == "rmse":
loss += torch.sqrt(
F.mse_loss(
(label[self.property_name] - out_bias) / out_std,
(model_pred[self.property_name] - out_bias) / out_std,
(label[var_name] - out_bias) / out_std,
(model_pred[var_name] - out_bias) / out_std,
reduction="mean",
)
)
@@ -154,28 +157,28 @@ def forward(self, input_dict, model, label, natoms, learning_rate=0.0, mae=False
# more loss
if "smooth_mae" in self.metric:
more_loss["smooth_mae"] = F.smooth_l1_loss(
label[self.property_name],
model_pred[self.property_name],
label[var_name],
model_pred[var_name],
reduction="mean",
beta=self.beta,
).detach()
if "mae" in self.metric:
more_loss["mae"] = F.l1_loss(
label[self.property_name],
model_pred[self.property_name],
label[var_name],
model_pred[var_name],
reduction="mean",
).detach()
if "mse" in self.metric:
more_loss["mse"] = F.mse_loss(
label[self.property_name],
model_pred[self.property_name],
label[var_name],
model_pred[var_name],
reduction="mean",
).detach()
if "rmse" in self.metric:
more_loss["rmse"] = torch.sqrt(
F.mse_loss(
label[self.property_name],
model_pred[self.property_name],
label[var_name],
model_pred[var_name],
reduction="mean",
)
).detach()
@@ -188,7 +191,7 @@ def label_requirement(self) -> list[DataRequirementItem]:
label_requirement = []
label_requirement.append(
DataRequirementItem(
self.property_name,
self.var_name,
ndof=self.task_dim,
atomic=False,
must=True,
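The loss hunks above shift both label and prediction by the output bias and scale them by the output std before applying smooth L1 via `F.smooth_l1_loss`. A pure-Python sketch of that computation (a stand-in for the torch call; `out_bias`, `out_std`, and the function names here are illustrative, not the real API):

```python
# Pure-Python stand-in for torch.nn.functional.smooth_l1_loss as used above:
# quadratic near zero, linear in the tails, controlled by `beta`.

def smooth_l1(diff: float, beta: float = 1.0) -> float:
    a = abs(diff)
    return 0.5 * a * a / beta if a < beta else a - 0.5 * beta


def property_loss(labels, preds, out_bias=0.0, out_std=1.0, beta=1.0):
    # Both sides are normalized as (x - out_bias) / out_std before the
    # elementwise loss, mirroring the diff's smooth_mae branch.
    return sum(
        smooth_l1((l - out_bias) / out_std - (p - out_bias) / out_std, beta)
        for l, p in zip(labels, preds)
    )


print(property_loss([1.0, 2.0], [1.0, 2.0]))  # -> 0.0
```

Note that the bias cancels in the difference, so normalization effectively rescales the residual by `out_std`, which keeps the loss magnitude comparable across properties with different units.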
26 changes: 9 additions & 17 deletions deepmd/pt/model/model/property_model.py
@@ -37,8 +37,8 @@ def __init__(
def translated_output_def(self):
out_def_data = self.model_output_def().get_data()
output_def = {
f"atom_{self.get_property_name()}": out_def_data[self.get_property_name()],
self.get_property_name(): out_def_data[f"{self.get_property_name()}_redu"],
f"atom_{self.get_var_name()}": out_def_data[self.get_var_name()],
self.get_var_name(): out_def_data[f"{self.get_var_name()}_redu"],
}
if "mask" in out_def_data:
output_def["mask"] = out_def_data["mask"]
@@ -62,12 +62,8 @@ def forward(
do_atomic_virial=do_atomic_virial,
)
model_predict = {}
model_predict[f"atom_{self.get_property_name()}"] = model_ret[
self.get_property_name()
]
model_predict[self.get_property_name()] = model_ret[
f"{self.get_property_name()}_redu"
]
model_predict[f"atom_{self.get_var_name()}"] = model_ret[self.get_var_name()]
model_predict[self.get_var_name()] = model_ret[f"{self.get_var_name()}_redu"]
if "mask" in model_ret:
model_predict["mask"] = model_ret["mask"]
return model_predict
@@ -80,12 +76,12 @@ def get_task_dim(self) -> int:
@torch.jit.export
def get_intensive(self) -> bool:
"""Get whether the property is intensive."""
return self.model_output_def()[self.get_property_name()].intensive
return self.model_output_def()[self.get_var_name()].intensive

@torch.jit.export
def get_property_name(self) -> str:
def get_var_name(self) -> str:
"""Get the name of the property."""
return self.get_fitting_net().property_name
return self.get_fitting_net().var_name

@torch.jit.export
def forward_lower(
@@ -111,12 +107,8 @@ def forward_lower(
extra_nlist_sort=self.need_sorted_nlist_for_lower(),
)
model_predict = {}
model_predict[f"atom_{self.get_property_name()}"] = model_ret[
self.get_property_name()
]
model_predict[self.get_property_name()] = model_ret[
f"{self.get_property_name()}_redu"
]
model_predict[f"atom_{self.get_var_name()}"] = model_ret[self.get_var_name()]
model_predict[self.get_var_name()] = model_ret[f"{self.get_var_name()}_redu"]
if "mask" in model_ret:
model_predict["mask"] = model_ret["mask"]
return model_predict
5 changes: 2 additions & 3 deletions deepmd/pt/model/task/property.py
@@ -93,9 +93,8 @@ def __init__(
) -> None:
self.task_dim = task_dim
self.intensive = intensive
self.property_name = property_name
super().__init__(
var_name=self.property_name,
var_name=property_name,
ntypes=ntypes,
dim_descrpt=dim_descrpt,
dim_out=task_dim,
@@ -147,7 +146,7 @@ def serialize(self) -> dict:
"type": "property",
"task_dim": self.task_dim,
"intensive": self.intensive,
"property_name": self.property_name,
"property_name": self.var_name,
}
dd["@version"] = 4

4 changes: 2 additions & 2 deletions deepmd/pt/train/training.py
@@ -1240,10 +1240,10 @@ def get_loss(loss_params, start_lr, _ntypes, _model):
return TensorLoss(**loss_params)
elif loss_type == "property":
task_dim = _model.get_task_dim()
property_name = _model.get_property_name()
var_name = _model.get_var_name()
intensive = _model.get_intensive()
loss_params["task_dim"] = task_dim
loss_params["property_name"] = property_name
loss_params["var_name"] = var_name
loss_params["intensive"] = intensive
return PropertyLoss(**loss_params)
else:
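In the training hunk above, the `"property"` branch of `get_loss` pulls `task_dim`, the (renamed) `var_name`, and `intensive` off the model and injects them into the loss kwargs. A sketch of that wiring with hypothetical stand-ins for the model and without the real `PropertyLoss` class:

```python
# Hypothetical sketch of the "property" branch of get_loss: the trainer
# queries the model and fills in the loss constructor kwargs.

def build_property_loss_params(loss_params: dict, model) -> dict:
    params = dict(loss_params)  # copy; don't mutate the caller's dict
    params["task_dim"] = model.get_task_dim()
    params["var_name"] = model.get_var_name()  # renamed from property_name
    params["intensive"] = model.get_intensive()
    return params


class FakeModel:
    def get_task_dim(self):
        return 1

    def get_var_name(self):
        return "band_gap"

    def get_intensive(self):
        return True


params = build_property_loss_params({"loss_func": "smooth_mae"}, FakeModel())
print(params["var_name"])  # -> band_gap
```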