Skip to content

Commit

Permalink
FIX TEST Higher tolerance for AdaLoRA in test (#1897)
Browse files Browse the repository at this point in the history
The test is flaky on CI, so this PR increases the tolerance to hopefully
fix the flakines. I cannot reproduce the error locally (neither on GPU
nor CPU), so I'm not 100% sure if this tolerance is enough to make the
test reliable.
  • Loading branch information
BenjaminBossan authored Jul 1, 2024
1 parent 9dc53b8 commit 62122b5
Showing 1 changed file with 3 additions and 0 deletions.
3 changes: 3 additions & 0 deletions tests/testing_common.py
Original file line number Diff line number Diff line change
Expand Up @@ -567,6 +567,9 @@ def _test_merge_layers(self, model_id, config_cls, config_kwargs):
atol, rtol = 1e-4, 1e-4
if self.torch_device in ["mlu"]:
atol, rtol = 1e-3, 1e-3 # MLU
if config.peft_type == "ADALORA":
# AdaLoRA is a bit flaky on CI, but this cannot be reproduced locally
atol, rtol = 1e-3, 1e-3
if (config.peft_type == "IA3") and (model_id == "Conv2d"):
# for some reason, the IA³ Conv2d introduces a larger error
atol, rtol = 0.3, 0.01
Expand Down

0 comments on commit 62122b5

Please sign in to comment.