
Add perplexity unit test to guard against regressions #1

Merged 25 commits from add-test-perplexity into main on Jun 19, 2024

Conversation

LRL-ModelCloud
Contributor

No description provided.

@Qubitium changed the title from "Add test perplexity" to "Add perplexity unit test to guard against regressions" on Jun 19, 2024
@Qubitium merged commit aa6acf0 into main on Jun 19, 2024
2 of 3 checks passed
@Qubitium deleted the add-test-perplexity branch on June 27, 2024 06:16
DeJoker pushed a commit to DeJoker/GPTQModel that referenced this pull request on Jul 19, 2024
* add test_perplexity.py

* assert avg_perplexity < 9

* rename test method name.

* MOD test diff format

* only need pass diff format.

* cleanup code, and fix method name

* add wiki calibration datasets

* return native_ppl

* use save_quantized

* wiki text data min 128 chars

* add comments.

* use GPTQModel

* wiki text filter min chars up to 512

* need gptqmodel

* add comments

* use self.native_ppl

* set desc_act default False

* add marlin format ppl score

* mod native ppl

* Update setup.py

* Update test_perplexity.py

* mod format ppl, and increase the tolerance for PPL difference to 0.6

---------

Co-authored-by: LRL-ModelCloud <[email protected]>
Co-authored-by: Qubitium-ModelCloud <[email protected]>
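
For readers following the commit trail, here is a minimal sketch of the kind of perplexity regression test this PR introduces. It is not the repository's actual test_perplexity.py: the model id (facebook/opt-125m), the dataset slice, and the helper name calculate_avg_perplexity are illustrative assumptions. Only the 512-character wikitext filter, the avg_perplexity < 9 assertion, and the 0.6 native-vs-quantized tolerance are taken from the commit messages above.

```python
# Minimal sketch of a perplexity regression test, per the commit log above.
# NOT the repo's actual test_perplexity.py: model id, dataset slice, and
# helper names are illustrative; the 512-char filter, the < 9 assertion,
# and the 0.6 tolerance come from the commit messages.
import math

import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "facebook/opt-125m"  # hypothetical stand-in for the model under test


def calculate_avg_perplexity(model, tokenizer, texts, max_length=512):
    """Average perplexity over texts: exp of each text's mean token NLL."""
    model.eval()
    ppls = []
    for text in texts:
        enc = tokenizer(text, return_tensors="pt", truncation=True, max_length=max_length)
        input_ids = enc.input_ids.to(model.device)
        with torch.no_grad():
            # Passing labels=input_ids makes the model return the mean
            # cross-entropy loss over the sequence.
            loss = model(input_ids, labels=input_ids).loss
        ppls.append(math.exp(loss.item()))
    return sum(ppls) / len(ppls)


def test_perplexity():
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

    # Mirror the commit log: wikitext rows, keeping only texts of >= 512 chars.
    rows = load_dataset("wikitext", "wikitext-2-raw-v1", split="test")
    texts = [r["text"] for r in rows if len(r["text"]) >= 512][:32]

    native_ppl = calculate_avg_perplexity(model, tokenizer, texts)
    # Threshold is model-dependent; 9 is the value from "assert avg_perplexity < 9".
    assert native_ppl < 9, f"perplexity regressed: {native_ppl:.2f}"

    # The PR's test additionally quantizes the model (save_quantized), reloads
    # it in each supported format (e.g. marlin), and asserts that the format's
    # PPL stays within 0.6 of native_ppl.
```

Pinning both an absolute threshold and a native-vs-quantized delta catches two failure modes at once: an outright quality regression in the base path, and drift introduced by a specific quantized format or kernel.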