Vector.sort should attach limited number of warnings #6283
Comments
Artificially limiting the number of reported warnings to 100. Also added benchmarks with random ints to investigate perf issues when dealing with warnings (future task). Closes #6283.
Hubert Plociniczak reports a new STANDUP for the provided date (2023-05-05): Progress: Addressing comments regarding limiting the number of warnings; reporting overflow via an additional API. Working towards a more generic solution than just… Next Day: Next day I will be working on the #6283 task. Prepare the final change for warnings, continue with the renaming bug.
Artificially limiting the number of reported warnings to 100. Also added benchmarks with random ints to investigate perf issues when dealing with warnings (future task). Ideally we would have a custom set-like collection that allows us to internally specify a maximal number of elements, but `EnsoHashMap` (and potentially `EnsoSet`) are still WIP when it comes to being PE-friendly. The change also allows for checking whether the limit on the number of reported warnings has been reached; this is visualized by appending an additional "Warnings limit reached." entry to the visualization. The limit is configurable via the `--warnings-limit` parameter to `run`. Closes #6283.
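The description above outlines the mechanism: warnings accumulate up to a configurable limit (100 by default), and an overflow flag is exposed so the visualization layer can append the "Warnings limit reached." entry. A minimal Java sketch of that idea, with hypothetical names (the actual engine classes differ):

```java
// Hypothetical sketch of a bounded warning container; BoundedWarningSet
// is an illustrative name, not the engine's actual API.
import java.util.ArrayList;
import java.util.List;

final class BoundedWarningSet {
  private final int limit;              // e.g. 100, or the --warnings-limit value
  private final List<String> warnings = new ArrayList<>();
  private boolean limitReached = false; // surfaced to the visualization layer

  BoundedWarningSet(int limit) {
    this.limit = limit;
  }

  /** Attaches a warning unless the limit has already been reached. */
  void add(String warning) {
    if (warnings.size() < limit) {
      warnings.add(warning);
    } else {
      limitReached = true; // record the overflow instead of storing more
    }
  }

  boolean isLimitReached() {
    return limitReached;
  }

  List<String> getWarnings() {
    return warnings;
  }
}
```

The point of the flag, rather than just silently dropping warnings, is that a consumer can still tell the user that more problems occurred than were recorded.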
Hubert Plociniczak reports a new STANDUP for yesterday (2023-05-09): Progress: More tweaks to the PR. Adding more tests and ensuring things work in the UI. Catching up after a break. It should be finished by 2023-05-09. Next Day: Next day I will be working on the #6283 task. Will probably need a few more hours to address the remaining comments.
Hubert Plociniczak reports a new 🔴 DELAY for yesterday (2023-05-10): Summary: There is a 1 day delay in the implementation. Delay Cause: Addressing comments, more testing to make sure I don't break things.
Hubert Plociniczak reports a new STANDUP for yesterday (2023-05-10): Progress: Final tweaks to the PR to address comments. Extensive testing that took more time. Spent the rest of the day trying to debug Jaroslav's PR #6584, which seems to trigger some nasty caching problems. It should be finished by 2023-05-10. Next Day: Next day I will be working on the #6515 task. Working on renaming. Still trying to deal with #6584.
We will be adding a warning for each pair of incomparable values when sorting.
If I sort a vector of 1M values, each of them incomparable with the others, how many of these warnings will I get?
I'm thinking that we may want to report only the first 3-10 of these and then just a counter: 'and N more values that were incomparable.'
See for example `Additional_Warnings` in the Table library, which implements a similar concept. When doing Table aggregations we report the first 10 issues and then just the count of additional problems. We don't want 1000+ copies of the same problem attached, differing only in the values they mention; that would not add much more information. More importantly, sorting 1M elements that are comparable vs. incomparable shouldn't have vastly different performance characteristics.
I'm not pushing to necessarily do it now, but ideally we should at least create a ticket to ensure that large incomparable vectors are handled well. Ideally we'd add a test that sorts a vector of at least 10k+ values and measures the number of warnings, possibly also a benchmark.
Originally posted by @radeusgd in #5998 (comment)
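A minimal sketch of the capped reporting strategy suggested in the quoted comment: keep the first handful of problems verbatim and only count the rest, so sorting 1M incomparable values attaches a few messages plus a counter instead of a million warnings. The class and method names here are hypothetical and are not the Table library's actual `Additional_Warnings` API:

```java
// Hypothetical "first N issues plus a counter" aggregator, mirroring the
// reporting strategy described in the comment above.
import java.util.ArrayList;
import java.util.List;

final class ProblemAggregator {
  private static final int MAX_REPORTED = 10; // comment suggests 3-10
  private final List<String> reported = new ArrayList<>();
  private long additionalCount = 0;

  /** Records one incomparable pair, keeping only the first few verbatim. */
  void reportIncomparable(Object a, Object b) {
    if (reported.size() < MAX_REPORTED) {
      reported.add("Values " + a + " and " + b + " are incomparable.");
    } else {
      additionalCount++; // count overflow instead of storing it
    }
  }

  /** Returns the retained messages plus a summary line for the overflow. */
  List<String> summarize() {
    List<String> out = new ArrayList<>(reported);
    if (additionalCount > 0) {
      out.add("... and " + additionalCount + " more values that were incomparable.");
    }
    return out;
  }
}
```

With this shape, the memory and work spent on warnings stays constant regardless of input size, which addresses the performance concern about sorting large incomparable vectors.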