Remove numpy<2 constraints #70
base: main
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@           Coverage Diff           @@
##             main      #70   +/-   ##
=======================================
  Coverage   86.89%   86.89%
=======================================
  Files           9        9
  Lines        1343     1343
=======================================
  Hits         1167     1167
  Misses        176      176
```

View full report in Codecov by Sentry.
Yikes. LOL. I would say maybe we wait for 0.5.1 to do this? I think it's quite unlikely that someone needs 2.0 right now, and the 0.5 releases should be much faster now that we are releasing from main, so we can flick this switch on at any time. Alternative idea: add a "numpy<2" box to the plugin updater that can be ticked or unticked?
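The tickable "numpy<2" idea above could amount to appending an extra pin to whatever the plugin manager passes to pip. A minimal sketch; the function name and signature here are hypothetical, not the plugin manager's real API:

```python
def build_pip_args(packages, pin_numpy_below_2=False):
    """Build the argument list for a hypothetical `pip` subprocess call."""
    args = ["install", *packages]
    if pin_numpy_below_2:
        # Listing the pin as an extra requirement forces the resolver to
        # keep NumPy below 2.0 for everything installed in this call.
        args.append("numpy<2")
    return args

print(build_pip_args(["napari-svg"], pin_numpy_below_2=True))
```

The checkbox in the UI would then simply toggle `pin_numpy_below_2`, with no release needed to flip the default.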
I would prefer to release the bundle for 0.5.0 with these constraints. The bundle for 0.5.1 can then go without them.
We don't technically need 0.5.1 to change this. Fine with me to wait here, btw. I marked it as draft for now; we can revisit when the time comes. In the future, these constraints will be handled by
I actually doubt that many plugins are incompatible?
I couldn't find any napari-dependents in conda-forge using |
I'm currently fighting with NumPy 2.0 segfaults (4DNucleome/PartSeg#1145 is failing on this line: https://github.com/4DNucleome/PartSeg/blob/a1f0bc6d653ec8b62f2e8cf1e2a7c887eaef3243/package/PartSegCore/segmentation/segmentation_algorithm.py#L62)
I have also found that napari currently crashes with NumPy 2.0 when creating an Image layer with some parameters (a problem with contrast limits and np.clip). Will open an issue or PR tomorrow.
PyOpenGL_accelerate is compiled against numpy<2, and the resulting error is not obvious.
# References and relevant issues

napari/napari-plugin-manager#70 (comment)

# Description

Without this fix the provided test fails with:

```python
napari/layers/image/_tests/test_image.py:1033 (test_contrast_outside_range)
def test_contrast_outside_range():
    data = np.zeros((64, 64), dtype=np.uint8)
>   Image(data, contrast_limits=(0, 1000))

napari/layers/image/_tests/test_image.py:1037:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
napari/layers/base/base.py:115: in __call__
    obj._post_init()
napari/layers/_scalar_field/scalar_field.py:310: in _post_init
    self.refresh()
napari/layers/base/base.py:1529: in refresh
    self._refresh_sync()
napari/layers/base/base.py:1536: in _refresh_sync
    self._update_thumbnail()
napari/layers/image/image.py:618: in _update_thumbnail
    downsampled = np.clip(downsampled, low, high)
../../.pyenv/versions/napari_3.11/lib/python3.11/site-packages/numpy/_core/fromnumeric.py:2247: in clip
    return _wrapfunc(a, 'clip', a_min, a_max, out=out, **kwargs)
../../.pyenv/versions/napari_3.11/lib/python3.11/site-packages/numpy/_core/fromnumeric.py:57: in _wrapfunc
    return bound(*args, **kwds)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

a = array([[0, 0, 0, ..., 0, 0, 0],
       [0, 0, 0, ..., 0, 0, 0],
       [0, 0, 0, ..., 0, 0, 0],
       ...,
       [0, 0, 0, ..., 0, 0, 0],
       [0, 0, 0, ..., 0, 0, 0],
       [0, 0, 0, ..., 0, 0, 0]], dtype=uint8)
min = 0, max = 1000, out = None, kwargs = {}

    def _clip(a, min=None, max=None, out=None, **kwargs):
        if min is None and max is None:
            raise ValueError("One of max or min must be given")
        if min is None:
            return um.minimum(a, max, out=out, **kwargs)
        elif max is None:
            return um.maximum(a, min, out=out, **kwargs)
        else:
>           return um.clip(a, min, max, out=out, **kwargs)
E           OverflowError: Python integer 1000 out of bounds for uint8
```
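The failure above comes from NumPy 2.0's stricter promotion rules (NEP 50): a Python integer that does not fit in the array's dtype now raises OverflowError inside `np.clip` instead of being silently handled. A minimal sketch of one possible workaround, clamping the limits to the dtype's own range first; this is an illustration, not necessarily the fix napari adopted:

```python
import numpy as np

data = np.zeros((64, 64), dtype=np.uint8)
low, high = 0, 1000  # contrast limits wider than uint8 can represent

# Under NumPy 2, np.clip(data, 0, 1000) raises
# "OverflowError: Python integer 1000 out of bounds for uint8".
# Clamping the limits to the integer dtype's range avoids that:
info = np.iinfo(data.dtype)
low = max(low, info.min)    # stays 0
high = min(high, info.max)  # 1000 -> 255 for uint8

clipped = np.clip(data, low, high)
```

The same code also runs unchanged on NumPy 1.x, so it would work regardless of whether the `numpy<2` constraint is removed.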
Closes #58
This will be impossible to deal with without an update. If a plugin is incompatible with numpy v2 then a repodata patch should be issued, but we can't govern all the possible incompatibilities (a plugin might require v2+, and we won't be able to fix that).
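The repodata-patch mechanism mentioned above works by rewriting the dependency metadata of already-published conda packages, without rebuilding them. A minimal sketch, assuming a record shaped like a conda-forge repodata entry; the function name is hypothetical (the real patches live in the conda-forge-repodata-patches feedstock):

```python
def pin_numpy_below_2(record):
    """Tighten an already-published package record's numpy dependency."""
    record["depends"] = [
        # Append an upper bound to any existing numpy requirement,
        # leaving every other dependency untouched.
        dep + ",<2" if dep.split(" ")[0] == "numpy" else dep
        for dep in record.get("depends", [])
    ]
    return record

record = {"name": "some-plugin", "depends": ["python >=3.9", "numpy >=1.21"]}
print(pin_numpy_below_2(record)["depends"])
```

As the comment notes, this only helps in one direction: a patch can add a `<2` bound to a broken package, but it cannot make a plugin that genuinely requires NumPy 2+ coexist with a bundle-wide `numpy<2` pin.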