
loop over write chunks added for raw encoding in writer #106

Merged
merged 1 commit into from
Dec 31, 2019

Conversation

GFleishman
Contributor

The raw encoding path in the writer did not loop over chunks when writing data. For me, on Python 3.6.7, writing NRRDs with raw encoding larger than ~2GB failed. Implementing this loop and writing in chunks resolved the issue. Some programs, notably in my case ParaView, do not read NRRD files with compression (which is silly), but for the time being I need to write NRRDs with raw encoding.
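For context, the chunked write loop described above can be sketched as follows. This is a simplified illustration, not the actual pynrrd code; the function name, the file-handle parameter, and the 1 GiB chunk size are assumptions for the example.

```python
import numpy as np

# Assumed chunk size for this sketch; the real constant in pynrrd may differ.
_WRITE_CHUNK_SIZE = 2 ** 30  # 1 GiB per write call

def write_raw_data(fh, data):
    # Serialize the array in Fortran order (NRRD stores the fastest axis
    # first), then write the bytes in fixed-size chunks so that no single
    # fh.write() call ever receives a multi-gigabyte buffer.
    raw = data.tobytes(order='F')
    for start in range(0, len(raw), _WRITE_CHUNK_SIZE):
        fh.write(raw[start:start + _WRITE_CHUNK_SIZE])
```

The loop is a no-op change for small arrays (a single write) but caps the size of each individual `write()` call for large ones, which is what avoids the failure reported here.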

@codecov-io

codecov-io commented Dec 24, 2019

Codecov Report

Merging #106 into master will not change coverage.
The diff coverage is 100%.


@@          Coverage Diff          @@
##           master   #106   +/-   ##
=====================================
  Coverage     100%   100%           
=====================================
  Files           6      6           
  Lines         385    391    +6     
  Branches      125    126    +1     
=====================================
+ Hits          385    391    +6
Impacted Files      Coverage Δ
nrrd/writer.py      100% <100%> (ø) ⬆️


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 55706e2...422d7f7. Read the comment docs.

@addisonElliott
Collaborator

Hello, thanks for the PR. Logically it all looks good and seems like a good thing to have. It's good to be consistent between NRRDs with vs. without compressed encoding (i.e. writing in 2GB chunks in both cases).

Just so I'm clear, is writing >4GB buffers just an issue in Python itself? When chunked writing for compressed NRRDs was introduced, it seemed to be circumventing a bug in Python, but I was unclear whether that bug is still present in recent versions of Python.
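The compressed-write path mentioned above applies the same idea: feed the compressor in chunks and write each compressed piece as it is produced. A minimal sketch of that pattern (the function name, chunk size, and choice of zlib here are assumptions for illustration, not pynrrd's actual implementation):

```python
import zlib

# Assumed chunk size for this sketch.
_WRITE_CHUNK_SIZE = 2 ** 30  # 1 GiB

def write_compressed_data(fh, raw):
    # Compress the payload incrementally so that neither the compressor
    # nor any single fh.write() call ever handles the whole buffer at once.
    compressor = zlib.compressobj()
    for start in range(0, len(raw), _WRITE_CHUNK_SIZE):
        chunk = raw[start:start + _WRITE_CHUNK_SIZE]
        fh.write(compressor.compress(chunk))
    # flush() emits whatever the compressor is still holding internally.
    fh.write(compressor.flush())
```

With this in place, chunking the raw-encoding path (as this PR does) makes the two code paths behave consistently for large arrays.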

@GFleishman
Contributor Author

GFleishman commented Dec 30, 2019 via email

@addisonElliott merged commit 27e867b into mhe:master Dec 31, 2019
@addisonElliott
Collaborator

Thanks for the PR!
