I ran the update_metadata script on the giant BCCAQ2 files to rename some metadata attributes. This resulted in an error in the file data. Affected files have normal data for the first few thousand timesteps, but subsequently have a weird data offset, resulting in maps that look like this:
My best guess at the mechanism is that the offset is caused by a failure to correctly move the data further down the file when length is added to the metadata header (longer attribute names?). Perhaps this is because these are netCDF Classic files of size 56 GB, and netCDF Classic is designed for files smaller than 2 GB.
You are, I think, allowed to have netCDF Classic files longer than 2 GB if all but one of the variables fit completely within the first 2 GB, which would be the case here. But that may be a grey area that some libraries don't handle well, or something. Maybe only 2 GB of the data was "scooted down"?
Diagnose the issue, and have update_metadata warn the user or refuse to run if it seems to apply.
The original guess seems to be correct; this issue only appears with netCDF files in classic format that are longer than the 4 GB (note: not 2 GB) limit. I think the right thing to do here is have the script check the file size and netCDF format, and exit with an error message if they are the dangerous combination, along the lines of the sketch below.
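A minimal sketch of that guard, assuming update_metadata is a Python script that can use the netCDF4 library; the function name, the 4 GiB threshold, and the set of "classic" format strings treated as dangerous are illustrative, not the actual implementation:

```python
import os
import sys

from netCDF4 import Dataset

# Illustrative threshold: corruption was observed on classic-format
# files well past the 4 GB mark.
CLASSIC_SIZE_LIMIT = 4 * 1024 ** 3  # 4 GiB


def refuse_large_classic_file(filename):
    """Exit with an error rather than update metadata on a large classic-format file.

    Rewriting the header of a classic-format netCDF file larger than ~4 GB
    has been observed to shift the data and corrupt the file, so bail out
    before touching it.
    """
    size = os.path.getsize(filename)
    with Dataset(filename, "r") as nc:
        fmt = nc.data_model  # e.g. 'NETCDF3_CLASSIC', 'NETCDF4'

    # Treating both classic and 64-bit-offset variants as risky is an
    # assumption; adjust to whatever formats actually exhibit the bug.
    if fmt in ("NETCDF3_CLASSIC", "NETCDF3_64BIT_OFFSET") and size > CLASSIC_SIZE_LIMIT:
        sys.exit(
            "ERROR: {} is a {} file of {:.1f} GB; updating metadata on "
            "classic-format files larger than 4 GB corrupts the data. "
            "Convert it to netCDF-4 first (e.g. nccopy -k netCDF-4).".format(
                filename, fmt, size / 1024 ** 3
            )
        )
```

The check would run before any attributes are renamed, so a dangerous file is never opened for writing.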