The same filename with the same checksum (i.e. MD5) shouldn't be able to appear twice in the same dataset #3571
The same filename with the same checksum (i.e. MD5) shouldn't be able to appear twice in the same dataset. That's my understanding, anyway.

This was tested on v. 4.6, build 62-c913662, on https://demo.dataverse.org

Here's a screenshot:
[screenshot]

Comments
Wow, yes, it looks like this bug is in production v. 4.6. The duplicate check is there; it just appears to be failing, for whatever reason. I'll investigate.
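For context, here is a minimal, self-contained sketch of the kind of filename-plus-MD5 duplicate check being discussed. The class and method names below are illustrative only, not Dataverse's actual implementation:

    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;
    import java.util.HashSet;
    import java.util.Set;

    // Illustrative sketch, not Dataverse code: a dataset-level registry that
    // rejects a file when one with the same name AND the same MD5 is present.
    public class ChecksumDuplicateCheck {

        private final Set<String> seen = new HashSet<>();

        // Returns true if a file with this name and this exact content has
        // already been added, which is the invariant this issue says should hold.
        public boolean isDuplicate(String fileName, byte[] content)
                throws NoSuchAlgorithmException {
            String key = fileName + ":" + md5Hex(content);
            return !seen.add(key); // Set.add() returns false if already present
        }

        private static String md5Hex(byte[] content) throws NoSuchAlgorithmException {
            byte[] digest = MessageDigest.getInstance("MD5").digest(content);
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        }
    }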
…, in a published dataset with no existing draft. (#3571)
The good news is, I believe I have fixed this in the 2290 branch. tl;dr: I accept full responsibility for this one. I was testing this stuff extensively, but I guess I must have been working with existing drafts only(?).
I was also told about a similar issue with file replace: that it was possible to replace a file with a duplicate of a file already in the dataset. That I tested, but COULD NOT reproduce. I tried published files with no drafts and published files with existing drafts; let me know if I'm missing something. Note that if true, it would indeed be a "similar", parallel issue, but not the same thing that I just fixed, because file replace has its own duplicate logic, separate from the code path that regular uploads go through (see the sketch below).
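To illustrate the "separate code path" point, here is a hedged sketch (all names hypothetical, not Dataverse's real classes): regular upload and file replace each consult their own duplicate check, which is why a fix on one path does not automatically cover the other.

    import java.util.Objects;

    // Hypothetical sketch only: names do not correspond to Dataverse's code.
    // Regular upload and file replace each run their own duplicate check,
    // so fixing the check on one path leaves the other path untouched.
    class TwoUploadPathsSketch {

        interface DuplicateCheck {
            boolean isDuplicate(String fileName, String md5);
        }

        private final DuplicateCheck regularUploadCheck; // path fixed in this issue
        private final DuplicateCheck fileReplaceCheck;   // separate, parallel logic

        TwoUploadPathsSketch(DuplicateCheck regular, DuplicateCheck replace) {
            this.regularUploadCheck = Objects.requireNonNull(regular);
            this.fileReplaceCheck = Objects.requireNonNull(replace);
        }

        boolean canAdd(String name, String md5) {
            return !regularUploadCheck.isDuplicate(name, md5);
        }

        boolean canReplace(String name, String md5) {
            return !fileReplaceCheck.isDuplicate(name, md5);
        }
    }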
Since 505fba0 appears in the 2290-file-replace branch, I'm adding the 4.6.1 milestone and moving this into Development at https://waffle.io/IQSS/dataverse
OK, I'm moving this into code review. The actual fix was of the one-line variety. In EditDatafilesPage.java, this:

    if (fm.getId() != null && fm.getDataFile() != null) {

was changed to this:

    if (fm.getDataFile() != null && fm.getDataFile().getId() != null) {
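For anyone reading along, here is a hedged reconstruction of how that condition plausibly sits inside the duplicate check. Only the two if-conditions are quoted from the actual fix; the surrounding method, the stub DataFile/FileMetadata interfaces, and getMd5() are stand-ins, not Dataverse's real code.

    import java.util.HashSet;
    import java.util.List;
    import java.util.Set;

    // Reconstruction sketch: only the if-conditions come from the real fix.
    class DuplicateCheckSketch {

        // Minimal stand-ins for Dataverse's entities, just enough to compile.
        interface DataFile { Long getId(); String getMd5(); }
        interface FileMetadata { Long getId(); DataFile getDataFile(); }

        static Set<String> existingChecksums(List<FileMetadata> fileMetadatas) {
            Set<String> checksums = new HashSet<>();
            for (FileMetadata fm : fileMetadatas) {
                // Before the fix the guard was:
                //   if (fm.getId() != null && fm.getDataFile() != null) {
                // fm.getId() is null for the FileMetadata copies created when
                // a published dataset with no existing draft is edited, so
                // those files were skipped and never compared against the
                // incoming upload, letting duplicates through.
                //
                // After the fix, the id of the underlying DataFile is checked
                // instead; the DataFile persists across versions, so its id
                // is set even when the enclosing FileMetadata is a new copy:
                if (fm.getDataFile() != null && fm.getDataFile().getId() != null) {
                    checksums.add(fm.getDataFile().getMd5());
                }
            }
            return checksums;
        }
    }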
I'm grabbing this issue to do the code review.
OK, looks good, closing.