Merge pull request #11026 from IQSS/11018-update-dataverse-fix
Updates the updateDataverse API endpoint to support cases where an "inherit from parent" configuration is desired
ofahimIQSS authored Nov 26, 2024
2 parents f95c1a0 + b63b1ff commit 1c17c3e
Showing 8 changed files with 99 additions and 16 deletions.
8 changes: 8 additions & 0 deletions doc/release-notes/11018-update-dataverse-endpoint-update.md
@@ -0,0 +1,8 @@
The updateDataverse API endpoint has been updated to support an "inherit from parent" configuration for metadata blocks, facets, and input levels.

Omitting any of these fields in the request JSON has the following effects:

- Omitting ``facetIds`` or ``metadataBlockNames`` causes the Dataverse collection to inherit the corresponding configuration from its parent.
- Omitting ``inputLevels`` removes any existing custom input levels in the Dataverse collection.

Previously, omitting these fields left the existing configuration of the Dataverse collection unchanged.
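For illustration, a minimal update request that relies on the new behavior might look like the following (all field values here are hypothetical; the general shape follows the ``dataverse-complete.json`` example in the API guide):

```json
{
  "name": "Scientific Research",
  "alias": "science",
  "dataverseContacts": [
    { "contactEmail": "pi@example.edu" }
  ],
  "affiliation": "Scientific Research University",
  "description": "We do all the science.",
  "dataverseType": "LABORATORY"
}
```

Because ``metadataBlockNames``, ``facetIds``, and ``inputLevels`` are all omitted, this request makes the collection inherit metadata blocks and facets from its parent and removes any custom input levels.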
14 changes: 11 additions & 3 deletions doc/sphinx-guides/source/api/native-api.rst
@@ -120,9 +120,17 @@ You should expect an HTTP 200 response and JSON beginning with "status":"OK" fol

Same as in :ref:`create-dataverse-api`, the request JSON supports an optional ``metadataBlocks`` object, with the following supported sub-objects:

- - ``metadataBlockNames``: The names of the metadata blocks you want to add to the Dataverse collection.
- - ``inputLevels``: The names of the fields in each metadata block for which you want to add a custom configuration regarding their inclusion or requirement when creating and editing datasets in the new Dataverse collection. Note that if the corresponding metadata blocks names are not specified in the ``metadataBlockNames``` field, they will be added automatically to the Dataverse collection.
- - ``facetIds``: The names of the fields to use as facets for browsing datasets and collections in the new Dataverse collection. Note that the order of the facets is defined by their order in the provided JSON array.
+ - ``metadataBlockNames``: The names of the metadata blocks to be assigned to the Dataverse collection.
+ - ``inputLevels``: The names of the fields in each metadata block for which you want to add a custom configuration regarding their inclusion or requirement when creating and editing datasets in the Dataverse collection. Note that if the corresponding metadata block names are not specified in the ``metadataBlockNames`` field, they will be added automatically to the Dataverse collection.
+ - ``facetIds``: The names of the fields to use as facets for browsing datasets and collections in the Dataverse collection. Note that the order of the facets is defined by their order in the provided JSON array.

Note that setting any of these fields overwrites the previous configuration.

Omitting these fields in the JSON has the following effects:

- Omitting ``facetIds`` or ``metadataBlockNames`` causes the Dataverse collection to inherit the corresponding configuration from its parent.
- Omitting ``inputLevels`` removes any existing custom input levels in the Dataverse collection.
- Omitting the entire ``metadataBlocks`` object from the request JSON omits all three sub-objects, so both of the changes described above apply.

For an example of how these objects are included in the JSON file, download the :download:`dataverse-complete-optional-params.json <../_static/api/dataverse-complete-optional-params.json>` file and modify it to suit your needs.
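As a sketch of the shape described above (the field values are illustrative; the input-level keys ``datasetFieldTypeName``, ``required``, and ``include`` follow the format used in the downloadable example), a ``metadataBlocks`` object combining all three sub-objects could look like:

```json
{
  "metadataBlockNames": ["citation", "geospatial"],
  "inputLevels": [
    {
      "datasetFieldTypeName": "geographicCoverage",
      "required": true,
      "include": true
    }
  ],
  "facetIds": ["authorName", "subject"]
}
```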

4 changes: 4 additions & 0 deletions src/main/java/edu/harvard/iq/dataverse/Dataverse.java
@@ -595,6 +595,10 @@ public void setMetadataBlocks(List<MetadataBlock> metadataBlocks) {
this.metadataBlocks = new ArrayList<>(metadataBlocks);
}

public void clearMetadataBlocks() {
this.metadataBlocks.clear();
}

public List<DatasetFieldType> getCitationDatasetFieldTypes() {
return citationDatasetFieldTypes;
}
2 changes: 1 addition & 1 deletion src/main/java/edu/harvard/iq/dataverse/api/Dataverses.java
@@ -195,7 +195,7 @@ public Response updateDataverse(@Context ContainerRequestContext crc, String bod
List<DatasetFieldType> facets = parseFacets(body);

AuthenticatedUser u = getRequestAuthenticatedUserOrDie(crc);
- dataverse = execCommand(new UpdateDataverseCommand(dataverse, facets, null, createDataverseRequest(u), inputLevels, metadataBlocks, updatedDataverseDTO));
+ dataverse = execCommand(new UpdateDataverseCommand(dataverse, facets, null, createDataverseRequest(u), inputLevels, metadataBlocks, updatedDataverseDTO, true));
return ok(json(dataverse));

} catch (WrappedResponse ww) {
@@ -19,13 +19,15 @@ abstract class AbstractWriteDataverseCommand extends AbstractCommand<Dataverse>
private final List<DataverseFieldTypeInputLevel> inputLevels;
private final List<DatasetFieldType> facets;
protected final List<MetadataBlock> metadataBlocks;
private final boolean resetRelationsOnNullValues;

public AbstractWriteDataverseCommand(Dataverse dataverse,
Dataverse affectedDataverse,
DataverseRequest request,
List<DatasetFieldType> facets,
List<DataverseFieldTypeInputLevel> inputLevels,
- List<MetadataBlock> metadataBlocks) {
+ List<MetadataBlock> metadataBlocks,
+ boolean resetRelationsOnNullValues) {
super(request, affectedDataverse);
this.dataverse = dataverse;
if (facets != null) {
@@ -43,42 +45,60 @@ public AbstractWriteDataverseCommand(Dataverse dataverse,
} else {
this.metadataBlocks = null;
}
this.resetRelationsOnNullValues = resetRelationsOnNullValues;
}

    @Override
    public Dataverse execute(CommandContext ctxt) throws CommandException {
        dataverse = innerExecute(ctxt);

+       processMetadataBlocks();
+       processFacets(ctxt);
+       processInputLevels(ctxt);
+
+       return ctxt.dataverses().save(dataverse);
+   }
+
+   private void processMetadataBlocks() {
        if (metadataBlocks != null && !metadataBlocks.isEmpty()) {
            dataverse.setMetadataBlockRoot(true);
            dataverse.setMetadataBlocks(metadataBlocks);
+       } else if (resetRelationsOnNullValues) {
+           dataverse.setMetadataBlockRoot(false);
+           dataverse.clearMetadataBlocks();
        }
+   }

+   private void processFacets(CommandContext ctxt) {
        if (facets != null) {
            ctxt.facets().deleteFacetsFor(dataverse);

            if (!facets.isEmpty()) {
                dataverse.setFacetRoot(true);
            }

-           int i = 0;
-           for (DatasetFieldType df : facets) {
-               ctxt.facets().create(i++, df, dataverse);
+           for (int i = 0; i < facets.size(); i++) {
+               ctxt.facets().create(i, facets.get(i), dataverse);
            }
+       } else if (resetRelationsOnNullValues) {
+           ctxt.facets().deleteFacetsFor(dataverse);
+           dataverse.setFacetRoot(false);
        }
    }

+   private void processInputLevels(CommandContext ctxt) {
        if (inputLevels != null) {
            if (!inputLevels.isEmpty()) {
                dataverse.addInputLevelsMetadataBlocksIfNotPresent(inputLevels);
            }
            ctxt.fieldTypeInputLevels().deleteFacetsFor(dataverse);
-           for (DataverseFieldTypeInputLevel inputLevel : inputLevels) {
+           inputLevels.forEach(inputLevel -> {
                inputLevel.setDataverse(dataverse);
                ctxt.fieldTypeInputLevels().create(inputLevel);
-           }
+           });
+       } else if (resetRelationsOnNullValues) {
+           ctxt.fieldTypeInputLevels().deleteFacetsFor(dataverse);
        }
-
-       return ctxt.dataverses().save(dataverse);
    }

abstract protected Dataverse innerExecute(CommandContext ctxt) throws IllegalCommandException;
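The division of responsibility introduced by the new constructor flag can be illustrated with a small standalone sketch (a hypothetical class, not part of the codebase): update passes true, so an omitted (null) list resets the collection to inherit from its parent, while create passes false, so a null list is simply left alone.

```java
import java.util.List;

// Illustrative model of the resetRelationsOnNullValues semantics added in
// this PR; names and return strings are invented for demonstration only.
public class ResetSemanticsSketch {

    static String metadataBlockOutcome(List<String> blocks, boolean resetOnNull) {
        if (blocks != null && !blocks.isEmpty()) {
            // Explicit list: the collection becomes a metadata-block root.
            return "set " + blocks + ", metadataBlockRoot=true";
        } else if (resetOnNull) {
            // Update path with the field omitted: clear and inherit from parent.
            return "cleared, inherits from parent";
        }
        // Create path (flag false): a null list changes nothing.
        return "unchanged";
    }

    public static void main(String[] args) {
        System.out.println(metadataBlockOutcome(List.of("citation"), true));
        System.out.println(metadataBlockOutcome(null, true));
        System.out.println(metadataBlockOutcome(null, false));
    }
}
```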
@@ -39,7 +39,7 @@ public CreateDataverseCommand(Dataverse created,
List<DatasetFieldType> facets,
List<DataverseFieldTypeInputLevel> inputLevels,
List<MetadataBlock> metadataBlocks) {
- super(created, created.getOwner(), request, facets, inputLevels, metadataBlocks);
+ super(created, created.getOwner(), request, facets, inputLevels, metadataBlocks, false);
}

@Override
@@ -32,7 +32,7 @@ public UpdateDataverseCommand(Dataverse dataverse,
List<Dataverse> featuredDataverses,
DataverseRequest request,
List<DataverseFieldTypeInputLevel> inputLevels) {
- this(dataverse, facets, featuredDataverses, request, inputLevels, null, null);
+ this(dataverse, facets, featuredDataverses, request, inputLevels, null, null, false);
}

public UpdateDataverseCommand(Dataverse dataverse,
@@ -41,8 +41,9 @@ public UpdateDataverseCommand(Dataverse dataverse,
DataverseRequest request,
List<DataverseFieldTypeInputLevel> inputLevels,
List<MetadataBlock> metadataBlocks,
- DataverseDTO updatedDataverseDTO) {
- super(dataverse, dataverse, request, facets, inputLevels, metadataBlocks);
+ DataverseDTO updatedDataverseDTO,
+ boolean resetRelationsOnNullValues) {
+ super(dataverse, dataverse, request, facets, inputLevels, metadataBlocks, resetRelationsOnNullValues);
if (featuredDataverses != null) {
this.featuredDataverseList = new ArrayList<>(featuredDataverses);
} else {
42 changes: 42 additions & 0 deletions src/test/java/edu/harvard/iq/dataverse/api/DataversesIT.java
@@ -1379,6 +1379,48 @@ public void testUpdateDataverse() {
Response getDataverseResponse = UtilIT.listDataverseFacets(oldDataverseAlias, apiToken);
getDataverseResponse.then().assertThat().statusCode(NOT_FOUND.getStatusCode());

// Update the dataverse without setting metadata blocks, facets, or input levels
updateDataverseResponse = UtilIT.updateDataverse(
newAlias,
newAlias,
newName,
newAffiliation,
newDataverseType,
newContactEmails,
null,
null,
null,
apiToken
);
updateDataverseResponse.then().assertThat().statusCode(OK.getStatusCode());

// Assert that the metadata blocks are inherited from the parent
listMetadataBlocksResponse = UtilIT.listMetadataBlocks(newAlias, false, false, apiToken);
listMetadataBlocksResponse
.then().assertThat()
.statusCode(OK.getStatusCode())
.body("data.size()", equalTo(1))
.body("data[0].name", equalTo("citation"));

// Assert that the facets are inherited from the parent
String[] rootFacetIds = new String[]{"authorName", "subject", "keywordValue", "dateOfDeposit"};
listDataverseFacetsResponse = UtilIT.listDataverseFacets(newAlias, apiToken);
String actualFacetName1 = listDataverseFacetsResponse.then().extract().path("data[0]");
String actualFacetName2 = listDataverseFacetsResponse.then().extract().path("data[1]");
String actualFacetName3 = listDataverseFacetsResponse.then().extract().path("data[2]");
String actualFacetName4 = listDataverseFacetsResponse.then().extract().path("data[3]");
assertThat(rootFacetIds, hasItemInArray(actualFacetName1));
assertThat(rootFacetIds, hasItemInArray(actualFacetName2));
assertThat(rootFacetIds, hasItemInArray(actualFacetName3));
assertThat(rootFacetIds, hasItemInArray(actualFacetName4));

// Assert that the dataverse does not have any input levels
listDataverseInputLevelsResponse = UtilIT.listDataverseInputLevels(newAlias, apiToken);
listDataverseInputLevelsResponse
.then().assertThat()
.statusCode(OK.getStatusCode())
.body("data.size()", equalTo(0));

// Should return error when the dataverse to edit does not exist
updateDataverseResponse = UtilIT.updateDataverse(
"unexistingDataverseAlias",
