Commit

Update from Dev

sekmiller committed Nov 26, 2024
2 parents 51e1ad7 + 1c17c3e commit 5bfc435
Showing 16 changed files with 141 additions and 31 deletions.
1 change: 1 addition & 0 deletions conf/solr/schema.xml
@@ -234,6 +234,7 @@
<field name="datasetValid" type="boolean" stored="true" indexed="true" multiValued="false"/>

<field name="license" type="string" stored="true" indexed="true" multiValued="false"/>
<field name="fileCount" type="plong" stored="true" indexed="true" multiValued="false"/>

<!--
METADATA SCHEMA FIELDS
8 changes: 8 additions & 0 deletions doc/release-notes/11018-update-dataverse-endpoint-update.md
@@ -0,0 +1,8 @@
The updateDataverse API endpoint has been updated to support an "inherit from parent" configuration for metadata blocks, facets, and input levels.

When any of these fields are omitted from the request JSON:

- Omitting ``facetIds`` or ``metadataBlockNames`` causes the Dataverse collection to inherit the corresponding configuration from its parent.
- Omitting ``inputLevels`` removes any existing custom input levels in the Dataverse collection.

Previously, omitting these fields left the existing configuration of the Dataverse collection unchanged.
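
For example, with a sketch like the following (placeholder alias, server URL, and token), omitting all three fields makes the collection inherit metadata blocks and facets from its parent and drops any custom input levels:

```bash
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export SERVER_URL=https://demo.dataverse.org

# The payload deliberately omits "metadataBlocks" (and with it
# metadataBlockNames, facetIds, and inputLevels).
cat > dataverse-update.json <<'EOF'
{
  "alias": "science",
  "name": "Scientific Research",
  "dataverseContacts": [{"contactEmail": "pi@example.edu"}],
  "dataverseType": "DEPARTMENT"
}
EOF

curl -H "X-Dataverse-key:$API_TOKEN" -X PUT "$SERVER_URL/api/dataverses/science" --upload-file dataverse-update.json
```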
15 changes: 15 additions & 0 deletions doc/release-notes/8941-adding-fileCount-in-solr.md
@@ -0,0 +1,15 @@
## Release Highlights

### Adding fileCount as a Solr Field

A new search field, `fileCount`, exposes the number of files in each dataset, allowing datasets to be searched and filtered by file count. (#10598)
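
For example, datasets can be filtered by file count through the Search API using standard Solr range syntax. A sketch (the host and threshold are placeholders):

```bash
# Find datasets with at least 10 files.
# fq=fileCount:[10 TO *] is URL-encoded below.
curl "http://localhost:8080/api/search?q=*&type=dataset&fq=fileCount:%5B10%20TO%20%2A%5D"
```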

## Upgrade Instructions

1. Update your Solr `schema.xml` to include the new field.
For details, please see https://guides.dataverse.org/en/latest/admin/metadatacustomization.html#updating-the-solr-schema

2. Reindex Solr.
Once the `schema.xml` is updated, Solr must be restarted and a reindex initiated.
For details, see https://guides.dataverse.org/en/latest/admin/solr-search-index.html; the reindex command is:
`curl http://localhost:8080/api/admin/index`
14 changes: 11 additions & 3 deletions doc/sphinx-guides/source/api/native-api.rst
@@ -120,9 +120,17 @@ You should expect an HTTP 200 response and JSON beginning with "status":"OK" fol

As in :ref:`create-dataverse-api`, the request JSON supports an optional ``metadataBlocks`` object with the following sub-objects:

- ``metadataBlockNames``: The names of the metadata blocks you want to add to the Dataverse collection.
- ``inputLevels``: The names of the fields in each metadata block for which you want to add a custom configuration regarding their inclusion or requirement when creating and editing datasets in the new Dataverse collection. Note that if the corresponding metadata block names are not specified in the ``metadataBlockNames`` field, they will be added automatically to the Dataverse collection.
- ``facetIds``: The names of the fields to use as facets for browsing datasets and collections in the new Dataverse collection. Note that the order of the facets is defined by their order in the provided JSON array.
- ``metadataBlockNames``: The names of the metadata blocks to be assigned to the Dataverse collection.
- ``inputLevels``: The names of the fields in each metadata block for which you want to add a custom configuration regarding their inclusion or requirement when creating and editing datasets in the Dataverse collection. Note that if the corresponding metadata block names are not specified in the ``metadataBlockNames`` field, they will be added automatically to the Dataverse collection.
- ``facetIds``: The names of the fields to use as facets for browsing datasets and collections in the Dataverse collection. Note that the order of the facets is defined by their order in the provided JSON array.

Note that setting any of these fields overwrites the previous configuration.

When these fields are omitted from the JSON:

- Omitting ``facetIds`` or ``metadataBlockNames`` causes the Dataverse collection to inherit the corresponding configuration from its parent.
- Omitting ``inputLevels`` removes any existing custom input levels in the Dataverse collection.
- Omitting the entire ``metadataBlocks`` object from the request JSON omits all three sub-objects, so both of the behaviors described above apply.
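
As an illustration, a request JSON along the following lines (a sketch with placeholder values; the field names follow the create-dataverse examples) explicitly sets all three sub-objects, replacing the previous configuration:

.. code-block:: json

    {
      "name": "Scientific Research",
      "alias": "science",
      "dataverseContacts": [{"contactEmail": "pi@example.edu"}],
      "dataverseType": "DEPARTMENT",
      "metadataBlocks": {
        "metadataBlockNames": ["citation", "geospatial"],
        "inputLevels": [{"datasetFieldTypeName": "geographicCoverage", "include": true, "required": true}],
        "facetIds": ["authorName", "subject"]
      }
    }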

To obtain a full example of how these objects are included in the JSON file, download the :download:`dataverse-complete-optional-params.json <../_static/api/dataverse-complete-optional-params.json>` file and modify it to suit your needs.

4 changes: 4 additions & 0 deletions src/main/java/edu/harvard/iq/dataverse/Dataverse.java
@@ -595,6 +595,10 @@ public void setMetadataBlocks(List<MetadataBlock> metadataBlocks) {
this.metadataBlocks = new ArrayList<>(metadataBlocks);
}

public void clearMetadataBlocks() {
this.metadataBlocks.clear();
}

public List<DatasetFieldType> getCitationDatasetFieldTypes() {
return citationDatasetFieldTypes;
}
2 changes: 1 addition & 1 deletion src/main/java/edu/harvard/iq/dataverse/api/Dataverses.java
@@ -195,7 +195,7 @@ public Response updateDataverse(@Context ContainerRequestContext crc, String bod
List<DatasetFieldType> facets = parseFacets(body);

AuthenticatedUser u = getRequestAuthenticatedUserOrDie(crc);
dataverse = execCommand(new UpdateDataverseCommand(dataverse, facets, null, createDataverseRequest(u), inputLevels, metadataBlocks, updatedDataverseDTO));
dataverse = execCommand(new UpdateDataverseCommand(dataverse, facets, null, createDataverseRequest(u), inputLevels, metadataBlocks, updatedDataverseDTO, true));
return ok(json(dataverse));

} catch (WrappedResponse ww) {
11 changes: 1 addition & 10 deletions src/main/java/edu/harvard/iq/dataverse/api/Search.java
@@ -175,7 +175,7 @@ public Response search(
JsonArrayBuilder itemsArrayBuilder = Json.createArrayBuilder();
List<SolrSearchResult> solrSearchResults = solrQueryResponse.getSolrSearchResults();
for (SolrSearchResult solrSearchResult : solrSearchResults) {
itemsArrayBuilder.add(solrSearchResult.json(showRelevance, showEntityIds, showApiUrls, metadataFields, getDatasetFileCount(solrSearchResult)));
itemsArrayBuilder.add(solrSearchResult.json(showRelevance, showEntityIds, showApiUrls, metadataFields));
}

JsonObjectBuilder spelling_alternatives = Json.createObjectBuilder();
@@ -229,15 +229,6 @@ public Response search(
}
}

private Long getDatasetFileCount(SolrSearchResult solrSearchResult) {
DvObject dvObject = solrSearchResult.getEntity();
if (dvObject.isInstanceofDataset()) {
DatasetVersion datasetVersion = ((Dataset) dvObject).getVersionFromId(solrSearchResult.getDatasetVersionId());
return datasetVersionFilesServiceBean.getFileMetadataCount(datasetVersion);
}
return null;
}

private User getUser(ContainerRequestContext crc) throws WrappedResponse {
User userToExecuteSearchAs = GuestUser.get();
try {
src/main/java/edu/harvard/iq/dataverse/engine/command/impl/AbstractWriteDataverseCommand.java
@@ -19,13 +19,15 @@ abstract class AbstractWriteDataverseCommand extends AbstractCommand<Dataverse>
private final List<DataverseFieldTypeInputLevel> inputLevels;
private final List<DatasetFieldType> facets;
protected final List<MetadataBlock> metadataBlocks;
private final boolean resetRelationsOnNullValues;

public AbstractWriteDataverseCommand(Dataverse dataverse,
Dataverse affectedDataverse,
DataverseRequest request,
List<DatasetFieldType> facets,
List<DataverseFieldTypeInputLevel> inputLevels,
List<MetadataBlock> metadataBlocks) {
List<MetadataBlock> metadataBlocks,
boolean resetRelationsOnNullValues) {
super(request, affectedDataverse);
this.dataverse = dataverse;
if (facets != null) {
@@ -43,17 +45,31 @@ public AbstractWriteDataverseCommand(Dataverse dataverse,
} else {
this.metadataBlocks = null;
}
this.resetRelationsOnNullValues = resetRelationsOnNullValues;
}

@Override
public Dataverse execute(CommandContext ctxt) throws CommandException {
dataverse = innerExecute(ctxt);

processMetadataBlocks();
processFacets(ctxt);
processInputLevels(ctxt);

return ctxt.dataverses().save(dataverse);
}

private void processMetadataBlocks() {
if (metadataBlocks != null && !metadataBlocks.isEmpty()) {
dataverse.setMetadataBlockRoot(true);
dataverse.setMetadataBlocks(metadataBlocks);
} else if (resetRelationsOnNullValues) {
dataverse.setMetadataBlockRoot(false);
dataverse.clearMetadataBlocks();
}
}

private void processFacets(CommandContext ctxt) {
if (facets != null) {
ctxt.facets().deleteFacetsFor(dataverse);
dataverse.setDataverseFacets(new ArrayList<>());
@@ -62,25 +78,28 @@ public Dataverse execute(CommandContext ctxt) throws CommandException {
dataverse.setFacetRoot(true);
}

int i = 0;
for (DatasetFieldType df : facets) {
ctxt.facets().create(i++, df, dataverse);
for (int i = 0; i < facets.size(); i++) {
ctxt.facets().create(i, facets.get(i), dataverse);
}
} else if (resetRelationsOnNullValues) {
ctxt.facets().deleteFacetsFor(dataverse);
dataverse.setFacetRoot(false);
}
}

private void processInputLevels(CommandContext ctxt) {
if (inputLevels != null) {
if (!inputLevels.isEmpty()) {
dataverse.addInputLevelsMetadataBlocksIfNotPresent(inputLevels);
}
ctxt.fieldTypeInputLevels().deleteFacetsFor(dataverse);
dataverse.setDataverseFieldTypeInputLevels(new ArrayList<>());
for (DataverseFieldTypeInputLevel inputLevel : inputLevels) {
inputLevels.forEach(inputLevel -> {
inputLevel.setDataverse(dataverse);
ctxt.fieldTypeInputLevels().create(inputLevel);
}
});
} else if (resetRelationsOnNullValues) {
ctxt.fieldTypeInputLevels().deleteFacetsFor(dataverse);
}

return ctxt.dataverses().save(dataverse);
}

abstract protected Dataverse innerExecute(CommandContext ctxt) throws IllegalCommandException;
src/main/java/edu/harvard/iq/dataverse/engine/command/impl/CreateDataverseCommand.java
@@ -39,7 +39,7 @@ public CreateDataverseCommand(Dataverse created,
List<DatasetFieldType> facets,
List<DataverseFieldTypeInputLevel> inputLevels,
List<MetadataBlock> metadataBlocks) {
super(created, created.getOwner(), request, facets, inputLevels, metadataBlocks);
super(created, created.getOwner(), request, facets, inputLevels, metadataBlocks, false);
}

@Override
src/main/java/edu/harvard/iq/dataverse/engine/command/impl/UpdateDataverseCommand.java
@@ -32,7 +32,7 @@ public UpdateDataverseCommand(Dataverse dataverse,
List<Dataverse> featuredDataverses,
DataverseRequest request,
List<DataverseFieldTypeInputLevel> inputLevels) {
this(dataverse, facets, featuredDataverses, request, inputLevels, null, null);
this(dataverse, facets, featuredDataverses, request, inputLevels, null, null, false);
}

public UpdateDataverseCommand(Dataverse dataverse,
@@ -41,8 +41,9 @@ public UpdateDataverseCommand(Dataverse dataverse,
DataverseRequest request,
List<DataverseFieldTypeInputLevel> inputLevels,
List<MetadataBlock> metadataBlocks,
DataverseDTO updatedDataverseDTO) {
super(dataverse, dataverse, request, facets, inputLevels, metadataBlocks);
DataverseDTO updatedDataverseDTO,
boolean resetRelationsOnNullValues) {
super(dataverse, dataverse, request, facets, inputLevels, metadataBlocks, resetRelationsOnNullValues);
if (featuredDataverses != null) {
this.featuredDataverseList = new ArrayList<>(featuredDataverses);
} else {
src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java
@@ -135,6 +135,9 @@ public class IndexServiceBean {
@EJB
DatasetFieldServiceBean datasetFieldService;

@Inject
DatasetVersionFilesServiceBean datasetVersionFilesServiceBean;

public static final String solrDocIdentifierDataverse = "dataverse_";
public static final String solrDocIdentifierFile = "datafile_";
public static final String solrDocIdentifierDataset = "dataset_";
@@ -1018,6 +1021,8 @@ public SolrInputDocuments toSolrDocs(IndexableDataset indexableDataset, Set<Long
solrInputDocument.addField(SearchFields.DATASET_CITATION, datasetVersion.getCitation(false));
solrInputDocument.addField(SearchFields.DATASET_CITATION_HTML, datasetVersion.getCitation(true));

solrInputDocument.addField(SearchFields.FILE_COUNT, datasetVersionFilesServiceBean.getFileMetadataCount(datasetVersion));

if (datasetVersion.isInReview()) {
solrInputDocument.addField(SearchFields.PUBLICATION_STATUS, IN_REVIEW_STRING);
}
src/main/java/edu/harvard/iq/dataverse/search/SearchFields.java
@@ -291,5 +291,6 @@ more targeted results for just datasets. The format is YYYY (i.e.
public static final String DATASET_VALID = "datasetValid";

public static final String DATASET_LICENSE = "license";
public static final String FILE_COUNT = "fileCount";

}
src/main/java/edu/harvard/iq/dataverse/search/SearchServiceBean.java
@@ -497,7 +497,8 @@ public SolrQueryResponse search(
Long retentionEndDate = (Long) solrDocument.getFieldValue(SearchFields.RETENTION_END_DATE);
//
Boolean datasetValid = (Boolean) solrDocument.getFieldValue(SearchFields.DATASET_VALID);

Long fileCount = (Long) solrDocument.getFieldValue(SearchFields.FILE_COUNT);

List<String> matchedFields = new ArrayList<>();

SolrSearchResult solrSearchResult = new SolrSearchResult(query, name);
@@ -570,6 +571,7 @@ public SolrQueryResponse search(
solrSearchResult.setDeaccessionReason(deaccessionReason);
solrSearchResult.setDvTree(dvTree);
solrSearchResult.setDatasetValid(datasetValid);
solrSearchResult.setFileCount(fileCount);

if (Boolean.TRUE.equals((Boolean) solrDocument.getFieldValue(SearchFields.IS_HARVESTED))) {
solrSearchResult.setHarvested(true);
src/main/java/edu/harvard/iq/dataverse/search/SolrSearchResult.java
@@ -78,6 +78,10 @@ public class SolrSearchResult {
private String citation;
private String citationHtml;
private String datasetType;
/**
* Only Dataset can have a file count
*/
private Long fileCount;
/**
* Files and datasets might have a UNF. Dataverses don't.
*/
@@ -456,10 +460,10 @@ public JsonObjectBuilder getJsonForMyData(boolean isValid) {
} // getJsonForMydata

public JsonObjectBuilder json(boolean showRelevance, boolean showEntityIds, boolean showApiUrls) {
return json(showRelevance, showEntityIds, showApiUrls, null, null);
return json(showRelevance, showEntityIds, showApiUrls, null);
}

public JsonObjectBuilder json(boolean showRelevance, boolean showEntityIds, boolean showApiUrls, List<String> metadataFields, Long datasetFileCount) {
public JsonObjectBuilder json(boolean showRelevance, boolean showEntityIds, boolean showApiUrls, List<String> metadataFields) {
if (this.type == null) {
return jsonObjectBuilder();
}
@@ -597,7 +601,7 @@ public JsonObjectBuilder json(boolean showRelevance, boolean showEntityIds, bool
subjects.add(subject);
}
nullSafeJsonBuilder.add("subjects", subjects);
nullSafeJsonBuilder.add("fileCount", datasetFileCount);
nullSafeJsonBuilder.add("fileCount", this.fileCount);
nullSafeJsonBuilder.add("versionId", dv.getId());
nullSafeJsonBuilder.add("versionState", dv.getVersionState().toString());
if (this.isPublishedState()) {
@@ -1348,4 +1352,12 @@ public boolean isValid(Predicate<SolrSearchResult> canUpdateDataset) {
}
return !canUpdateDataset.test(this);
}

public Long getFileCount() {
return fileCount;
}

public void setFileCount(Long fileCount) {
this.fileCount = fileCount;
}
}
42 changes: 42 additions & 0 deletions src/test/java/edu/harvard/iq/dataverse/api/DataversesIT.java
@@ -1379,6 +1379,48 @@ public void testUpdateDataverse() {
Response getDataverseResponse = UtilIT.listDataverseFacets(oldDataverseAlias, apiToken);
getDataverseResponse.then().assertThat().statusCode(NOT_FOUND.getStatusCode());

// Update the dataverse without setting metadata blocks, facets, or input levels
updateDataverseResponse = UtilIT.updateDataverse(
newAlias,
newAlias,
newName,
newAffiliation,
newDataverseType,
newContactEmails,
null,
null,
null,
apiToken
);
updateDataverseResponse.then().assertThat().statusCode(OK.getStatusCode());

// Assert that the metadata blocks are inherited from the parent
listMetadataBlocksResponse = UtilIT.listMetadataBlocks(newAlias, false, false, apiToken);
listMetadataBlocksResponse
.then().assertThat()
.statusCode(OK.getStatusCode())
.body("data.size()", equalTo(1))
.body("data[0].name", equalTo("citation"));

// Assert that the facets are inherited from the parent
String[] rootFacetIds = new String[]{"authorName", "subject", "keywordValue", "dateOfDeposit"};
listDataverseFacetsResponse = UtilIT.listDataverseFacets(newAlias, apiToken);
String actualFacetName1 = listDataverseFacetsResponse.then().extract().path("data[0]");
String actualFacetName2 = listDataverseFacetsResponse.then().extract().path("data[1]");
String actualFacetName3 = listDataverseFacetsResponse.then().extract().path("data[2]");
String actualFacetName4 = listDataverseFacetsResponse.then().extract().path("data[3]");
assertThat(rootFacetIds, hasItemInArray(actualFacetName1));
assertThat(rootFacetIds, hasItemInArray(actualFacetName2));
assertThat(rootFacetIds, hasItemInArray(actualFacetName3));
assertThat(rootFacetIds, hasItemInArray(actualFacetName4));

// Assert that the dataverse does not have any input levels
listDataverseInputLevelsResponse = UtilIT.listDataverseInputLevels(newAlias, apiToken);
listDataverseInputLevelsResponse
.then().assertThat()
.statusCode(OK.getStatusCode())
.body("data.size()", equalTo(0));

// Should return error when the dataverse to edit does not exist
updateDataverseResponse = UtilIT.updateDataverse(
"unexistingDataverseAlias",
src/test/java/edu/harvard/iq/dataverse/search/IndexServiceBeanTest.java
@@ -53,6 +53,7 @@ public void setUp() {
indexService.dataverseService = Mockito.mock(DataverseServiceBean.class);
indexService.datasetFieldService = Mockito.mock(DatasetFieldServiceBean.class);
indexService.datasetVersionService = Mockito.mock(DatasetVersionServiceBean.class);
indexService.datasetVersionFilesServiceBean = Mockito.mock(DatasetVersionFilesServiceBean.class);
BrandingUtil.injectServices(indexService.dataverseService, indexService.settingsService);

Mockito.when(indexService.dataverseService.findRootDataverse()).thenReturn(dataverse);