Merge tag 'v6.3' into mpdl-develop
haarli committed Jul 4, 2024
2 parents d257906 + 8c99a74 commit f19b06d
Showing 244 changed files with 7,384 additions and 2,237 deletions.
101 changes: 101 additions & 0 deletions .github/workflows/maven_cache_management.yml
@@ -0,0 +1,101 @@
name: Maven Cache Management

on:
  # Every push to develop should trigger cache rejuvenation (dependencies might have changed)
  push:
    branches:
      - develop
  # According to https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#usage-limits-and-eviction-policy
  # all caches are deleted after 7 days of no access. Make sure we rejuvenate every 7 days to keep it available.
  schedule:
    - cron: '23 2 * * 0' # Run for 'develop' every Sunday at 02:23 UTC (3:23 CET, 21:23 ET)
  # Enable manual cache management
  workflow_dispatch:
  # Delete branch caches once a PR is merged
  pull_request:
    types:
      - closed

env:
  COMMON_CACHE_KEY: "dataverse-maven-cache"
  COMMON_CACHE_PATH: "~/.m2/repository"

jobs:
  seed:
    name: Drop and Re-Seed Local Repository
    runs-on: ubuntu-latest
    if: ${{ github.event_name != 'pull_request' }}
    permissions:
      # Write permission needed to delete caches
      # See also: https://docs.github.com/en/rest/actions/cache?apiVersion=2022-11-28#delete-a-github-actions-cache-for-a-repository-using-a-cache-id
      actions: write
      contents: read
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Determine Java version from Parent POM
        run: echo "JAVA_VERSION=$(grep '<target.java.version>' modules/dataverse-parent/pom.xml | cut -f2 -d'>' | cut -f1 -d'<')" >> ${GITHUB_ENV}
      - name: Set up JDK ${{ env.JAVA_VERSION }}
        uses: actions/setup-java@v4
        with:
          java-version: ${{ env.JAVA_VERSION }}
          distribution: temurin
      - name: Seed common cache
        run: |
          mvn -B -f modules/dataverse-parent dependency:go-offline dependency:resolve-plugins
      # This non-obvious order is due to the fact that the download via Maven above will take a very long time (7-8 min).
      # Jobs should not be left without a cache. Deleting and saving in one go leaves only a small chance for a cache miss.
      - name: Drop common cache
        run: |
          gh extension install actions/gh-actions-cache
          echo "🛒 Fetching list of cache keys"
          cacheKeys=$(gh actions-cache list -R ${{ github.repository }} -B develop | cut -f 1 )
          ## Setting this to not fail the workflow while deleting cache keys.
          set +e
          echo "🗑️ Deleting caches..."
          for cacheKey in $cacheKeys
          do
            gh actions-cache delete $cacheKey -R ${{ github.repository }} -B develop --confirm
          done
          echo "✅ Done"
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
      - name: Save the common cache
        uses: actions/cache@v4
        with:
          path: ${{ env.COMMON_CACHE_PATH }}
          key: ${{ env.COMMON_CACHE_KEY }}
          enableCrossOsArchive: true

  # Let's delete feature branch caches once their PR is merged - we only have 10 GB of space before eviction kicks in
  deplete:
    name: Deplete feature branch caches
    runs-on: ubuntu-latest
    if: ${{ github.event_name == 'pull_request' }}
    permissions:
      # `actions:write` permission is required to delete caches
      # See also: https://docs.github.com/en/rest/actions/cache?apiVersion=2022-11-28#delete-a-github-actions-cache-for-a-repository-using-a-cache-id
      actions: write
      contents: read
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Cleanup caches
        run: |
          gh extension install actions/gh-actions-cache
          BRANCH=refs/pull/${{ github.event.pull_request.number }}/merge
          echo "🛒 Fetching list of cache keys"
          cacheKeysForPR=$(gh actions-cache list -R ${{ github.repository }} -B $BRANCH | cut -f 1 )
          ## Setting this to not fail the workflow while deleting cache keys.
          set +e
          echo "🗑️ Deleting caches..."
          for cacheKey in $cacheKeysForPR
          do
            gh actions-cache delete $cacheKey -R ${{ github.repository }} -B $BRANCH --confirm
          done
          echo "✅ Done"
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
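
The seed job only populates the shared cache; it builds nothing itself. Other workflows are expected to restore the same key before invoking Maven. Below is a minimal sketch of such a restore step in a hypothetical consumer workflow (the job name and build command are illustrative, not part of this commit); it assumes the standard `actions/cache/restore` action and reuses the `COMMON_CACHE_KEY`/`COMMON_CACHE_PATH` values defined above.

```
# Hypothetical consumer workflow: restore the shared Maven cache before building.
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Restore common Maven cache
        uses: actions/cache/restore@v4
        with:
          path: ~/.m2/repository        # same path the seed job saves (COMMON_CACHE_PATH)
          key: dataverse-maven-cache    # same key the seed job saves (COMMON_CACHE_KEY)
      - name: Build with Maven
        run: mvn -B package
```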
1 change: 1 addition & 0 deletions .gitignore
@@ -34,6 +34,7 @@ oauth-credentials.md
/src/main/webapp/oauth2/newAccount.html
scripts/api/setup-all.sh*
scripts/api/setup-all.*.log
src/main/resources/edu/harvard/iq/dataverse/openapi/

# ctags generated tag file
tags
66 changes: 3 additions & 63 deletions CONTRIBUTING.md
@@ -1,67 +1,7 @@
# Contributing to Dataverse

Thank you for your interest in contributing to Dataverse! We are open to contributions from everyone. You don't need permission to participate. Just jump in. If you have questions, please reach out using one or more of the channels described below.
Thank you for your interest in contributing to Dataverse! We are open to contributions from everyone.

We aren't just looking for developers. There are many ways to contribute to Dataverse. We welcome contributions of ideas, bug reports, usability research/feedback, documentation, code, and more!
Please see our [Contributor Guide][] for how you can help!

## Ideas/Feature Requests

Your idea or feature request might already be captured in the Dataverse [issue tracker] on GitHub but if not, the best way to bring it to the community's attention is by posting on the [dataverse-community Google Group][] or bringing it up on a [Community Call][]. You're also welcome to make some noise in [chat.dataverse.org][] or cram your idea into 280 characters and mention [@dataverseorg][] on Twitter. To discuss your idea privately, please email it to [email protected]

There's a chance your idea is already on our roadmap, which is available at https://www.iq.harvard.edu/roadmap-dataverse-project

[chat.dataverse.org]: http://chat.dataverse.org
[issue tracker]: https://github.com/IQSS/dataverse/issues
[@dataverseorg]: https://twitter.com/dataverseorg

## Usability testing

Please email us at [email protected] if you are interested in participating in usability testing.

## Bug Reports/Issues

An issue is a bug (a feature is no longer behaving the way it should) or a feature (something new to Dataverse that helps users complete tasks). You can browse the Dataverse [issue tracker] on GitHub by open or closed issues or by milestones.

Before submitting an issue, please search the existing issues by using the search bar at the top of the page. If there is an existing open issue that matches the issue you want to report, please add a comment to it.

If there is no pre-existing issue or it has been closed, please click on the "New Issue" button, log in, and write in what the issue is (unless it is a security issue which should be reported privately to [email protected]).

If you do not receive a reply to your new issue or comment in a timely manner, please email [email protected] with a link to the issue.

### Writing an Issue

For the subject of an issue, please start it by writing the feature or functionality it relates to, i.e. "Create Account:..." or "Dataset Page:...". In the body of the issue, please outline the issue you are reporting with as much detail as possible. In order for the Dataverse development team to best respond to the issue, we need as much information about the issue as you can provide. Include steps to reproduce bugs. Indicate which version you're using, which is shown at the bottom of the page. We love screenshots!

### Issue Attachments

You can attach certain files (images, screenshots, logs, etc.) by dragging and dropping, selecting them, or pasting from the clipboard. Files must be one of GitHub's [supported attachment formats] such as png, gif, jpg, txt, pdf, zip, etc. (Pro tip: A file ending in .log can be renamed to .txt so you can upload it.) If there's no easy way to attach your file, please include a URL that points to the file in question.

[supported attachment formats]: https://help.github.com/articles/file-attachments-on-issues-and-pull-requests/

## Documentation

The source for the documentation at http://guides.dataverse.org/en/latest/ is in the GitHub repo under the "[doc][]" folder. If you find a typo or inaccuracy or something to clarify, please send us a pull request! For more on the tools used to write docs, please see the [documentation][] section of the Developer Guide.

[doc]: https://github.com/IQSS/dataverse/tree/develop/doc/sphinx-guides/source
[documentation]: http://guides.dataverse.org/en/latest/developers/documentation.html

## Code/Pull Requests

We love code contributions. Developers are not limited to the main Dataverse code in this git repo. You can help with API client libraries in your favorite language that are mentioned in the [API Guide][] or create a new library. You can help work on configuration management code that's mentioned in the [Installation Guide][]. The Installation Guide also covers a relatively new concept called "external tools" that allows developers to create their own tools that are available from within an installation of Dataverse.

[API Guide]: http://guides.dataverse.org/en/latest/api
[Installation Guide]: http://guides.dataverse.org/en/latest/installation

If you are interested in working on the main Dataverse code, great! Before you start coding, please reach out to us either on the [dataverse-community Google Group][], the [dataverse-dev Google Group][], [chat.dataverse.org][], or via [email protected] to make sure the effort is well coordinated and we avoid merge conflicts. We maintain a list of [community contributors][] and [dev efforts][] the community is working on so please let us know if you'd like to be added or removed from either list.

Please read http://guides.dataverse.org/en/latest/developers/version-control.html to understand how we use the "git flow" model of development and how we will encourage you to create a GitHub issue (if it doesn't exist already) to associate with your pull request. That page also includes tips on making a pull request.

After making your pull request, your goal should be to help it advance through our kanban board at https://github.com/orgs/IQSS/projects/34 . If no one has moved your pull request to the code review column in a timely manner, please reach out. Note that once a pull request is created for an issue, we'll remove the issue from the board so that we only track one card (the pull request).

Thanks for your contribution!

[dataverse-community Google Group]: https://groups.google.com/group/dataverse-community
[Community Call]: https://dataverse.org/community-calls
[dataverse-dev Google Group]: https://groups.google.com/group/dataverse-dev
[community contributors]: https://docs.google.com/spreadsheets/d/1o9DD-MQ0WkrYaEFTD5rF_NtyL8aUISgURsAXSL7Budk/edit?usp=sharing
[dev efforts]: https://github.com/orgs/IQSS/projects/34/views/6
[Contributor Guide]: https://guides.dataverse.org/en/latest/contributor/index.html
1 change: 0 additions & 1 deletion Dockerfile

This file was deleted.

3 changes: 2 additions & 1 deletion README.md
@@ -7,7 +7,7 @@ Dataverse is an [open source][] software platform for sharing, finding, citing,

We maintain a demo site at [demo.dataverse.org][] which you are welcome to use for testing and evaluating Dataverse.

To install Dataverse, please see our [Installation Guide][] which will prompt you to download our [latest release][].
To install Dataverse, please see our [Installation Guide][] which will prompt you to download our [latest release][]. Docker users should consult the [Container Guide][].

To discuss Dataverse with the community, please join our [mailing list][], participate in a [community call][], chat with us at [chat.dataverse.org][], or attend our annual [Dataverse Community Meeting][].

@@ -28,6 +28,7 @@ Dataverse is a trademark of President and Fellows of Harvard College and is regi
[Dataverse community]: https://dataverse.org/developers
[Installation Guide]: https://guides.dataverse.org/en/latest/installation/index.html
[latest release]: https://github.com/IQSS/dataverse/releases
[Container Guide]: https://guides.dataverse.org/en/latest/container/index.html
[features]: https://dataverse.org/software-features
[project board]: https://github.com/orgs/IQSS/projects/34
[roadmap]: https://www.iq.harvard.edu/roadmap-dataverse-project
File renamed without changes.
5 changes: 4 additions & 1 deletion conf/solr/9.3.0/schema.xml → conf/solr/schema.xml
@@ -157,7 +157,8 @@
<field name="publicationStatus" type="string" stored="true" indexed="true" multiValued="true"/>
<field name="externalStatus" type="string" stored="true" indexed="true" multiValued="false"/>
<field name="embargoEndDate" type="plong" stored="true" indexed="true" multiValued="false"/>

<field name="retentionEndDate" type="plong" stored="true" indexed="true" multiValued="false"/>

<field name="subtreePaths" type="string" stored="true" indexed="true" multiValued="true"/>

<field name="fileName" type="text_en" stored="true" indexed="true" multiValued="true"/>
@@ -340,6 +341,7 @@
<field name="journalVolumeIssue" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="keyword" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="keywordValue" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="keywordTermURI" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="keywordVocabulary" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="keywordVocabularyURI" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="kindOfData" type="text_en" multiValued="true" stored="true" indexed="true"/>
@@ -591,6 +593,7 @@
<copyField source="journalVolumeIssue" dest="_text_" maxChars="3000"/>
<copyField source="keyword" dest="_text_" maxChars="3000"/>
<copyField source="keywordValue" dest="_text_" maxChars="3000"/>
<copyField source="keywordTermURI" dest="_text_" maxChars="3000"/>
<copyField source="keywordVocabulary" dest="_text_" maxChars="3000"/>
<copyField source="keywordVocabularyURI" dest="_text_" maxChars="3000"/>
<copyField source="kindOfData" dest="_text_" maxChars="3000"/>
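
After the renamed `conf/solr/schema.xml` is deployed and Solr restarted, the presence of the new fields can be verified through Solr's Schema API. The commands below are a suggested sanity check, not part of this commit; they assume the default Dataverse core name `collection1` and Solr listening on localhost:8983.

```
curl "http://localhost:8983/solr/collection1/schema/fields/retentionEndDate"
curl "http://localhost:8983/solr/collection1/schema/fields/keywordTermURI"
```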
4 changes: 2 additions & 2 deletions conf/solr/9.3.0/solrconfig.xml → conf/solr/solrconfig.xml
@@ -290,7 +290,7 @@
have some sort of hard autoCommit to limit the log size.
-->
<autoCommit>
<maxTime>${solr.autoCommit.maxTime:15000}</maxTime>
<maxTime>${solr.autoCommit.maxTime:30000}</maxTime>
<openSearcher>false</openSearcher>
</autoCommit>

@@ -301,7 +301,7 @@
-->

<autoSoftCommit>
<maxTime>${solr.autoSoftCommit.maxTime:-1}</maxTime>
<maxTime>${solr.autoSoftCommit.maxTime:1000}</maxTime>
</autoSoftCommit>

<!-- Update Related Event Listeners
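
Both commit settings use Solr's `${property:default}` substitution, so the new defaults (30 s hard commit, 1 s soft commit) can still be overridden at startup without editing `solrconfig.xml`. A minimal sketch, assuming a standard Solr service that sources `solr.in.sh` (the path varies by installation):

```
# e.g. in /etc/default/solr.in.sh (restores the previous commit timing at runtime)
SOLR_OPTS="$SOLR_OPTS -Dsolr.autoCommit.maxTime=15000 -Dsolr.autoSoftCommit.maxTime=-1"
```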
File renamed without changes.
12 changes: 8 additions & 4 deletions doc/release-notes/6.2-release-notes.md
@@ -417,12 +417,16 @@ In the following commands we assume that Payara 6 is installed in `/usr/local/pa

As noted above, deployment of the war file might take several minutes due to a database migration script required for the new storage quotas feature.

6\. Restart Payara
6\. For installations with internationalization:

- Please remember to update translations via [Dataverse language packs](https://github.com/GlobalDataverseCommunityConsortium/dataverse-language-packs).

7\. Restart Payara

- `service payara stop`
- `service payara start`

7\. Update the following Metadata Blocks to reflect the incremental improvements made to the handling of core metadata fields:
8\. Update the following Metadata Blocks to reflect the incremental improvements made to the handling of core metadata fields:

```
wget https://github.com/IQSS/dataverse/releases/download/v6.2/geospatial.tsv
@@ -442,7 +446,7 @@
wget https://github.com/IQSS/dataverse/releases/download/v6.2/biomedical.tsv
curl http://localhost:8080/api/admin/datasetfield/load -H "Content-type: text/tab-separated-values" -X POST --upload-file scripts/api/data/metadatablocks/biomedical.tsv
```
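
As an optional check (not part of the official upgrade steps), the Native API can list the metadata blocks known to the installation, which is a quick way to confirm the reloaded TSVs were applied; this assumes Dataverse is reachable on localhost:8080.

```
curl http://localhost:8080/api/metadatablocks
curl http://localhost:8080/api/metadatablocks/citation
```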

8\. For installations with custom or experimental metadata blocks:
9\. For installations with custom or experimental metadata blocks:

- Stop Solr instance (usually `service solr stop`, depending on Solr installation/OS, see the [Installation Guide](https://guides.dataverse.org/en/6.2/installation/prerequisites.html#solr-init-script))

Expand All @@ -455,7 +459,7 @@ curl http://localhost:8080/api/admin/datasetfield/load -H "Content-type: text/ta
- Restart Solr instance (usually `service solr restart` depending on solr/OS)

9\. Reindex Solr:
10\. Reindex Solr:

For details, see https://guides.dataverse.org/en/6.2/admin/solr-search-index.html but here is the reindex command:
