diff --git a/doc/release-notes/5.13-release-notes.md b/doc/release-notes/5.13-release-notes.md
new file mode 100644
index 00000000000..0463b7d18a3
--- /dev/null
+++ b/doc/release-notes/5.13-release-notes.md
@@ -0,0 +1,262 @@
+# Dataverse Software 5.13
+
+This release brings new features, enhancements, and bug fixes to the Dataverse software. Thank you to all of the community members who contributed code, suggestions, bug reports, and other assistance across the project.
+
+## Release Highlights
+
+### Schema.org Improvements (Some Backward Incompatibility)
+
+The Schema.org metadata used as an export format and also embedded in dataset pages has been updated to improve compliance with Schema.org's schema and Google's recommendations for Google Dataset Search.
+
+Please be advised that these improvements may break integrations that rely on the old, less compliant structure. For details, see the "Backward Incompatibilities" section below. (Issue #7349)
+
+### Folder Uploads via Web UI (dvwebloader, S3 only)
+
+For installations using S3 for storage with direct upload enabled, a new tool called [DVWebloader](https://github.com/gdcc/dvwebloader) can be enabled. It allows web users to upload a folder with a hierarchy of files and subfolders while retaining the relative paths of the files (similar to how the DVUploader tool does on the command line, but with the convenience of the browser UI). See [Folder Upload](https://guides.dataverse.org/en/5.13/user/dataset-management.html#folder-upload) in the User Guide for details. (PR #9096)
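+
+Enabling the tool involves a database setting. As a rough sketch (assuming the setting is named `:WebloaderUrl` and that you want to point it at the GDCC-hosted copy of the tool; see the linked Folder Upload documentation for the authoritative instructions):
+
+```
+# Hypothetical example: point the :WebloaderUrl setting at a hosted copy of DVWebloader.
+curl -X PUT -d 'https://gdcc.github.io/dvwebloader/src/dvwebloader.html' \
+  http://localhost:8080/api/admin/settings/:WebloaderUrl
+```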
+
+### Long Descriptions of Collections (Dataverses) are Now Truncated
+
+Like datasets, long descriptions of collections (dataverses) are now truncated by default but can be expanded with a "read full description" button. (PR #9222)
+
+### License Sorting
+
+Licenses shown in the dropdown in the UI can now be sorted by superusers. See the [Sorting Licenses](https://guides.dataverse.org/en/5.13/installation/config.html#sorting-licenses) section of the Installation Guide for details. (PR #8697)
+
+### Metadata Field Production Location Now Repeatable, Facetable, and Enabled for Advanced Search
+
+Depositors can now click the plus sign to enter multiple instances of the metadata field "Production Location" in the citation metadata block. Additionally, this field now appears on the Advanced Search page and can be added to the list of search facets. (PR #9254)
+
+### Support for NetCDF and HDF5 Files
+
+NetCDF and HDF5 files are now detected based on their content rather than just their file extension. Both "classic" NetCDF 3 files and more modern NetCDF 4 files are detected based on content. Detection for older HDF4 files is only done through the file extension ".hdf", as before.
+
+For NetCDF and HDF5 files, an attempt will be made to extract metadata in NcML (XML) format and save it as an auxiliary file. There is a new NcML previewer available in the [dataverse-previewers](https://github.com/gdcc/dataverse-previewers) repo.
+
+An [extractNcml](https://guides.dataverse.org/en/5.13/api/native-api.html#extract-ncml) API endpoint has been added, primarily for installations with existing NetCDF and HDF5 files. After upgrading, administrators can iterate through these files and try to extract an NcML file from each, as in the sketch below.
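+
+For example, a sketch of calling the new endpoint for a single file, assuming it takes a file database ID like other file-level endpoints (see the linked API documentation for the exact form):
+
+```
+export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
+export FILE_ID=42
+
+# Attempt NcML extraction for one file; repeat over existing NetCDF/HDF5 file IDs.
+curl -H "X-Dataverse-key:$API_TOKEN" -X POST \
+  "http://localhost:8080/api/files/$FILE_ID/extractNcml"
+```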
+
+See the [NetCDF and HDF5](https://guides.dataverse.org/en/5.13/user/dataset-management.html#netcdf-and-hdf5) section of the User Guide for details. (PR #9239)
+
+### Support for .eln Files (Electronic Laboratory Notebooks)
+
+The [.eln file format](https://github.com/TheELNConsortium/TheELNFileFormat) is used by Electronic Laboratory Notebooks as an exchange format for experimental protocols, results, sample descriptions, etc. Dataverse now detects .eln files and assigns them the file type "ELN Archive" (see the optional upgrade step below for handling existing .eln files).
+
+### Improved Security for External Tools
+
+External tools can now be configured to use signed URLs to access the Dataverse API as an alternative to API tokens. This eliminates the need for tools to have access to the user's API token in order to access draft or restricted datasets and datafiles. Signed URLs can be transferred via POST or via a callback when triggering a tool via GET. See [Authorization Options](https://guides.dataverse.org/en/5.13/api/external-tools.html#authorization-options) in the External Tools documentation for details. (PR #9001)
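+
+As an illustration, a sketch of an external tool manifest that requests signed URLs via the new `allowedApiCalls` section (the manifest below is hypothetical and the exact field names should be checked against the linked Authorization Options documentation):
+
+```
+# Register a hypothetical file-level explore tool whose manifest asks Dataverse
+# to pass a signed URL for one API call instead of the user's API token.
+cat > fabulousFileTool.json <<'EOF'
+{
+  "displayName": "Fabulous File Tool",
+  "description": "A hypothetical tool used only to illustrate allowedApiCalls.",
+  "scope": "file",
+  "types": ["explore"],
+  "toolUrl": "https://example.com/fabulous-tool",
+  "toolParameters": {
+    "queryParameters": [
+      {"fileId": "{fileId}"}
+    ]
+  },
+  "allowedApiCalls": [
+    {
+      "name": "retrieveDataFile",
+      "httpMethod": "GET",
+      "urlTemplate": "/api/v1/access/datafile/{fileId}",
+      "timeOut": 270
+    }
+  ]
+}
+EOF
+
+curl -X POST -H 'Content-Type: application/json' \
+  --upload-file fabulousFileTool.json http://localhost:8080/api/admin/externalTools
+```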
+
+### Geospatial Search (API Only)
+
+Geospatial search is supported via the Search API using two new [parameters](https://guides.dataverse.org/en/5.13/api/search.html#parameters): `geo_point` and `geo_radius`.
+
+The fields that are geospatially indexed are "West Longitude", "East Longitude", "North Latitude", and "South Latitude" from the "Geographic Bounding Box" field in the geospatial metadata block. (PR #8239)
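+
+For example, a sketch of a Search API call that finds datasets within 10 kilometers of a point (this assumes `geo_point` is given as a latitude,longitude pair and `geo_radius` in kilometers; see the linked parameter documentation for the exact format):
+
+```
+# Datasets whose indexed geographic bounding box is within 10 km of the given point.
+curl "http://localhost:8080/api/search?q=*&type=dataset&geo_point=42.3,-71.1&geo_radius=10"
+```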
+
+### Reproducibility and Code Execution with Binder
+
+Binder has been added to the list of external tools that can be added to a Dataverse installation. From the dataset page, you can launch Binder, which spins up a computational environment in which you can explore the code and data in the dataset, or write new code, such as a Jupyter notebook. (PR #9341)
+
+### CodeMeta (Software) Metadata Support (Experimental)
+
+Experimental support for research software metadata deposits has been added.
+
+By adding a metadata block for [CodeMeta](https://codemeta.github.io), we take another step toward adding first class support of diverse FAIR objects, such as research software and computational workflows.
+
+There is more work underway to make Dataverse installations around the world "research software ready."
+
+**Note:** Like the metadata block for computational workflows before it, CodeMeta is listed under [Experimental Metadata](https://guides.dataverse.org/en/5.13/user/appendix.html#experimental-metadata) in the guides. Experimental means it's brand new, opt-in, and might need future tweaking based on how it is used in the field. We hope for feedback from installations on the new metadata block so we can optimize it and lift it out of the experimental stage. (PR #7877)
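+
+Because the block is opt-in, installations that want to try it need to load it explicitly. A sketch, assuming the block TSV is the `codemeta.tsv` file shipped with the Dataverse source (see the Experimental Metadata guide for the authoritative steps, including the Solr schema changes needed for custom/experimental blocks):
+
+```
+# Load the experimental CodeMeta metadata block (run on the Dataverse server).
+curl http://localhost:8080/api/admin/datasetfield/load -X POST \
+  --data-binary @codemeta.tsv -H "Content-type: text/tab-separated-values"
+```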
+
+### Mechanism Added for Stopping a Harvest in Progress
+
+It is now possible for a sysadmin to stop a long-running harvesting job. See [Harvesting Clients](https://guides.dataverse.org/en/5.13/admin/harvestclients.html#how-to-stop-a-harvesting-run-in-progress) in the Admin Guide for more information. (PR #9187)
+
+### API Endpoint Listing Metadata Block Details has been Extended
+
+The API endpoint `/api/metadatablocks/{block_id}` has been extended to include the following fields:
+
+- `controlledVocabularyValues`: All possible values for fields with a controlled vocabulary. For example, the values "Agricultural Sciences", "Arts and Humanities", etc. for the "Subject" field.
+- `isControlledVocabulary`: Whether or not this field has a controlled vocabulary.
+- `multiple`: Whether or not the field supports multiple values.
+
+See [Metadata Blocks](https://guides.dataverse.org/en/5.13/api/native-api.html#metadata-blocks-api) in the API Guide for details. (PR #9213)
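+
+For example, to inspect these new fields for the citation block (the `jq` filter assumes the usual `.data.fields` response envelope):
+
+```
+# Show whether the "subject" field uses a controlled vocabulary, whether it is
+# repeatable, and its allowed values (requires jq).
+curl -s http://localhost:8080/api/metadatablocks/citation | \
+  jq '.data.fields.subject | {isControlledVocabulary, multiple, controlledVocabularyValues}'
+```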
+
+### Advanced Database Settings
+
+You can now enable advanced database connection pool configurations useful for debugging and monitoring as well as other settings. Of particular interest may be `sslmode=require`. See the new [Database Persistence](https://guides.dataverse.org/en/5.13/installation/config.html#database-persistence) section of the Installation Guide for details. (PR #8915)
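+
+As a hedged sketch (the exact property names are in the linked Database Persistence section; `dataverse.db.parameters` below is an assumption based on that section), requiring SSL for the database connection might look like:
+
+```
+# Hypothetical example: pass extra JDBC connection parameters via MicroProfile Config.
+# The variable must be visible to the Payara process (e.g. set in its service unit).
+export DATAVERSE_DB_PARAMETERS='sslmode=require'
+```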
+
+### Support for Cleaning up Leftover Files in Dataset Storage
+
+This is an experimental feature: leftover files in a dataset's storage location that are not part of that dataset's file list, but are named following the Dataverse technical convention for dataset files, can now be removed with the new [Cleanup Storage of a Dataset](https://guides.dataverse.org/en/5.13/api/native-api.html#cleanup-storage-of-a-dataset) API endpoint.
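+
+A sketch of calling it, assuming the endpoint accepts the dataset persistent identifier and a `dryrun` flag as described in the linked guide (do a dry run and review the list before deleting anything):
+
+```
+export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
+export PERSISTENT_ID=doi:10.5072/FK2/EXAMPLE
+
+# Dry run: list leftover files that would be removed, without removing them.
+curl -H "X-Dataverse-key:$API_TOKEN" \
+  "http://localhost:8080/api/datasets/:persistentId/cleanStorage?persistentId=$PERSISTENT_ID&dryrun=true"
+```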
+
+### OAI Server Bug Fixed
+
+A bug introduced in 5.12 was preventing the Dataverse OAI server from serving incremental harvesting requests from clients. It was fixed in this release (PR #9316).
+
+## Major Use Cases and Infrastructure Enhancements
+
+Changes and fixes in this release not already mentioned above include:
+
+- Administrators can configure an alternative storage location where files uploaded via the UI are temporarily stored during the transfer from client to server. (PR #8983; see also the [Configuration Guide](http://guides.dataverse.org/en/5.13/installation/config.html#temporary-upload-file-storage))
+- To improve performance, Dataverse estimates download counts. This release includes an update that makes the estimate more accurate. (PR #8972)
+- Direct upload and out-of-band uploads can now be used to replace multiple files with one API call (complementing the prior ability to add multiple new files). (PR #9018)
+- A new persistent identifier type, [CSRT](https://www.cstr.cn/search/specification/), has been added to the Related Publication field's ID Type child field. For datasets published with CSRT IDs, Dataverse will also include them in the datasets' Schema.org metadata exports. (Issue #8838)
+- Datasets that are part of linked dataverse collections will now be displayed in their linking dataverse collections.
+
+## New JVM Options and MicroProfile Config Options
+
+The following JVM option is now available:
+
+- `dataverse.personOrOrg.assumeCommaInPersonName` - the default is false
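+
+For example, it can be set like any other JVM option:
+
+```
+$PAYARA/bin/asadmin create-jvm-options '-Ddataverse.personOrOrg.assumeCommaInPersonName=true'
+```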
+
+The following MicroProfile Config options are now available (these can be treated as JVM options; see the example after the list below):
+
+- `dataverse.files.uploads` - alternative storage location of generated temporary files for UI file uploads
+- `dataverse.api.signing-secret` - used by signed URLs
+- `dataverse.solr.host`
+- `dataverse.solr.port`
+- `dataverse.solr.protocol`
+- `dataverse.solr.core`
+- `dataverse.solr.path`
+- `dataverse.rserve.host`
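+
+As an example of the MicroProfile Config flexibility mentioned above, the same Solr setting can be supplied either as a JVM option or as an environment variable (dots become underscores and the name is upper-cased):
+
+```
+# As a JVM option:
+$PAYARA/bin/asadmin create-jvm-options '-Ddataverse.solr.host=solr.example.edu'
+
+# Or as an environment variable visible to the Payara process:
+export DATAVERSE_SOLR_HOST=solr.example.edu
+```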
+
+The following existing JVM options are now available via MicroProfile Config:
+
+- `dataverse.siteUrl`
+- `dataverse.fqdn`
+- `dataverse.files.directory`
+- `dataverse.rserve.host`
+- `dataverse.rserve.port`
+- `dataverse.rserve.user`
+- `dataverse.rserve.password`
+- `dataverse.rserve.tempdir`
+
+## Notes for Developers and Integrators
+
+See the "Backward Incompatibilities" section below.
+
+## Backward Incompatibilities
+
+### Schema.org
+
+The following changes have been made to Schema.org exports (necessary for the improvements mentioned above):
+
+- Descriptions are now joined and truncated to less than 5K characters.
+- The "citation"/"text" key has been replaced by a "citation"/"name" key.
+- File entries now have the MIME type reported as `encodingFormat` rather than `fileFormat` to better conform with the Schema.org specification for DataDownload entries. Download URLs are now sent for all files unless the `dataverse.files.hide-schema-dot-org-download-urls` setting is set to true.
+- Author/creator entries now have an `@type` of Person or Organization, and any affiliation (`affiliation` for Person, `parentOrganization` for Organization) is now an object of `@type` Organization.
+
+### License Files
+
+License files are now required to contain the new "sortOrder" column. When attempting to create a new license without this field, an error will be returned. See the [Configuring Licenses](https://guides.dataverse.org/en/5.13/installation/config.html#configuring-licenses) section of the Installation Guide for reference.
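+
+For illustration, a hypothetical license definition including the new column, and the superuser call to add it (the field names and values below are placeholders; see the linked Configuring Licenses section for the exact format):
+
+```
+cat > myLicense.json <<'EOF'
+{
+  "name": "Example License 1.0",
+  "uri": "https://example.org/licenses/example-1.0",
+  "shortDescription": "A placeholder license used only to illustrate sortOrder.",
+  "active": true,
+  "sortOrder": 10
+}
+EOF
+
+curl -X POST -H 'Content-Type: application/json' -H "X-Dataverse-key:$API_TOKEN" \
+  http://localhost:8080/api/licenses --upload-file myLicense.json
+```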
+
+## Complete List of Changes
+
+For the complete list of code changes in this release, see the [5.13 milestone](https://github.com/IQSS/dataverse/milestone/107?closed=1) on GitHub.
+
+## Installation
+
+If this is a new installation, please see our [Installation Guide](https://guides.dataverse.org/en/5.13/installation/). Please don't be shy about [asking for help](https://guides.dataverse.org/en/5.13/installation/intro.html#getting-help) if you need it!
+
+After your installation has gone into production, you are welcome to add it to our [map of installations](https://dataverse.org/installations) by opening an issue in the [dataverse-installations](https://github.com/IQSS/dataverse-installations) repo.
+
+## Upgrade Instructions
+
+0\. These instructions assume that you've already successfully upgraded from version 4.x to 5.0 of the Dataverse software following the instructions in the [release notes for version 5.0](https://github.com/IQSS/dataverse/releases/tag/v5.0). After upgrading from the 4.x series to 5.0, you should progress through the other 5.x releases before attempting the upgrade to 5.13.
+
+If you are running Payara as a non-root user (and you should be!), **remember not to execute the commands below as root**. Use `sudo` to change to that user first. For example, `sudo -i -u dataverse` if `dataverse` is your dedicated application user.
+
+In the following commands we assume that Payara 5 is installed in `/usr/local/payara5`. If not, adjust as needed.
+
+`export PAYARA=/usr/local/payara5`
+
+(or `setenv PAYARA /usr/local/payara5` if you are using a `csh`-like shell)
+
+1\. Undeploy the previous version.
+
+- `$PAYARA/bin/asadmin list-applications`
+- `$PAYARA/bin/asadmin undeploy dataverse<-version>`
+
+2\. Stop Payara and remove the generated directory
+
+- `service payara stop`
+- `rm -rf $PAYARA/glassfish/domains/domain1/generated`
+
+3\. Start Payara
+
+- `service payara start`
+
+4\. Deploy this version.
+
+- `$PAYARA/bin/asadmin deploy dataverse-5.13.war`
+
+5\. Restart Payara
+
+- `service payara stop`
+- `service payara start`
+
+6\. Reload citation metadata block
+
+- `wget https://github.com/IQSS/dataverse/releases/download/v5.13/citation.tsv`
+- `curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @citation.tsv -H "Content-type: text/tab-separated-values"`
+
+If you are running an English-only installation, you are finished with the citation block. Otherwise, download the updated citation.properties file and place it in the [`dataverse.lang.directory`](https://guides.dataverse.org/en/5.13/installation/config.html#configuring-the-lang-directory).
+
+- `wget https://github.com/IQSS/dataverse/releases/download/v5.13/citation.properties`
+- `cp citation.properties /home/dataverse/langBundles`
+
+7\. Replace Solr schema.xml to allow multiple production locations and geospatial indexing to be used. See the specific instructions below for installations without custom metadata blocks (7a) and installations with custom metadata blocks (7b).
+
+Note: with this release, support for indexing the experimental workflow metadata block has been removed from the standard schema.xml.
+If you are using the workflow metadata block, be sure to follow the instructions in step 7b below to maintain support for indexing workflow metadata.
+
+7a\. For installations without custom or experimental metadata blocks:
+
+- Stop the Solr instance (usually `service solr stop`, depending on the Solr installation/OS; see the [Installation Guide](https://guides.dataverse.org/en/5.13/installation/prerequisites.html#solr-init-script))
+
+- Replace schema.xml
+
+ - `cp /tmp/dvinstall/schema.xml /usr/local/solr/solr-8.11.1/server/solr/collection1/conf`
+
+- Start the Solr instance (usually `service solr start`, depending on the Solr installation/OS)
+
+7b\. For installations with custom or experimental metadata blocks:
+
+- Stop the Solr instance (usually `service solr stop`, depending on the Solr installation/OS; see the [Installation Guide](https://guides.dataverse.org/en/5.13/installation/prerequisites.html#solr-init-script))
+
+- Edit the following line in your schema.xml (to indicate that productionPlace is now multiValued="true"):
+
+  `<field name="productionPlace" type="text_en" multiValued="true" stored="true" indexed="true"/>`
+
+- Add the following lines to your schema.xml to add support for geospatial indexing:
+
+  `<field name="geolocation" type="location_rpt" multiValued="true" stored="true" indexed="true"/>`
+  `<field name="boundingBox" type="bbox" multiValued="true" stored="true" indexed="true"/>`
+  `<fieldType name="location_rpt" class="solr.SpatialRecursivePrefixTreeFieldType" geo="true" distErrPct="0.025" maxDistErr="0.001" distanceUnits="kilometers"/>`
+  `<fieldType name="bbox" class="solr.BBoxField" geo="true" distanceUnits="kilometers" numberType="pdouble"/>`
+
+- Restart the Solr instance (usually `service solr start`, depending on the Solr installation/OS)
+
+### Optional Upgrade Step: Reindex Linked Dataverse Collections
+
+Datasets that are part of linked dataverse collections will now be displayed in
+their linking dataverse collections. In order to fix the display of collections
+that were linked before this release, you must re-index the linked collections.
+The following database query will provide a list of commands to re-index the affected collections:
+
+```
+select 'curl http://localhost:8080/api/admin/index/dataverses/'
+|| tmp.dvid from (select distinct dataverse_id as dvid
+from dataverselinkingdataverse) as tmp
+```
+
+The result of the query will be a list of re-index commands such as:
+
+`curl http://localhost:8080/api/admin/index/dataverses/633`
+
+where '633' is the id of the linked collection.
+
+### Optional Upgrade Step: Run File Detection on .eln Files
+
+Now that .eln files are recognized, you can run the [Redetect File Type](https://guides.dataverse.org/en/5.13/api/native-api.html#redetect-file-type) API on them to switch them from "unknown" to "ELN Archive". Afterward, you can reindex these files to make them appear in search facets.
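+
+For example, a sketch for a single file (use `dryRun=true` first to preview the detected type):
+
+```
+export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
+export FILE_ID=42
+
+# Preview the newly detected file type without saving it...
+curl -H "X-Dataverse-key:$API_TOKEN" -X POST \
+  "http://localhost:8080/api/files/$FILE_ID/redetect?dryRun=true"
+
+# ...then run it for real.
+curl -H "X-Dataverse-key:$API_TOKEN" -X POST \
+  "http://localhost:8080/api/files/$FILE_ID/redetect?dryRun=false"
+```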
diff --git a/doc/release-notes/6656-file-uploads.md b/doc/release-notes/6656-file-uploads.md
deleted file mode 100644
index a2430a5d0a8..00000000000
--- a/doc/release-notes/6656-file-uploads.md
+++ /dev/null
@@ -1 +0,0 @@
-new JVM option: dataverse.files.uploads
diff --git a/doc/release-notes/6807-binder.md b/doc/release-notes/6807-binder.md
deleted file mode 100644
index 1843f7300a2..00000000000
--- a/doc/release-notes/6807-binder.md
+++ /dev/null
@@ -1 +0,0 @@
-Binder has been added to the list of external tools that can be added to a Dataverse installation. From the dataset page, you can launch Binder, which spins up a computational environment in which you can explore the code and data in the dataset, or write new code, such as a Jupyter notebook.
diff --git a/doc/release-notes/7349-1-schema.org-updates.md b/doc/release-notes/7349-1-schema.org-updates.md
deleted file mode 100644
index 2934a596001..00000000000
--- a/doc/release-notes/7349-1-schema.org-updates.md
+++ /dev/null
@@ -1,3 +0,0 @@
-The Schema.org metadata export and the schema.org metadata embedded in dataset pages has been updated to improve compliance with Schema.org's schema and Google's recommendations.
-
-Backward compatibility - descriptions are now joined and truncated to less than 5K characters.
\ No newline at end of file
diff --git a/doc/release-notes/7349-2-schema.org-updates.md b/doc/release-notes/7349-2-schema.org-updates.md
deleted file mode 100644
index 41f2dfb766a..00000000000
--- a/doc/release-notes/7349-2-schema.org-updates.md
+++ /dev/null
@@ -1,3 +0,0 @@
-The Schema.org metadata export and the schema.org metadata embedded in dataset pages has been updated to improve compliance with Schema.org's schema and Google's recommendations.
-
-Backward compatibility - the "citation"/"text" key has been replaced by a "citation"/"name" key.
\ No newline at end of file
diff --git a/doc/release-notes/7349-3-schema.org-updates.md b/doc/release-notes/7349-3-schema.org-updates.md
deleted file mode 100644
index dc18fe2a24a..00000000000
--- a/doc/release-notes/7349-3-schema.org-updates.md
+++ /dev/null
@@ -1,3 +0,0 @@
-The Schema.org metadata export and the schema.org metadata embedded in dataset pages has been updated to improve compliance with Schema.org's schema and Google's recommendations.
-
-Backward compatibility - file entries now have the mimetype reported as 'encodingFormat' rather than 'fileFormat' to better conform with the Schema.org specification for DataDownload entries. Download URLs are now sent for all files unless the dataverse.files.hide-schema-dot-org-download-urls setting is set to true.
diff --git a/doc/release-notes/7349-4-schema.org-updates.md b/doc/release-notes/7349-4-schema.org-updates.md
deleted file mode 100644
index 2c78243dc29..00000000000
--- a/doc/release-notes/7349-4-schema.org-updates.md
+++ /dev/null
@@ -1,5 +0,0 @@
-The Schema.org metadata export and the schema.org metadata embedded in dataset pages has been updated to improve compliance with Schema.org's schema and Google's recommendations.
-
-New jvm-option: dataverse.personOrOrg.assumeCommaInPersonName, default is false
-
-Backward compatibility - author/creators now have an @type of Person or Organization and any affiliation (affiliation for Person, parentOrganization for Organization) is now an object of @type Organization
\ No newline at end of file
diff --git a/doc/release-notes/7715-signed-urls-for-external-tools.md b/doc/release-notes/7715-signed-urls-for-external-tools.md
deleted file mode 100644
index c2d3859c053..00000000000
--- a/doc/release-notes/7715-signed-urls-for-external-tools.md
+++ /dev/null
@@ -1,3 +0,0 @@
-# Improved Security for External Tools
-
-This release adds support for configuring external tools to use signed URLs to access the Dataverse API. This eliminates the need for tools to have access to the user's apiToken in order to access draft or restricted datasets and datafiles. Signed URLS can be transferred via POST or via a callback when triggering a tool via GET.
\ No newline at end of file
diff --git a/doc/release-notes/7844-codemeta.md b/doc/release-notes/7844-codemeta.md
deleted file mode 100644
index 4d98c1f840f..00000000000
--- a/doc/release-notes/7844-codemeta.md
+++ /dev/null
@@ -1,14 +0,0 @@
-# Experimental CodeMeta Schema Support
-
-With this release, we are adding "experimental" (see note below) support for research software metadata deposits.
-
-By adding a metadata block for [CodeMeta](https://codemeta.github.io), we take another step extending the Dataverse
-scope being a research data repository towards first class support of diverse F.A.I.R. objects, currently focusing
-on research software and computational workflows.
-
-There is more work underway to make Dataverse installations around the world "research software ready". We hope
-for feedback from installations on the new metadata block to optimize and lift it from the experimental stage.
-
-**Note:** like the metadata block for computational workflows before, this schema is flagged as "experimental".
-"Experimental" means it's brand new, opt-in, and might need future tweaking based on experience of usage in the field.
-These blocks are listed here: https://guides.dataverse.org/en/latest/user/appendix.html#experimental-metadata
\ No newline at end of file
diff --git a/doc/release-notes/7940-stop-harvest-in-progress b/doc/release-notes/7940-stop-harvest-in-progress
deleted file mode 100644
index cb27a900f15..00000000000
--- a/doc/release-notes/7940-stop-harvest-in-progress
+++ /dev/null
@@ -1,4 +0,0 @@
-## Mechanism added for stopping a harvest in progress
-
-It is now possible for an admin to stop a long-running harvesting job. See [Harvesting Clients](https://guides.dataverse.org/en/latest/admin/harvestclients.html) guide for more information.
-
diff --git a/doc/release-notes/7980-enhanced-dsd.md b/doc/release-notes/7980-enhanced-dsd.md
deleted file mode 100644
index d69f201e565..00000000000
--- a/doc/release-notes/7980-enhanced-dsd.md
+++ /dev/null
@@ -1,10 +0,0 @@
-### Default Values for Database Connections Fixed
-
-Introduced in Dataverse release 5.3 a regression might have hit you:
-the announced default values for the database connection never actually worked.
-
-With the update to Payara 5.2022.3 it was possible to introduce working
-defaults. The documentation has been changed accordingly.
-
-Together with this change, you can now enable advanced connection pool
-configurations useful for debugging and monitoring. Of particular interest may be `sslmode=require`. See the docs for details.
diff --git a/doc/release-notes/8239-geospatial-indexing.md b/doc/release-notes/8239-geospatial-indexing.md
deleted file mode 100644
index 165cb9031ba..00000000000
--- a/doc/release-notes/8239-geospatial-indexing.md
+++ /dev/null
@@ -1,5 +0,0 @@
-Support for indexing the "Geographic Bounding Box" fields ("West Longitude", "East Longitude", "North Latitude", and "South Latitude") from the Geospatial metadata block has been added.
-
-Geospatial search is supported but only via API using two new parameters: `geo_point` and `geo_radius`.
-
-A Solr schema update is required.
diff --git a/doc/release-notes/8671-sorting-licenses.md b/doc/release-notes/8671-sorting-licenses.md
deleted file mode 100644
index 4ceb9ec056f..00000000000
--- a/doc/release-notes/8671-sorting-licenses.md
+++ /dev/null
@@ -1,7 +0,0 @@
-## License sorting
-
-Licenses as shown in the dropdown in UI can be now sorted by the superusers. See [Configuring Licenses](https://guides.dataverse.org/en/5.10/installation/config.html#configuring-licenses) section of the Installation Guide for reference.
-
-## Backward Incompatibilities
-
-License files are now required to contain the new "sortOrder" column. When attempting to create a new license without this field, an error would be returned. See [Configuring Licenses](https://guides.dataverse.org/en/5.10/installation/config.html#configuring-licenses) section of the Installation Guide for reference.
\ No newline at end of file
diff --git a/doc/release-notes/8724-display-child-datasets-of-linked-dv.md b/doc/release-notes/8724-display-child-datasets-of-linked-dv.md
deleted file mode 100644
index 5b1b9c8ae20..00000000000
--- a/doc/release-notes/8724-display-child-datasets-of-linked-dv.md
+++ /dev/null
@@ -1,14 +0,0 @@
-Datasets that are part of linked dataverse collections will now be displayed in
-their linking dataverse collections. In order to fix the display of collections
-that have already been linked you must re-index the linked collections. This
-query will provide a list of commands to re-index the effected collections:
-
-select 'curl http://localhost:8080/api/admin/index/dataverses/'
-|| tmp.dvid from (select distinct dataverse_id as dvid
-from dataverselinkingdataverse) as tmp
-
-The result of the query will be a list of re-index commands such as:
-
-curl http://localhost:8080/api/admin/index/dataverses/633
-
-where '633' is the id of the linked collection.
diff --git a/doc/release-notes/8838-cstr.md b/doc/release-notes/8838-cstr.md
deleted file mode 100644
index d6bcd33f412..00000000000
--- a/doc/release-notes/8838-cstr.md
+++ /dev/null
@@ -1,13 +0,0 @@
-### CSRT PID Types Added to Related Publication ID Type field
-
-A persistent identifier, [CSRT](https://www.cstr.cn/search/specification/), is added to the Related Publication field's ID Type child field. For datasets published with CSRT IDs, Dataverse will also include them in the datasets' Schema.org metadata exports.
-
-The CSRT
-
-### Required Upgrade Steps
-
-Update the Citation metadata block:
-
-- `wget https://github.com/IQSS/dataverse/releases/download/v#.##/citation.tsv`
-- `curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @citation.tsv -H "Content-type: text/tab-separated-values"`
-- Add the updated citation.properties file to the appropriate directory
\ No newline at end of file
diff --git a/doc/release-notes/8840-improved-download-estimate.md b/doc/release-notes/8840-improved-download-estimate.md
deleted file mode 100644
index cb264b7e683..00000000000
--- a/doc/release-notes/8840-improved-download-estimate.md
+++ /dev/null
@@ -1 +0,0 @@
-To improve performance, Dataverse estimates download counts. This release includes an update that makes the estimate more accurate.
\ No newline at end of file
diff --git a/doc/release-notes/8944-metadatablocks.md b/doc/release-notes/8944-metadatablocks.md
deleted file mode 100644
index 35bb7808e59..00000000000
--- a/doc/release-notes/8944-metadatablocks.md
+++ /dev/null
@@ -1,5 +0,0 @@
-The API endpoint `/api/metadatablocks/{block_id}` has been extended to include the following fields:
-
-- `controlledVocabularyValues` - All possible values for fields with a controlled vocabulary. For example, the values "Agricultural Sciences", "Arts and Humanities", etc. for the "Subject" field.
-- `isControlledVocabulary`: Whether or not this field has a controlled vocabulary.
-- `multiple`: Whether or not the field supports multiple values.
diff --git a/doc/release-notes/9005-replaceFiles-api-call b/doc/release-notes/9005-replaceFiles-api-call
deleted file mode 100644
index d1a86efb745..00000000000
--- a/doc/release-notes/9005-replaceFiles-api-call
+++ /dev/null
@@ -1,3 +0,0 @@
-9005
-
-Direct upload and out-of-band uploads can now be used to replace multiple files with one API call (complementing the prior ability to add multiple new files)
diff --git a/doc/release-notes/9096-folder-upload.md b/doc/release-notes/9096-folder-upload.md
deleted file mode 100644
index 0345cd6c334..00000000000
--- a/doc/release-notes/9096-folder-upload.md
+++ /dev/null
@@ -1 +0,0 @@
-Dataverse can now support upload of an entire folder tree of files and retain the relative paths of files as directory path metadata for the uploaded files, if the installation is configured with S3 direct upload.
diff --git a/doc/release-notes/9117-file-type-detection.md b/doc/release-notes/9117-file-type-detection.md
deleted file mode 100644
index 462eaace8ed..00000000000
--- a/doc/release-notes/9117-file-type-detection.md
+++ /dev/null
@@ -1,5 +0,0 @@
-NetCDF and HDF5 files are now detected based on their content rather than just their file extension.
-
-Both "classic" NetCDF 3 files and more modern NetCDF 4 files are detected based on content.
-
-Detection for HDF4 files is only done through the file extension ".hdf", as before.
diff --git a/doc/release-notes/9130-cleanup-storage.md b/doc/release-notes/9130-cleanup-storage.md
deleted file mode 100644
index 052123d5dd5..00000000000
--- a/doc/release-notes/9130-cleanup-storage.md
+++ /dev/null
@@ -1,3 +0,0 @@
-### Support for cleaning up files in datasets' storage
-
-Experimental feature: the leftover files stored in the Dataset storage location that are not in the file list of that Dataset, but are named following the Dataverse technical convetion for dataset files, can be removed with the new native API call [Cleanup storage of a Dataset](https://guides.dataverse.org/en/latest/api/native-api.html#cleanup-storage-api).
\ No newline at end of file
diff --git a/doc/release-notes/9153-extract-metadata.md b/doc/release-notes/9153-extract-metadata.md
deleted file mode 100644
index be21c5ed739..00000000000
--- a/doc/release-notes/9153-extract-metadata.md
+++ /dev/null
@@ -1,3 +0,0 @@
-For NetCDF and HDF5 files, an attempt will be made to extract metadata in NcML (XML) format and save it as an auxiliary file.
-
-An "extractNcml" API endpoint has been added, especially for installations with existing NetCDF and HDF5 files. After upgrading, they can iterate through these files and try to extract an NcML file.
diff --git a/doc/release-notes/9224-remove-workflow-fields-from-index.md b/doc/release-notes/9224-remove-workflow-fields-from-index.md
deleted file mode 100644
index 4b324ce7b57..00000000000
--- a/doc/release-notes/9224-remove-workflow-fields-from-index.md
+++ /dev/null
@@ -1,12 +0,0 @@
-# Optional: remove Workflow Schema fields from Solr index
-
-In Dataverse 5.12 we added a new experimental metadata schema block for workflow deposition.
-We included the fields within the standard Solr schema we provide. With this version, we
-removed it from the schema. If you are deploying the block to your installation, make sure to
-update your index.
-
-If you already added these fields, you can delete them from your index when not using the schema.
-Make sure to [reindex after changing the schema](https://guides.dataverse.org/en/latest/admin/solr-search-index.html?highlight=reindex#reindex-in-place.
-
-Remember: depending on the size of your installation, reindexing may take serious time to complete.
-You should do this in off-hours.
\ No newline at end of file
diff --git a/doc/release-notes/9253-productionPlace.md b/doc/release-notes/9253-productionPlace.md
deleted file mode 100644
index 644b6fae2fb..00000000000
--- a/doc/release-notes/9253-productionPlace.md
+++ /dev/null
@@ -1,42 +0,0 @@
-## Metadata field Production Location now repeatable, facetable, and enabled for Advanced Search
-This enhancement allows depositors to define multiple instances of the metadata field Production Location in the Citation Metadata block, users to filter search results using the filter facets, and using the field in the Advanced Search option.
-
-## Major Use Cases and Infrastructure Enhancements
-* Data contained in a dataset may have been produced at multiple places. Making the field Production Location repeatable will make it possible to reflect this fact in the dataset metadata. Making the field facetable and enabled for Advanced Search will allow us to customize Dataverse collections more appropriately. (Issue #9253, PR #9254)
-
-### Additional Upgrade Steps
-
-Update the Citation metadata block:
-
-- `wget https://github.com/IQSS/dataverse/releases/download/v5.13/citation.tsv`
-- `curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @citation.tsv -H "Content-type: text/tab-separated-values"`
-
-## Additional Release Steps
-
-1\. Replace Solr schema.xml to allow multiple production locations to be used. See specific instructions below for those installations without custom metadata blocks (1a) and those with custom metadata blocks (1b).
-
-1a\.
-
-For installations without Custom Metadata Blocks:
-
--stop solr instance (usually service solr stop, depending on solr installation/OS, see the [Installation Guide](https://guides.dataverse.org/en/5.13/installation/prerequisites.html#solr-init-script)
-
--replace schema.xml
-
-cp /tmp/dvinstall/schema.xml /usr/local/solr/solr-8.11.1/server/solr/collection1/conf
-
--start solr instance (usually service solr start, depending on solr/OS)
-
-
-1b\.
-
-For installations with Custom Metadata Blocks:
-
--stop solr instance (usually service solr stop, depending on solr installation/OS, see the [Installation Guide](https://guides.dataverse.org/en/5.13/installation/prerequisites.html#solr-init-script)
-
-- edit the following line to your schema.xml (to indicate that productionPlace is now multiValued='true"):
-
- ``
-
-- restart solr instance (usually service solr start, depending on solr/OS)
-
diff --git a/doc/release-notes/9363-eln.md b/doc/release-notes/9363-eln.md
deleted file mode 100644
index c9529b3aab5..00000000000
--- a/doc/release-notes/9363-eln.md
+++ /dev/null
@@ -1,7 +0,0 @@
-Support for .eln files has been added.
-
-See file format specification: https://github.com/TheELNConsortium/TheELNFileFormat
-
-The .eln file format is used by Electronic Laboratory Notebooks as an exchange format for experimental protocols, results, sample descriptions, etc...
-
-In order for existing .eln files to show as zip files (instead of unknown), you must run the file re-detection API on them and reindex them into Solr.