diff --git a/README.md b/README.md
index b78682d647a..2333787d4b9 100644
--- a/README.md
+++ b/README.md
@@ -1,7 +1,7 @@
Dataverse®
===============
-Dataverse is an [open source][] software platform for sharing, finding, citing, and preserving research data (developed by the [Dataverse team](https://dataverse.org/about) at the [Institute for Quantitative Social Science](http://iq.harvard.edu/) and the [Dataverse community][]).
+Dataverse is an [open source][] software platform for sharing, finding, citing, and preserving research data (developed by the [Data Science and Products team](https://www.iq.harvard.edu/people/people/data-science-products) at the [Institute for Quantitative Social Science](https://iq.harvard.edu/) and the [Dataverse community][]).
[dataverse.org][] is our home on the web and shows a map of Dataverse installations around the world, a list of [features][], [integrations][] that have been made possible through [REST APIs][], our development [roadmap][], and more.
@@ -26,15 +26,15 @@ Dataverse is a trademark of President and Fellows of Harvard College and is regi
[dataverse.org]: https://dataverse.org
[demo.dataverse.org]: https://demo.dataverse.org
[Dataverse community]: https://dataverse.org/developers
-[Installation Guide]: http://guides.dataverse.org/en/latest/installation/index.html
+[Installation Guide]: https://guides.dataverse.org/en/latest/installation/index.html
[latest release]: https://github.com/IQSS/dataverse/releases
[features]: https://dataverse.org/software-features
[roadmap]: https://www.iq.harvard.edu/roadmap-dataverse-project
[integrations]: https://dataverse.org/integrations
-[REST APIs]: http://guides.dataverse.org/en/latest/api/index.html
+[REST APIs]: https://guides.dataverse.org/en/latest/api/index.html
[Contributing Guide]: CONTRIBUTING.md
[mailing list]: https://groups.google.com/group/dataverse-community
[community call]: https://dataverse.org/community-calls
-[chat.dataverse.org]: http://chat.dataverse.org
+[chat.dataverse.org]: https://chat.dataverse.org
[Dataverse Community Meeting]: https://dataverse.org/events
[open source]: LICENSE.md
diff --git a/doc/sphinx-guides/source/_static/navbarscroll.js b/doc/sphinx-guides/source/_static/navbarscroll.js
index 66c9d4d7995..735f80870cd 100644
--- a/doc/sphinx-guides/source/_static/navbarscroll.js
+++ b/doc/sphinx-guides/source/_static/navbarscroll.js
@@ -1,6 +1,6 @@
/*
Use to fix hidden section headers behind the navbar when using links with targets
- See: http://stackoverflow.com/questions/10732690/offsetting-an-html-anchor-to-adjust-for-fixed-header
+ See: https://stackoverflow.com/questions/10732690/offsetting-an-html-anchor-to-adjust-for-fixed-header
*/
$jqTheme(document).ready(function() {
$jqTheme('a[href*="#"]:not([href="#"])').on('click', function() {
diff --git a/doc/sphinx-guides/source/_templates/navbar.html b/doc/sphinx-guides/source/_templates/navbar.html
index 538cccf74d7..c7b81dcb937 100644
--- a/doc/sphinx-guides/source/_templates/navbar.html
+++ b/doc/sphinx-guides/source/_templates/navbar.html
@@ -15,7 +15,7 @@
- Dataverse Project
+ Dataverse Project
@@ -24,15 +24,15 @@
About
-
+
Community
@@ -49,18 +49,18 @@
Software
-
+
Contact
diff --git a/doc/sphinx-guides/source/admin/monitoring.rst b/doc/sphinx-guides/source/admin/monitoring.rst
index a4affda1302..e902d5fdcc9 100644
--- a/doc/sphinx-guides/source/admin/monitoring.rst
+++ b/doc/sphinx-guides/source/admin/monitoring.rst
@@ -14,7 +14,7 @@ In production you'll want to monitor the usual suspects such as CPU, memory, fre
Munin
+++++
-http://munin-monitoring.org says, "A default installation provides a lot of graphs with almost no work." From RHEL or CentOS 7, you can try the following steps.
+https://munin-monitoring.org says, "A default installation provides a lot of graphs with almost no work." From RHEL or CentOS 7, you can try the following steps.
Enable the EPEL yum repo (if you haven't already):
diff --git a/doc/sphinx-guides/source/api/client-libraries.rst b/doc/sphinx-guides/source/api/client-libraries.rst
index 4aa5b935e27..9612616f127 100755
--- a/doc/sphinx-guides/source/api/client-libraries.rst
+++ b/doc/sphinx-guides/source/api/client-libraries.rst
@@ -24,7 +24,7 @@ Java
https://github.com/IQSS/dataverse-client-java is the official Java library for Dataverse APIs.
-`Richard Adams `_ from `ResearchSpace `_ created and maintains this library.
+`Richard Adams `_ from `ResearchSpace `_ created and maintains this library.
Javascript
----------
@@ -54,7 +54,7 @@ There are multiple Python modules for interacting with Dataverse APIs.
`python-dvuploader `_ implements Jim Myers' excellent `dv-uploader `_ as a Python module. It offers parallel direct uploads to Dataverse backend storage, streams files directly instead of buffering them in memory, and supports multi-part uploads, chunking data accordingly.
-`pyDataverse `_ primarily allows developers to manage Dataverse collections, datasets and datafiles. Its intention is to help with data migrations and DevOps activities such as testing and configuration management. The module is developed by `Stefan Kasberger `_ from `AUSSDA - The Austrian Social Science Data Archive `_.
+`pyDataverse `_ primarily allows developers to manage Dataverse collections, datasets and datafiles. Its intention is to help with data migrations and DevOps activities such as testing and configuration management. The module is developed by `Stefan Kasberger `_ from `AUSSDA - The Austrian Social Science Data Archive `_.
`UBC's Dataverse Utilities `_ are a set of Python console utilities which allow one to upload datasets from a tab-separated-value spreadsheet, bulk release multiple datasets, bulk delete unpublished datasets, quickly duplicate records, replace licenses, and more. For additional information see their `PyPi page `_.
@@ -68,8 +68,7 @@ R
https://github.com/IQSS/dataverse-client-r is the official R package for Dataverse APIs. The latest release can be installed from `CRAN `_.
The R client can search and download datasets. It is useful when automatically (instead of manually) downloading data files as part of a script. For bulk edit and upload operations, we currently recommend pyDataverse.
-The package is currently maintained by `Shiro Kuriwaki `_. It was originally created by `Thomas Leeper `_ and then formerly maintained by `Will Beasley `_.
-
+The package is currently maintained by `Shiro Kuriwaki `_. It was originally created by `Thomas Leeper `_ and then formerly maintained by `Will Beasley `_.
Ruby
----
diff --git a/doc/sphinx-guides/source/api/external-tools.rst b/doc/sphinx-guides/source/api/external-tools.rst
index d12e4b17549..c72b52637c7 100644
--- a/doc/sphinx-guides/source/api/external-tools.rst
+++ b/doc/sphinx-guides/source/api/external-tools.rst
@@ -11,7 +11,7 @@ Introduction
External tools are additional applications the user can access or open from your Dataverse installation to preview, explore, and manipulate data files and datasets. The term "external" is used to indicate that the tool is not part of the main Dataverse Software.
-Once you have created the external tool itself (which is most of the work!), you need to teach a Dataverse installation how to construct URLs that your tool needs to operate. For example, if you've deployed your tool to fabulousfiletool.com your tool might want the ID of a file and the siteUrl of the Dataverse installation like this: https://fabulousfiletool.com?fileId=42&siteUrl=http://demo.dataverse.org
+Once you have created the external tool itself (which is most of the work!), you need to teach a Dataverse installation how to construct URLs that your tool needs to operate. For example, if you've deployed your tool to fabulousfiletool.com, your tool might want the ID of a file and the siteUrl of the Dataverse installation like this: https://fabulousfiletool.com?fileId=42&siteUrl=https://demo.dataverse.org
In short, you will be creating a manifest in JSON format that describes not only how to construct URLs for your tool, but also what types of files your tool operates on, where it should appear in the Dataverse installation web interfaces, etc.
@@ -102,7 +102,7 @@ Terminology
httpMethod Either ``GET`` or ``POST``.
- queryParameters **Key/value combinations** that can be appended to the toolUrl. For example, once substitution takes place (described below) the user may be redirected to ``https://fabulousfiletool.com?fileId=42&siteUrl=http://demo.dataverse.org``.
+ queryParameters **Key/value combinations** that can be appended to the toolUrl. For example, once substitution takes place (described below) the user may be redirected to ``https://fabulousfiletool.com?fileId=42&siteUrl=https://demo.dataverse.org``.
query parameter keys An **arbitrary string** to associate with a value that is populated with a reserved word (described below). As the author of the tool, you have control over what "key" you would like to be passed to your tool. For example, if you want to have your tool receive and operate on the query parameter "dataverseFileId=42" instead of just "fileId=42", that's fine.
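The key/value substitution described above can be sketched in Python. This is only an illustration: the ``fabulousfiletool.com`` tool, its ``toolUrl``, and the ``{fileId}``/``{siteUrl}`` reserved-word syntax are taken from the hypothetical example in the text, not from a real manifest.

```python
# Sketch of the query-parameter substitution described above.
# tool_url and the reserved words mirror the hypothetical
# fabulousfiletool.com example from the guide text.
tool_url = "https://fabulousfiletool.com"
query_parameters = [
    {"fileId": "{fileId}"},    # reserved word, filled in per file
    {"siteUrl": "{siteUrl}"},  # reserved word, filled in per installation
]

def build_tool_url(tool_url, query_parameters, values):
    # Substitute each reserved word, then append the key=value pairs.
    pairs = []
    for param in query_parameters:
        for key, template in param.items():
            pairs.append(f"{key}={template.format(**values)}")
    return tool_url + "?" + "&".join(pairs)

url = build_tool_url(tool_url, query_parameters,
                     {"fileId": 42, "siteUrl": "https://demo.dataverse.org"})
print(url)
```

Note how the arbitrary keys chosen by the tool author (``fileId`` here, but it could just as well be ``dataverseFileId``) pass through unchanged, while only the reserved-word values are substituted.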
diff --git a/doc/sphinx-guides/source/api/getting-started.rst b/doc/sphinx-guides/source/api/getting-started.rst
index a6f6c259a25..a50f12d1381 100644
--- a/doc/sphinx-guides/source/api/getting-started.rst
+++ b/doc/sphinx-guides/source/api/getting-started.rst
@@ -9,7 +9,7 @@ If you are a researcher or curator who wants to automate parts of your workflow,
Servers You Can Test With
-------------------------
-Rather than using a production Dataverse installation, API users are welcome to use http://demo.dataverse.org for testing. You can email support@dataverse.org if you have any trouble with this server.
+Rather than using a production Dataverse installation, API users are welcome to use https://demo.dataverse.org for testing. You can email support@dataverse.org if you have any trouble with this server.
If you would rather have full control over your own test server, deployments to AWS, Docker, and more are covered in the :doc:`/developers/index` and the :doc:`/installation/index`.
diff --git a/doc/sphinx-guides/source/api/intro.rst b/doc/sphinx-guides/source/api/intro.rst
index 933932cd7b9..6c61bb8c20d 100755
--- a/doc/sphinx-guides/source/api/intro.rst
+++ b/doc/sphinx-guides/source/api/intro.rst
@@ -237,7 +237,7 @@ Dataverse Software API questions are on topic in all the usual places:
- The dataverse-community Google Group: https://groups.google.com/forum/#!forum/dataverse-community
- The Dataverse Project community calls: https://dataverse.org/community-calls
-- The Dataverse Project chat room: http://chat.dataverse.org
+- The Dataverse Project chat room: https://chat.dataverse.org
- The Dataverse Project ticketing system: support@dataverse.org
After your question has been answered, you are welcome to help improve the :doc:`faq` section of this guide.
diff --git a/doc/sphinx-guides/source/api/native-api.rst b/doc/sphinx-guides/source/api/native-api.rst
index 7ae1194ebc6..1992390410c 100644
--- a/doc/sphinx-guides/source/api/native-api.rst
+++ b/doc/sphinx-guides/source/api/native-api.rst
@@ -9,7 +9,7 @@ The Dataverse Software exposes most of its GUI functionality via a REST-based AP
.. _CORS: https://www.w3.org/TR/cors/
-.. warning:: The Dataverse Software's API is versioned at the URI - all API calls may include the version number like so: ``http://server-address/api/v1/...``. Omitting the ``v1`` part would default to the latest API version (currently 1). When writing scripts/applications that will be used for a long time, make sure to specify the API version, so they don't break when the API is upgraded.
+.. warning:: The Dataverse Software's API is versioned at the URI - all API calls may include the version number like so: ``https://server-address/api/v1/...``. Omitting the ``v1`` part would default to the latest API version (currently 1). When writing scripts/applications that will be used for a long time, make sure to specify the API version, so they don't break when the API is upgraded.
.. contents:: |toctitle|
:local:
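The URI-versioning warning above can be illustrated with two URL forms; ``demo.dataverse.org`` stands in for your server address here.

```python
# Illustration of the URI versioning described in the warning above.
# "demo.dataverse.org" is a placeholder for your own server address.
base = "https://demo.dataverse.org/api"

pinned = f"{base}/v1/datasets"    # pinned to API v1: safe for long-lived scripts
unpinned = f"{base}/datasets"     # defaults to the latest API version
print(pinned)
print(unpinned)
```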
@@ -503,7 +503,7 @@ The fully expanded example above (without environment variables) looks like this
curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X PUT "https://demo.dataverse.org/api/dataverses/root/metadatablocks/isRoot"
-.. note:: Previous endpoints ``$SERVER/api/dataverses/$id/metadatablocks/:isRoot`` and ``POST http://$SERVER/api/dataverses/$id/metadatablocks/:isRoot?key=$apiKey`` are deprecated, but supported.
+.. note:: Previous endpoints ``$SERVER/api/dataverses/$id/metadatablocks/:isRoot`` and ``POST https://$SERVER/api/dataverses/$id/metadatablocks/:isRoot?key=$apiKey`` are deprecated, but supported.
.. _create-dataset-command:
@@ -795,7 +795,7 @@ Getting its draft version:
export SERVER_URL=https://demo.dataverse.org
export PERSISTENT_IDENTIFIER=doi:10.5072/FK2/J8SJZB
- curl -H "X-Dataverse-key:$API_TOKEN" "http://$SERVER/api/datasets/:persistentId/versions/:draft?persistentId=$PERSISTENT_IDENTIFIER"
+ curl -H "X-Dataverse-key:$API_TOKEN" "$SERVER_URL/api/datasets/:persistentId/versions/:draft?persistentId=$PERSISTENT_IDENTIFIER"
The fully expanded example above (without environment variables) looks like this:
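The variable substitution in the draft-version example can be sketched as a plain string expansion (no network call is made; a real request would also send the ``X-Dataverse-key`` header, as the curl command shows):

```python
# Sketch: expanding the environment variables from the draft-version
# example above into the final request URL.
server_url = "https://demo.dataverse.org"
persistent_id = "doi:10.5072/FK2/J8SJZB"

url = (f"{server_url}/api/datasets/:persistentId/versions/:draft"
       f"?persistentId={persistent_id}")
print(url)
```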
@@ -2857,7 +2857,7 @@ The fully expanded example above (without environment variables) looks like this
Currently the following methods are used to detect file types:
- The file type detected by the browser (or sent via API).
-- JHOVE: http://jhove.openpreservation.org
+- JHOVE: https://jhove.openpreservation.org
- The file extension (e.g. ".ipynb") is used, defined in a file called ``MimeTypeDetectionByFileExtension.properties``.
- The file name (e.g. "Dockerfile") is used, defined in a file called ``MimeTypeDetectionByFileName.properties``.
@@ -3202,7 +3202,7 @@ The fully expanded example above (without environment variables) looks like this
curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X POST \
-F 'jsonData={"description":"My description bbb.","provFreeform":"Test prov freeform","categories":["Data"],"dataFileTags":["Survey"],"restrict":false}' \
- "http://demo.dataverse.org/api/files/24/metadata"
+ "https://demo.dataverse.org/api/files/24/metadata"
A curl example using a ``PERSISTENT_ID``
diff --git a/doc/sphinx-guides/source/api/sword.rst b/doc/sphinx-guides/source/api/sword.rst
index 11b43e98774..51391784bde 100755
--- a/doc/sphinx-guides/source/api/sword.rst
+++ b/doc/sphinx-guides/source/api/sword.rst
@@ -9,19 +9,19 @@ SWORD_ stands for "Simple Web-service Offering Repository Deposit" and is a "pro
About
-----
-Introduced in Dataverse Network (DVN) `3.6 `_, the SWORD API was formerly known as the "Data Deposit API" and ``data-deposit/v1`` appeared in the URLs. For backwards compatibility these URLs continue to work (with deprecation warnings). Due to architectural changes and security improvements (especially the introduction of API tokens) in Dataverse Software 4.0, a few backward incompatible changes were necessarily introduced and for this reason the version has been increased to ``v1.1``. For details, see :ref:`incompatible`.
+Introduced in Dataverse Network (DVN) `3.6 `_, the SWORD API was formerly known as the "Data Deposit API" and ``data-deposit/v1`` appeared in the URLs. For backwards compatibility these URLs continue to work (with deprecation warnings). Due to architectural changes and security improvements (especially the introduction of API tokens) in Dataverse Software 4.0, a few backward incompatible changes were necessarily introduced and for this reason the version has been increased to ``v1.1``. For details, see :ref:`incompatible`.
-The Dataverse Software implements most of SWORDv2_, which is specified at http://swordapp.github.io/SWORDv2-Profile/SWORDProfile.html . Please reference the `SWORDv2 specification`_ for expected HTTP status codes (i.e. 201, 204, 404, etc.), headers (i.e. "Location"), etc.
+The Dataverse Software implements most of SWORDv2_, which is specified at https://swordapp.github.io/SWORDv2-Profile/SWORDProfile.html . Please reference the `SWORDv2 specification`_ for expected HTTP status codes (i.e. 201, 204, 404, etc.), headers (i.e. "Location"), etc.
As a profile of AtomPub, XML is used throughout SWORD. As of Dataverse Software 4.0 datasets can also be created via JSON using the "native" API. SWORD is limited to the dozen or so fields listed below in the crosswalk, but the native API allows you to populate all metadata fields available in a Dataverse installation.
-.. _SWORD: http://en.wikipedia.org/wiki/SWORD_%28protocol%29
+.. _SWORD: https://en.wikipedia.org/wiki/SWORD_%28protocol%29
.. _SWORDv2: http://swordapp.org/sword-v2/sword-v2-specifications/
.. _RFC 5023: https://tools.ietf.org/html/rfc5023
-.. _SWORDv2 specification: http://swordapp.github.io/SWORDv2-Profile/SWORDProfile.html
+.. _SWORDv2 specification: https://swordapp.github.io/SWORDv2-Profile/SWORDProfile.html
.. _sword-auth:
@@ -86,7 +86,7 @@ New features as of v1.1
- "Contact E-mail" is automatically populated from dataset owner's email.
-- "Subject" uses our controlled vocabulary list of subjects. This list is in the Citation Metadata of our User Guide > `Metadata References `_. Otherwise, if a term does not match our controlled vocabulary list, it will put any subject terms in "Keyword". If Subject is empty it is automatically populated with "N/A".
+- "Subject" uses our controlled vocabulary list of subjects. This list is in the Citation Metadata of our User Guide > `Metadata References `_. Otherwise, if a term does not match our controlled vocabulary list, it will put any subject terms in "Keyword". If Subject is empty it is automatically populated with "N/A".
- Zero-length files are now allowed (but not necessarily encouraged).
@@ -127,7 +127,7 @@ Dublin Core Terms (DC Terms) Qualified Mapping - Dataverse Project DB Element Cr
+-----------------------------+----------------------------------------------+--------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------+
|dcterms:creator | authorName (LastName, FirstName) | Y | Author(s) for the Dataset. |
+-----------------------------+----------------------------------------------+--------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------+
-|dcterms:subject | subject (Controlled Vocabulary) OR keyword | Y | Controlled Vocabulary list is in our User Guide > `Metadata References `_. |
+|dcterms:subject | subject (Controlled Vocabulary) OR keyword | Y | Controlled Vocabulary list is in our User Guide > `Metadata References `_. |
+-----------------------------+----------------------------------------------+--------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------+
|dcterms:description | dsDescriptionValue | Y | Describing the purpose, scope or nature of the Dataset. Can also use dcterms:abstract. |
+-----------------------------+----------------------------------------------+--------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------+
diff --git a/doc/sphinx-guides/source/conf.py b/doc/sphinx-guides/source/conf.py
index 7ff17eb45ed..0660ec3b071 100755
--- a/doc/sphinx-guides/source/conf.py
+++ b/doc/sphinx-guides/source/conf.py
@@ -432,7 +432,7 @@
# Example configuration for intersphinx: refer to the Python standard library.
-intersphinx_mapping = {'http://docs.python.org/': None}
+intersphinx_mapping = {'https://docs.python.org/': None}
# Suppress "WARNING: unknown mimetype for ..." https://github.com/IQSS/dataverse/issues/3391
suppress_warnings = ['epub.unknown_project_files']
rst_prolog = """
diff --git a/doc/sphinx-guides/source/developers/classic-dev-env.rst b/doc/sphinx-guides/source/developers/classic-dev-env.rst
index 062a1bb36f3..d7b7f281634 100755
--- a/doc/sphinx-guides/source/developers/classic-dev-env.rst
+++ b/doc/sphinx-guides/source/developers/classic-dev-env.rst
@@ -46,7 +46,7 @@ On Linux, you are welcome to use the OpenJDK available from package managers.
Install Netbeans or Maven
~~~~~~~~~~~~~~~~~~~~~~~~~
-NetBeans IDE is recommended, and can be downloaded from http://netbeans.org . Developers may use any editor or IDE. We recommend NetBeans because it is free, works cross platform, has good support for Jakarta EE projects, and includes a required build tool, Maven.
+NetBeans IDE is recommended, and can be downloaded from https://netbeans.org . Developers may use any editor or IDE. We recommend NetBeans because it is free, works cross-platform, has good support for Jakarta EE projects, and includes a required build tool, Maven.
Below we describe how to build the Dataverse Software war file with Netbeans but if you prefer to use only Maven, you can find installation instructions in the :doc:`tools` section.
@@ -86,7 +86,7 @@ On Mac, run this command:
``brew install jq``
-On Linux, install ``jq`` from your package manager or download a binary from http://stedolan.github.io/jq/
+On Linux, install ``jq`` from your package manager or download a binary from https://stedolan.github.io/jq/
Install Payara
~~~~~~~~~~~~~~
@@ -134,7 +134,7 @@ On Linux, you should just install PostgreSQL using your favorite package manager
Install Solr
^^^^^^^^^^^^
-`Solr `_ 9.3.0 is required.
+`Solr `_ 9.3.0 is required.
To install Solr, execute the following commands:
@@ -144,7 +144,7 @@ To install Solr, execute the following commands:
``cd /usr/local/solr``
-``curl -O http://archive.apache.org/dist/solr/solr/9.3.0/solr-9.3.0.tgz``
+``curl -O https://archive.apache.org/dist/solr/solr/9.3.0/solr-9.3.0.tgz``
``tar xvfz solr-9.3.0.tgz``
diff --git a/doc/sphinx-guides/source/developers/documentation.rst b/doc/sphinx-guides/source/developers/documentation.rst
index 78b0970aaa5..d07b5b63f72 100755
--- a/doc/sphinx-guides/source/developers/documentation.rst
+++ b/doc/sphinx-guides/source/developers/documentation.rst
@@ -36,7 +36,7 @@ If you would like to read more about the Dataverse Project's use of GitHub, plea
Building the Guides with Sphinx
-------------------------------
-The Dataverse guides are written using Sphinx (http://sphinx-doc.org). We recommend installing Sphinx on your localhost or using a Sphinx Docker container to build the guides locally so you can get an accurate preview of your changes.
+The Dataverse guides are written using Sphinx (https://sphinx-doc.org). We recommend installing Sphinx on your localhost or using a Sphinx Docker container to build the guides locally so you can get an accurate preview of your changes.
In case you decide to use a Sphinx Docker container to build the guides, you can skip the next two installation sections, but you will need to have Docker installed.
@@ -62,7 +62,7 @@ In some parts of the documentation, graphs are rendered as images using the Sphi
Building the guides requires the ``dot`` executable from GraphViz.
-This requires having `GraphViz `_ installed and either having ``dot`` on the path or
+This requires having `GraphViz `_ installed and either having ``dot`` on the path or
`adding options to the make call `_.
Editing and Building the Guides
@@ -71,7 +71,7 @@ Editing and Building the Guides
To edit the existing documentation:
- Create a branch (see :ref:`how-to-make-a-pull-request`).
-- In ``doc/sphinx-guides/source`` you will find the .rst files that correspond to http://guides.dataverse.org.
+- In ``doc/sphinx-guides/source`` you will find the .rst files that correspond to https://guides.dataverse.org.
- Using your preferred text editor, open and edit the necessary files, or create new ones.
Once you are done, you can preview the changes by building the guides locally. As explained, you can build the guides with Sphinx locally installed, or with a Docker container.
diff --git a/doc/sphinx-guides/source/developers/intro.rst b/doc/sphinx-guides/source/developers/intro.rst
index 4a64c407fc1..a01a8066897 100755
--- a/doc/sphinx-guides/source/developers/intro.rst
+++ b/doc/sphinx-guides/source/developers/intro.rst
@@ -2,7 +2,7 @@
Introduction
============
-Welcome! `The Dataverse Project `_ is an `open source `_ project that loves `contributors `_!
+Welcome! `The Dataverse Project `_ is an `open source `_ project that loves `contributors `_!
.. contents:: |toctitle|
:local:
@@ -19,7 +19,7 @@ To get started, you'll want to set up your :doc:`dev-environment` and make sure
Getting Help
------------
-If you have any questions at all, please reach out to other developers via the channels listed in https://github.com/IQSS/dataverse/blob/develop/CONTRIBUTING.md such as http://chat.dataverse.org, the `dataverse-dev `_ mailing list, `community calls `_, or support@dataverse.org.
+If you have any questions at all, please reach out to other developers via the channels listed in https://github.com/IQSS/dataverse/blob/develop/CONTRIBUTING.md such as https://chat.dataverse.org, the `dataverse-dev `_ mailing list, `community calls `_, or support@dataverse.org.
.. _core-technologies:
diff --git a/doc/sphinx-guides/source/developers/testing.rst b/doc/sphinx-guides/source/developers/testing.rst
index dab8110b20b..abecaa09fad 100755
--- a/doc/sphinx-guides/source/developers/testing.rst
+++ b/doc/sphinx-guides/source/developers/testing.rst
@@ -46,7 +46,7 @@ The main takeaway should be that we care about unit testing enough to measure th
Writing Unit Tests with JUnit
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-We are aware that there are newer testing tools such as TestNG, but we use `JUnit `_ because it's tried and true.
+We are aware that there are newer testing tools such as TestNG, but we use `JUnit `_ because it's tried and true.
We support JUnit 5 based testing and require new tests written with it.
(Since Dataverse 6.0, we migrated all of our tests formerly based on JUnit 4.)
@@ -272,11 +272,11 @@ Remember, it’s only a test (and it's not graded)! Some guidelines to bear in m
- Map out which logical functions you want to test
- Understand what’s being tested and ensure it’s repeatable
- Assert the conditions of success / return values for each operation
- * A useful resource would be `HTTP status codes `_
+ * A useful resource would be `HTTP status codes `_
- Let the code do the labor; automate everything that happens when you run your test file.
- Just as with any development, if you’re stuck: ask for help!
-To execute existing integration tests on your local Dataverse installation, a helpful command line tool to use is `Maven `_. You should have Maven installed as per the `Development Environment `_ guide, but if not it’s easily done via Homebrew: ``brew install maven``.
+To execute existing integration tests on your local Dataverse installation, a helpful command line tool to use is `Maven `_. You should have Maven installed as per the `Development Environment `_ guide, but if not it’s easily done via Homebrew: ``brew install maven``.
Once installed, you may run commands with ``mvn [options] [] []``.
@@ -525,7 +525,7 @@ Future Work on Integration Tests
- Automate testing of dataverse-client-python: https://github.com/IQSS/dataverse-client-python/issues/10
- Work with @leeper on testing the R client: https://github.com/IQSS/dataverse-client-r
- Review and attempt to implement "API Test Checklist" from @kcondon at https://docs.google.com/document/d/199Oq1YwQ4pYCguaeW48bIN28QAitSk63NbPYxJHCCAE/edit?usp=sharing
-- Generate code coverage reports for **integration** tests: https://github.com/pkainulainen/maven-examples/issues/3 and http://www.petrikainulainen.net/programming/maven/creating-code-coverage-reports-for-unit-and-integration-tests-with-the-jacoco-maven-plugin/
+- Generate code coverage reports for **integration** tests: https://github.com/pkainulainen/maven-examples/issues/3 and https://www.petrikainulainen.net/programming/maven/creating-code-coverage-reports-for-unit-and-integration-tests-with-the-jacoco-maven-plugin/
- Consistent logging of API Tests. Show test name at the beginning and end and status codes returned.
- expected passing and known/expected failing integration tests: https://github.com/IQSS/dataverse/issues/4438
diff --git a/doc/sphinx-guides/source/developers/tools.rst b/doc/sphinx-guides/source/developers/tools.rst
index d45f6c43846..9d2740fab6a 100755
--- a/doc/sphinx-guides/source/developers/tools.rst
+++ b/doc/sphinx-guides/source/developers/tools.rst
@@ -28,20 +28,20 @@ With Maven installed you can run ``mvn package`` and ``mvn test`` from the comma
PlantUML
++++++++
-PlantUML is used to create diagrams in the guides and other places. Download it from http://plantuml.com and check out an example script at https://github.com/IQSS/dataverse/blob/v4.6.1/doc/Architecture/components.sh . Note that for this script to work, you'll need the ``dot`` program, which can be installed on Mac with ``brew install graphviz``.
+PlantUML is used to create diagrams in the guides and other places. Download it from https://plantuml.com and check out an example script at https://github.com/IQSS/dataverse/blob/v4.6.1/doc/Architecture/components.sh . Note that for this script to work, you'll need the ``dot`` program, which can be installed on Mac with ``brew install graphviz``.
Eclipse Memory Analyzer Tool (MAT)
++++++++++++++++++++++++++++++++++
The Memory Analyzer Tool (MAT) from Eclipse can help you analyze heap dumps, showing you "leak suspects" such as those seen at https://github.com/payara/Payara/issues/350#issuecomment-115262625
-It can be downloaded from http://www.eclipse.org/mat
+It can be downloaded from https://www.eclipse.org/mat
If the heap dump provided to you was created with ``gcore`` (such as with ``gcore -o /tmp/app.core $app_pid``) rather than ``jmap``, you will need to convert the file before you can open it in MAT. Using ``app.core.13849`` as an example of the original 33 GB file, here is how you could convert it into a 26 GB ``app.core.13849.hprof`` file. Please note that this operation took almost 90 minutes:
``/usr/java7/bin/jmap -dump:format=b,file=app.core.13849.hprof /usr/java7/bin/java app.core.13849``
-A file of this size may not "just work" in MAT. When you attempt to open it you may see something like "An internal error occurred during: "Parsing heap dump from '/tmp/heapdumps/app.core.13849.hprof'". Java heap space". If so, you will need to increase the memory allocated to MAT. On Mac OS X, this can be done by editing ``MemoryAnalyzer.app/Contents/MacOS/MemoryAnalyzer.ini`` and increasing the value "-Xmx1024m" until it's high enough to open the file. See also http://wiki.eclipse.org/index.php/MemoryAnalyzer/FAQ#Out_of_Memory_Error_while_Running_the_Memory_Analyzer
+A file of this size may not "just work" in MAT. When you attempt to open it you may see something like "An internal error occurred during: "Parsing heap dump from '/tmp/heapdumps/app.core.13849.hprof'". Java heap space". If so, you will need to increase the memory allocated to MAT. On Mac OS X, this can be done by editing ``MemoryAnalyzer.app/Contents/MacOS/MemoryAnalyzer.ini`` and increasing the value "-Xmx1024m" until it's high enough to open the file. See also https://wiki.eclipse.org/index.php/MemoryAnalyzer/FAQ#Out_of_Memory_Error_while_Running_the_Memory_Analyzer
PageKite
++++++++
@@ -58,7 +58,7 @@ The first time you run ``./pagekite.py`` a file at ``~/.pagekite.rc`` will be
created. You can edit this file to configure PageKite to serve up port 8080
(the default app server HTTP port) or the port of your choosing.
-According to https://pagekite.net/support/free-for-foss/ PageKite (very generously!) offers free accounts to developers writing software the meets http://opensource.org/docs/definition.php such as the Dataverse Project.
+According to https://pagekite.net/support/free-for-foss/ PageKite (very generously!) offers free accounts to developers writing software that meets https://opensource.org/docs/definition.php, such as the Dataverse Project.
MSV
+++
diff --git a/doc/sphinx-guides/source/developers/unf/index.rst b/doc/sphinx-guides/source/developers/unf/index.rst
index 2423877348f..856de209e82 100644
--- a/doc/sphinx-guides/source/developers/unf/index.rst
+++ b/doc/sphinx-guides/source/developers/unf/index.rst
@@ -27,7 +27,7 @@ with Dataverse Software 2.0 and throughout the 3.* lifecycle, UNF v.5
UNF v.6. Two parallel implementations, in R and Java, will be
available for cross-validation.
-Learn more: Micah Altman and Gary King. 2007. “A Proposed Standard for the Scholarly Citation of Quantitative Data.” D-Lib Magazine, 13. Publisher’s Version Copy at http://j.mp/2ovSzoT
+Learn more: Micah Altman and Gary King. 2007. “A Proposed Standard for the Scholarly Citation of Quantitative Data.” D-Lib Magazine, 13. Publisher’s Version Copy at https://j.mp/2ovSzoT
**Contents:**
diff --git a/doc/sphinx-guides/source/developers/unf/unf-v3.rst b/doc/sphinx-guides/source/developers/unf/unf-v3.rst
index 3f0018d7fa5..98c07b398e0 100644
--- a/doc/sphinx-guides/source/developers/unf/unf-v3.rst
+++ b/doc/sphinx-guides/source/developers/unf/unf-v3.rst
@@ -34,11 +34,11 @@ For example, the number pi at five digits is represented as -3.1415e+, and the n
1. Terminate character strings representing nonmissing values with a POSIX end-of-line character.
-2. Encode each character string with `Unicode bit encoding `_. Versions 3 through 4 use UTF-32BE; Version 4.1 uses UTF-8.
+2. Encode each character string with `Unicode bit encoding `_. Versions 3 through 4 use UTF-32BE; Version 4.1 uses UTF-8.
3. Combine the vector of character strings into a single sequence, with each character string separated by a POSIX end-of-line character and a null byte.
-4. Compute a hash on the resulting sequence using the standard MD5 hashing algorithm for Version 3 and using `SHA256 `_ for Version 4. The resulting hash is `base64 `_ encoded to support readability.
+4. Compute a hash on the resulting sequence using the standard MD5 hashing algorithm for Version 3 and using `SHA256 `_ for Version 4. The resulting hash is `base64 `_ encoded to support readability.
5. Calculate the UNF for each lower-level data object, using a consistent UNF version and level of precision across the individual UNFs being combined.
@@ -49,4 +49,4 @@ For example, the number pi at five digits is represented as -3.1415e+, and the n
8. Combine UNFs from multiple variables to form a single UNF for an entire data frame, and then combine UNFs for a set of data frames to form a single UNF that represents an entire research study.
Learn more:
-Software for computing UNFs is available in an R Module, which includes a Windows standalone tool and code for Stata and SAS languages. Also see the following for more details: Micah Altman and Gary King. 2007. "A Proposed Standard for the Scholarly Citation of Quantitative Data," D-Lib Magazine, Vol. 13, No. 3/4 (March). (Abstract: `HTML `_ | Article: `PDF `_)
+Software for computing UNFs is available in an R Module, which includes a Windows standalone tool and code for Stata and SAS languages. Also see the following for more details: Micah Altman and Gary King. 2007. "A Proposed Standard for the Scholarly Citation of Quantitative Data," D-Lib Magazine, Vol. 13, No. 3/4 (March). (Abstract: `HTML `_ | Article: `PDF `_)
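The hashing steps above (terminate, encode, concatenate, digest, base64-encode) can be sketched in Python. This is a simplified illustration of the flow, not the official UNF implementation, and ``unf_v3_sketch`` is a hypothetical helper name:

```python
import base64
import hashlib

def unf_v3_sketch(values):
    # Steps 1-3 above: terminate each normalized, nonmissing string with a
    # POSIX end-of-line character, encode it (UTF-32BE for Versions 3-4),
    # and join the encoded strings separated by a null byte.
    data = b"\x00".join((v + "\n").encode("utf-32-be") for v in values)
    # Step 4: MD5 digest for Version 3, base64-encoded to support readability.
    return base64.b64encode(hashlib.md5(data).digest()).decode("ascii")
```

For Version 4, ``hashlib.sha256`` would replace ``hashlib.md5`` in this sketch.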
diff --git a/doc/sphinx-guides/source/developers/unf/unf-v6.rst b/doc/sphinx-guides/source/developers/unf/unf-v6.rst
index 9648bae47c8..b2495ff3dd9 100644
--- a/doc/sphinx-guides/source/developers/unf/unf-v6.rst
+++ b/doc/sphinx-guides/source/developers/unf/unf-v6.rst
@@ -156,7 +156,7 @@ For example, to specify a non-default precision the parameter it is specified us
| Allowed values are {``128`` , ``192`` , ``196`` , ``256``} with ``128`` being the default.
| ``R1`` - **truncate** numeric values to ``N`` digits, **instead of rounding**, as previously described.
-`Dr. Micah Altman's classic UNF v5 paper `_ mentions another optional parameter ``T###``, for specifying rounding of date and time values (implemented as stripping the values of entire components - fractional seconds, seconds, minutes, hours... etc., progressively) - but it doesn't specify its syntax. It is left as an exercise for a curious reader to contact the author and work out the details, if so desired. (Not implemented in UNF Version 6 by the Dataverse Project).
+`Dr. Micah Altman's classic UNF v5 paper `_ mentions another optional parameter ``T###``, for specifying rounding of date and time values (implemented as stripping the values of entire components - fractional seconds, seconds, minutes, hours... etc., progressively) - but it doesn't specify its syntax. It is left as an exercise for a curious reader to contact the author and work out the details, if so desired. (Not implemented in UNF Version 6 by the Dataverse Project).
Note: we do not recommend truncating character strings at fewer bytes than the default ``128`` (the ``X`` parameter). At the very least this number **must** be high enough so that the printable UNFs of individual variables or files are not truncated, when calculating combined UNFs of files or datasets, respectively.
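The difference between the default rounding and the ``R1`` truncation option described above can be illustrated with a short Python sketch (``to_n_digits`` is a hypothetical helper, not part of any UNF library):

```python
from decimal import Context, ROUND_DOWN, ROUND_HALF_EVEN

def to_n_digits(x, n, truncate=False):
    # Render x at n significant digits: rounding by default, or
    # truncating when the R1-style behavior is requested.
    rounding = ROUND_DOWN if truncate else ROUND_HALF_EVEN
    return str(Context(prec=n, rounding=rounding).create_decimal(repr(x)))
```

For example, at five digits ``3.14159`` rounds to ``3.1416`` but truncates to ``3.1415``.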
diff --git a/doc/sphinx-guides/source/developers/version-control.rst b/doc/sphinx-guides/source/developers/version-control.rst
index aacc245af5a..31fc0a4e602 100644
--- a/doc/sphinx-guides/source/developers/version-control.rst
+++ b/doc/sphinx-guides/source/developers/version-control.rst
@@ -24,7 +24,7 @@ The goals of the Dataverse Software branching strategy are:
- allow for concurrent development
- only ship stable code
-We follow a simplified "git flow" model described at http://nvie.com/posts/a-successful-git-branching-model/ involving a "master" branch, a "develop" branch, and feature branches such as "1234-bug-fix".
+We follow a simplified "git flow" model described at https://nvie.com/posts/a-successful-git-branching-model/ involving a "master" branch, a "develop" branch, and feature branches such as "1234-bug-fix".
Branches
~~~~~~~~
diff --git a/doc/sphinx-guides/source/index.rst b/doc/sphinx-guides/source/index.rst
index f6eda53d718..e4eeea9b6d0 100755
--- a/doc/sphinx-guides/source/index.rst
+++ b/doc/sphinx-guides/source/index.rst
@@ -45,7 +45,7 @@ Other Resources
Additional information about the Dataverse Project itself
including presentations, information about upcoming releases, data
management and citation, and announcements can be found at
-`http://dataverse.org/ `__
+`https://dataverse.org/ <https://dataverse.org/>`__
**User Group**
@@ -68,7 +68,7 @@ The support email address is `support@dataverse.org `__
-or use `GitHub pull requests `__,
+or use `GitHub pull requests `__,
if you have some code, scripts or documentation that you'd like to share.
If you have a **security issue** to report, please email `security@dataverse.org `__. See also :ref:`reporting-security-issues`.
diff --git a/doc/sphinx-guides/source/installation/config.rst b/doc/sphinx-guides/source/installation/config.rst
index 1cc26debc64..13a7367de44 100644
--- a/doc/sphinx-guides/source/installation/config.rst
+++ b/doc/sphinx-guides/source/installation/config.rst
@@ -142,7 +142,7 @@ The need to redirect port HTTP (port 80) to HTTPS (port 443) for security has al
Your decision to proxy or not should primarily be driven by which features of the Dataverse Software you'd like to use. If you'd like to use Shibboleth, the decision is easy because proxying or "fronting" Payara with Apache is required. The details are covered in the :doc:`shibboleth` section.
-Even if you have no interest in Shibboleth, you may want to front your Dataverse installation with Apache or nginx to simply the process of installing SSL certificates. There are many tutorials on the Internet for adding certs to Apache, including a some `notes used by the Dataverse Project team `_, but the process of adding a certificate to Payara is arduous and not for the faint of heart. The Dataverse Project team cannot provide much help with adding certificates to Payara beyond linking to `tips `_ on the web.
+Even if you have no interest in Shibboleth, you may want to front your Dataverse installation with Apache or nginx to simplify the process of installing SSL certificates. There are many tutorials on the Internet for adding certs to Apache, including some `notes used by the Dataverse Project team `_, but the process of adding a certificate to Payara is arduous and not for the faint of heart. The Dataverse Project team cannot provide much help with adding certificates to Payara beyond linking to `tips `_ on the web.
Still not convinced you should put Payara behind another web server? Even if you manage to get your SSL certificate into Payara, how are you going to run Payara on low ports such as 80 and 443? Are you going to run Payara as root? Bad idea. This is a security risk. Under "Additional Recommendations" under "Securing Your Installation" above you are advised to configure Payara to run as a user other than root.
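As a rough illustration of the fronting approach discussed above, an Apache virtual host that terminates SSL and proxies to Payara might look like the sketch below. The hostname and certificate paths are placeholders, and ``mod_ssl`` and ``mod_proxy`` are assumed to be enabled:

```apacheconf
# illustrative sketch only: Apache owns ports 80/443, Payara stays on 8080
<VirtualHost *:443>
    ServerName dataverse.example.edu
    SSLEngine on
    SSLCertificateFile    /etc/pki/tls/certs/dataverse.example.edu.crt
    SSLCertificateKeyFile /etc/pki/tls/private/dataverse.example.edu.key
    ProxyPass        / http://localhost:8080/
    ProxyPassReverse / http://localhost:8080/
</VirtualHost>
```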
@@ -154,7 +154,7 @@ If you really don't want to front Payara with any proxy (not recommended), you c
``./asadmin set server-config.network-config.network-listeners.network-listener.http-listener-2.port=443``
-What about port 80? Even if you don't front your Dataverse installation with Apache, you may want to let Apache run on port 80 just to rewrite HTTP to HTTPS as described above. You can use a similar command as above to change the HTTP port that Payara uses from 8080 to 80 (substitute ``http-listener-1.port=80``). Payara can be used to enforce HTTPS on its own without Apache, but configuring this is an exercise for the reader. Answers here may be helpful: http://stackoverflow.com/questions/25122025/glassfish-v4-java-7-port-unification-error-not-able-to-redirect-http-to
+What about port 80? Even if you don't front your Dataverse installation with Apache, you may want to let Apache run on port 80 just to rewrite HTTP to HTTPS as described above. You can use a similar command as above to change the HTTP port that Payara uses from 8080 to 80 (substitute ``http-listener-1.port=80``). Payara can be used to enforce HTTPS on its own without Apache, but configuring this is an exercise for the reader. Answers here may be helpful: https://stackoverflow.com/questions/25122025/glassfish-v4-java-7-port-unification-error-not-able-to-redirect-http-to
If you are running an installation with Apache and Payara on the same server, and would like to restrict Payara from responding to any requests to port 8080 from external hosts (in other words, not through Apache), you can restrict the AJP listener to localhost only with:
@@ -278,7 +278,7 @@ change the ``:Protocol`` setting, as it defaults to DOI usage.
- :ref:`:IndependentHandleService <:IndependentHandleService>` (optional)
- :ref:`:HandleAuthHandle <:HandleAuthHandle>` (optional)
-Note: If you are **minting your own handles** and plan to set up your own handle service, please refer to `Handle.Net documentation `_.
+Note: If you are **minting your own handles** and plan to set up your own handle service, please refer to `Handle.Net documentation `_.
.. _permalinks:
@@ -555,7 +555,7 @@ Multiple file stores should specify different directories (which would nominally
Swift Storage
+++++++++++++
-Rather than storing data files on the filesystem, you can opt for an experimental setup with a `Swift Object Storage `_ backend. Each dataset that users create gets a corresponding "container" on the Swift side, and each data file is saved as a file within that container.
+Rather than storing data files on the filesystem, you can opt for an experimental setup with a `Swift Object Storage `_ backend. Each dataset that users create gets a corresponding "container" on the Swift side, and each data file is saved as a file within that container.
**In order to configure a Swift installation,** you need to complete these steps to properly modify the JVM options:
@@ -571,7 +571,7 @@ First, run all the following create commands with your Swift endpoint informatio
./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files..username.endpoint1=your-username"
./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files..endpoint.endpoint1=your-swift-endpoint"
-``auth_type`` can either be ``keystone``, ``keystone_v3``, or it will assumed to be ``basic``. ``auth_url`` should be your keystone authentication URL which includes the tokens (e.g. for keystone, ``https://openstack.example.edu:35357/v2.0/tokens`` and for keystone_v3, ``https://openstack.example.edu:35357/v3/auth/tokens``). ``swift_endpoint`` is a URL that looks something like ``http://rdgw.swift.example.org/swift/v1``.
+``auth_type`` can be either ``keystone`` or ``keystone_v3``; otherwise it is assumed to be ``basic``. ``auth_url`` should be your keystone authentication URL, which includes the tokens (e.g. for keystone, ``https://openstack.example.edu:35357/v2.0/tokens`` and for keystone_v3, ``https://openstack.example.edu:35357/v3/auth/tokens``). ``swift_endpoint`` is a URL that looks something like ``https://rdgw.swift.example.org/swift/v1``.
Then create a password alias by running (without changes):
@@ -667,7 +667,7 @@ You'll need an AWS account with an associated S3 bucket for your installation to
**Make note** of the **bucket's name** and the **region** its data is hosted in.
To **create a user** with full S3 access and nothing more for security reasons, we recommend using IAM
-(Identity and Access Management). See `IAM User Guide `_
+(Identity and Access Management). See `IAM User Guide `_
for more info on this process.
To use programmatic access, **generate the user keys** needed for a Dataverse installation by clicking on the created user.
@@ -738,7 +738,7 @@ Additional profiles can be added to these files by appending the relevant inform
aws_access_key_id =
aws_secret_access_key =
-Place these two files in a folder named ``.aws`` under the home directory for the user running your Dataverse Installation on Payara. (From the `AWS Command Line Interface Documentation `_:
+Place these two files in a folder named ``.aws`` under the home directory for the user running your Dataverse Installation on Payara. (From the `AWS Command Line Interface Documentation `_:
"In order to separate credentials from less sensitive options, region and output format are stored in a separate file
named config in the same folder")
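Following the quoted note above, the companion ``config`` file might look like this (the region and output values are illustrative, not recommendations):

```ini
# ~/.aws/config (illustrative values)
[default]
region = us-east-1
output = json
```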
@@ -865,7 +865,7 @@ You may provide the values for these via any `supported MicroProfile Config API
Reported Working S3-Compatible Storage
######################################
-`Minio v2018-09-12 `_
+`Minio v2018-09-12 `_
Set ``dataverse.files..path-style-access=true``, as Minio uses path-based access. It works pretty smoothly and is easy to set up.
**Can be used for quick testing, too:** just use the example values above. Uses the public (read: insecure and
possibly slow) https://play.minio.io:9000 service.
@@ -2979,7 +2979,7 @@ Note: by default, the URL is composed from the settings ``:GuidesBaseUrl`` and `
:GuidesBaseUrl
++++++++++++++
-Set ``:GuidesBaseUrl`` to override the default value "http://guides.dataverse.org". If you are interested in writing your own version of the guides, you may find the :doc:`/developers/documentation` section of the Developer Guide helpful.
+Set ``:GuidesBaseUrl`` to override the default value "https://guides.dataverse.org". If you are interested in writing your own version of the guides, you may find the :doc:`/developers/documentation` section of the Developer Guide helpful.
``curl -X PUT -d http://dataverse.example.edu http://localhost:8080/api/admin/settings/:GuidesBaseUrl``
@@ -3000,14 +3000,14 @@ Set ``:NavbarSupportUrl`` to a fully-qualified URL which will be used for the "S
Note that this will override the default behaviour for the "Support" menu option, which is to display the Dataverse collection 'feedback' dialog.
-``curl -X PUT -d http://dataverse.example.edu/supportpage.html http://localhost:8080/api/admin/settings/:NavbarSupportUrl``
+``curl -X PUT -d https://dataverse.example.edu/supportpage.html http://localhost:8080/api/admin/settings/:NavbarSupportUrl``
:MetricsUrl
+++++++++++
Make the metrics component on the root Dataverse collection a clickable link to a website where you present metrics on your Dataverse installation, perhaps one of the community-supported tools mentioned in the :doc:`/admin/reporting-tools-and-queries` section of the Admin Guide.
-``curl -X PUT -d http://metrics.dataverse.example.edu http://localhost:8080/api/admin/settings/:MetricsUrl``
+``curl -X PUT -d https://metrics.dataverse.example.edu http://localhost:8080/api/admin/settings/:MetricsUrl``
.. _:MaxFileUploadSizeInBytes:
diff --git a/doc/sphinx-guides/source/installation/installation-main.rst b/doc/sphinx-guides/source/installation/installation-main.rst
index 021a97415e3..46c1b0b0af3 100755
--- a/doc/sphinx-guides/source/installation/installation-main.rst
+++ b/doc/sphinx-guides/source/installation/installation-main.rst
@@ -100,7 +100,7 @@ The supplied site URL will be saved under the JVM option :ref:`dataverse.siteUrl
The Dataverse Software uses JHOVE_ to help identify the file format (CSV, PNG, etc.) for files that users have uploaded. The installer places files called ``jhove.conf`` and ``jhoveConfig.xsd`` into the directory ``/usr/local/payara6/glassfish/domains/domain1/config`` by default and makes adjustments to the jhove.conf file based on the directory into which you chose to install Payara.
-.. _JHOVE: http://jhove.openpreservation.org
+.. _JHOVE: https://jhove.openpreservation.org
Logging In
----------
@@ -120,7 +120,7 @@ Use the following credentials to log in:
- username: dataverseAdmin
- password: admin
-Congratulations! You have a working Dataverse installation. Soon you'll be tweeting at `@dataverseorg `_ asking to be added to the map at http://dataverse.org :)
+Congratulations! You have a working Dataverse installation. Soon you'll be tweeting at `@dataverseorg `_ asking to be added to the map at https://dataverse.org :)
Trouble? See if you find an answer in the troubleshooting section below.
@@ -204,7 +204,7 @@ Be sure you save the changes made here and then restart your Payara server to te
UnknownHostException While Deploying
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-If you are seeing "Caused by: java.net.UnknownHostException: myhost: Name or service not known" in server.log and your hostname is "myhost" the problem is likely that "myhost" doesn't appear in ``/etc/hosts``. See also http://stackoverflow.com/questions/21817809/glassfish-exception-during-deployment-project-with-stateful-ejb/21850873#21850873
+If you are seeing "Caused by: java.net.UnknownHostException: myhost: Name or service not known" in server.log and your hostname is "myhost" the problem is likely that "myhost" doesn't appear in ``/etc/hosts``. See also https://stackoverflow.com/questions/21817809/glassfish-exception-during-deployment-project-with-stateful-ejb/21850873#21850873
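For a hostname of "myhost", the missing ``/etc/hosts`` entry might look like the line below (the loopback address assumes the name only needs to resolve locally):

```
127.0.0.1   localhost myhost
```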
.. _fresh-reinstall:
diff --git a/doc/sphinx-guides/source/installation/intro.rst b/doc/sphinx-guides/source/installation/intro.rst
index 67fc774bdbd..6d77a1209b2 100644
--- a/doc/sphinx-guides/source/installation/intro.rst
+++ b/doc/sphinx-guides/source/installation/intro.rst
@@ -2,7 +2,7 @@
Introduction
============
-Welcome! Thanks for installing `The Dataverse Project `_!
+Welcome! Thanks for installing `The Dataverse Project `_!
.. contents:: |toctitle|
:local:
@@ -36,7 +36,7 @@ Getting Help
To get help installing or configuring a Dataverse installation, please try one or more of:
- posting to the `dataverse-community `_ Google Group.
-- asking at http://chat.dataverse.org
+- asking at https://chat.dataverse.org
- emailing support@dataverse.org to open a private ticket at https://help.hmdc.harvard.edu
Information to Send to Support When Installation Fails
diff --git a/doc/sphinx-guides/source/installation/oauth2.rst b/doc/sphinx-guides/source/installation/oauth2.rst
index 8dffde87cc2..7a0e938b572 100644
--- a/doc/sphinx-guides/source/installation/oauth2.rst
+++ b/doc/sphinx-guides/source/installation/oauth2.rst
@@ -11,7 +11,7 @@ As explained under "Auth Modes" in the :doc:`config` section, OAuth2 is one of t
`OAuth2 `_ is an authentication protocol that allows systems to share user data, while letting the users control what data is being shared. When you see buttons stating "login with Google" or "login through Facebook", OAuth2 is probably involved. For the purposes of this section, we will shorten "OAuth2" to just "OAuth." OAuth can be compared and contrasted with :doc:`shibboleth`.
-The Dataverse Software supports four OAuth providers: `ORCID `_, `Microsoft Azure Active Directory (AD) `_, `GitHub `_, and `Google `_.
+The Dataverse Software supports four OAuth providers: `ORCID `_, `Microsoft Azure Active Directory (AD) `_, `GitHub `_, and `Google `_.
In addition :doc:`oidc` are supported, using a standard based on OAuth2.
diff --git a/doc/sphinx-guides/source/installation/oidc.rst b/doc/sphinx-guides/source/installation/oidc.rst
index 34710f260e8..d132fd2953d 100644
--- a/doc/sphinx-guides/source/installation/oidc.rst
+++ b/doc/sphinx-guides/source/installation/oidc.rst
@@ -51,7 +51,7 @@ Just like with :doc:`oauth2` you need to obtain a *Client ID* and a *Client Secr
You need to apply for credentials out-of-band.
The Dataverse installation will discover all necessary metadata for a given provider on its own (this is `part of the standard
-`_).
+`_).
To enable this, you need to specify an *Issuer URL* when creating the configuration for your provider (see below).
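The discovery mechanism referenced above means the provider's metadata is published at a well-known path under the Issuer URL, which a short Python sketch can show (``discovery_url`` is a hypothetical helper, not part of the Dataverse codebase):

```python
def discovery_url(issuer):
    # Per OpenID Connect Discovery, provider metadata lives at the
    # .well-known/openid-configuration path under the issuer URL.
    return issuer.rstrip("/") + "/.well-known/openid-configuration"
```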
diff --git a/doc/sphinx-guides/source/installation/prerequisites.rst b/doc/sphinx-guides/source/installation/prerequisites.rst
index cff20ccab1d..afba25b4d73 100644
--- a/doc/sphinx-guides/source/installation/prerequisites.rst
+++ b/doc/sphinx-guides/source/installation/prerequisites.rst
@@ -26,7 +26,7 @@ Installing Java
The Dataverse Software should run fine with only the Java Runtime Environment (JRE) installed, but installing the Java Development Kit (JDK) is recommended so that useful tools for troubleshooting production environments are available. We recommend using Oracle JDK or OpenJDK.
-The Oracle JDK can be downloaded from http://www.oracle.com/technetwork/java/javase/downloads/index.html
+The Oracle JDK can be downloaded from https://www.oracle.com/technetwork/java/javase/downloads/index.html
On a RHEL/derivative, install OpenJDK (devel version) using yum::
@@ -265,7 +265,7 @@ Installing jq
or you may install it manually::
# cd /usr/bin
- # wget http://stedolan.github.io/jq/download/linux64/jq
+ # wget https://stedolan.github.io/jq/download/linux64/jq
# chmod +x jq
# jq --version
diff --git a/doc/sphinx-guides/source/installation/shibboleth.rst b/doc/sphinx-guides/source/installation/shibboleth.rst
index cd0fbda77a6..3a2e1b99c70 100644
--- a/doc/sphinx-guides/source/installation/shibboleth.rst
+++ b/doc/sphinx-guides/source/installation/shibboleth.rst
@@ -76,7 +76,7 @@ A ``jk-connector`` network listener should have already been set up when you ran
You can verify this with ``./asadmin list-network-listeners``.
-This enables the `AJP protocol `_ used in Apache configuration files below.
+This enables the `AJP protocol `_ used in Apache configuration files below.
SSLEngine Warning Workaround
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -93,7 +93,7 @@ Configure Apache
Enforce HTTPS
~~~~~~~~~~~~~
-To prevent attacks such as `FireSheep `_, HTTPS should be enforced. https://wiki.apache.org/httpd/RewriteHTTPToHTTPS provides a good method. You **could** copy and paste that those "rewrite rule" lines into Apache's main config file at ``/etc/httpd/conf/httpd.conf`` but using Apache's "virtual hosts" feature is recommended so that you can leave the main configuration file alone and drop a host-specific file into place.
+To prevent attacks such as `FireSheep `_, HTTPS should be enforced. https://wiki.apache.org/httpd/RewriteHTTPToHTTPS provides a good method. You **could** copy and paste those "rewrite rule" lines into Apache's main config file at ``/etc/httpd/conf/httpd.conf``, but using Apache's "virtual hosts" feature is recommended so that you can leave the main configuration file alone and drop a host-specific file into place.
Below is an example of how "rewrite rule" lines look within a ``VirtualHost`` block. Download a :download:`sample file <../_static/installation/files/etc/httpd/conf.d/dataverse.example.edu.conf>` , edit it to substitute your own hostname under ``ServerName``, and place it at ``/etc/httpd/conf.d/dataverse.example.edu.conf`` or a filename that matches your hostname. The file must be in ``/etc/httpd/conf.d`` and must end in ".conf" to be included in Apache's configuration.
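As a sketch of how such rewrite rules sit inside a ``VirtualHost`` block (the hostname is a placeholder; the downloadable sample file mentioned above is the authoritative version):

```apacheconf
<VirtualHost *:80>
    ServerName dataverse.example.edu
    RewriteEngine On
    # redirect all plain-HTTP requests to HTTPS
    RewriteRule ^/(.*)$ https://dataverse.example.edu/$1 [R=301,L]
</VirtualHost>
```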
@@ -235,7 +235,7 @@ Run semodule
Silent is golden. No output is expected. This will place a file in ``/etc/selinux/targeted/modules/active/modules/shibboleth.pp`` and include "shibboleth" in the output of ``semodule -l``. See the ``semodule`` man page if you ever want to remove or disable the module you just added.
-Congrats! You've made the creator of http://stopdisablingselinux.com proud. :)
+Congrats! You've made the creator of https://stopdisablingselinux.com proud. :)
Restart Apache and Shibboleth
-----------------------------
diff --git a/doc/sphinx-guides/source/style/foundations.rst b/doc/sphinx-guides/source/style/foundations.rst
index 31e0c314a05..cc193666868 100755
--- a/doc/sphinx-guides/source/style/foundations.rst
+++ b/doc/sphinx-guides/source/style/foundations.rst
@@ -9,7 +9,7 @@ Foundation elements are the very basic building blocks to create a page in Datav
Grid Layout
===========
-`Bootstrap `__ provides a responsive, fluid, 12-column grid system that we use to organize our page layouts.
+`Bootstrap `__ provides a responsive, fluid, 12-column grid system that we use to organize our page layouts.
We use the fixed-width ``.container`` class which provides responsive widths (i.e. auto, 750px, 970px or 1170px) based on media queries for the page layout, with a series of rows and columns for the content.
@@ -42,7 +42,7 @@ The grid layout uses ``.col-sm-*`` classes for horizontal groups of columns, ins
Typography
==========
-The typeface, text size, and line-height are set in the `Bootstrap CSS `__. We use Bootstrap's global default ``font-size`` of **14px**, with a ``line-height`` of **1.428**, which is applied to the ```` and all paragraphs.
+The typeface, text size, and line-height are set in the `Bootstrap CSS `__. We use Bootstrap's global default ``font-size`` of **14px**, with a ``line-height`` of **1.428**, which is applied to the ```` and all paragraphs.
.. code-block:: css
@@ -57,7 +57,7 @@ The typeface, text size, and line-height are set in the `Bootstrap CSS `__
-The default color palette is set in the `Bootstrap CSS `__. It provides the background, border, text and link colors used across the application.
+The default color palette is set in the `Bootstrap CSS `__. It provides the background, border, text and link colors used across the application.
Brand Colors
@@ -138,7 +138,7 @@ We use our brand color, a custom burnt orange ``{color:#C55B28;}``, which is set
Text Colors
-----------
-Text color is the default setting from `Bootstrap CSS `__.
+Text color is the default setting from `Bootstrap CSS `__.
.. code-block:: css
@@ -163,7 +163,7 @@ Text color is the default setting from `Bootstrap CSS `__
-Link color is the default setting from `Bootstrap CSS `__. The hover state color is set to 15% darker.
+Link color is the default setting from `Bootstrap CSS `__. The hover state color is set to 15% darker.
**Please note**, there is a CSS override issue with the link color due to the use of both a Bootstrap stylesheet and a PrimeFaces stylesheet in the UI. We've added CSS such as ``.ui-widget-content a {color: #428BCA;}`` to our stylesheet to keep the link color consistent.
@@ -204,7 +204,7 @@ Link color is the default setting from `Bootstrap CSS `__
-Contextual classes from `Bootstrap CSS `__ can be used to style background and text colors. Semantic colors include various colors assigned to meaningful contextual values. We convey meaning through color with a handful of emphasis utility classes.
+Contextual classes from `Bootstrap CSS `__ can be used to style background and text colors. Semantic colors include various colors assigned to meaningful contextual values. We convey meaning through color with a handful of emphasis utility classes.
.. raw:: html
@@ -259,7 +259,7 @@ We use various icons across the application, which we get from Bootstrap, FontCu
Bootstrap Glyphicons
--------------------
-There are over 250 glyphs in font format from the Glyphicon Halflings set provided by `Bootstrap `__. We utilize these mainly as icons inside of buttons and in message blocks.
+There are over 250 glyphs in font format from the Glyphicon Halflings set provided by `Bootstrap `__. We utilize these mainly as icons inside of buttons and in message blocks.
.. raw:: html
@@ -305,7 +305,7 @@ The :doc:`/developers/fontcustom` section of the Developer Guide explains how to
Socicon Icon Font
-----------------
-We use `Socicon `__ for our custom social icons. In the footer we use icons for Twitter and Github. In our Share feature, we also use custom social icons to allow users to select from a list of social media channels.
+We use `Socicon `__ for our custom social icons. In the footer we use icons for Twitter and GitHub. In our Share feature, we also use custom social icons to allow users to select from a list of social media channels.
.. raw:: html
diff --git a/doc/sphinx-guides/source/style/patterns.rst b/doc/sphinx-guides/source/style/patterns.rst
index e96f17dc2ec..c6602ffa26e 100644
--- a/doc/sphinx-guides/source/style/patterns.rst
+++ b/doc/sphinx-guides/source/style/patterns.rst
@@ -1,7 +1,7 @@
Patterns
++++++++
-Patterns are what emerge when using the foundation elements together with basic objects like buttons and alerts, more complex Javascript components from `Bootstrap `__ like tooltips and dropdowns, and AJAX components from `PrimeFaces `__ like datatables and commandlinks.
+Patterns are what emerge when using the foundation elements together with basic objects like buttons and alerts, more complex JavaScript components from `Bootstrap `__ like tooltips and dropdowns, and AJAX components from `PrimeFaces `__ like datatables and commandlinks.
.. contents:: |toctitle|
:local:
@@ -9,7 +9,7 @@ Patterns are what emerge when using the foundation elements together with basic
Navbar
======
-The `Navbar component `__ from Bootstrap spans the top of the application and contains the logo/branding, aligned to the left, plus search form and links, aligned to the right.
+The `Navbar component `__ from Bootstrap spans the top of the application and contains the logo/branding, aligned to the left, plus search form and links, aligned to the right.
When logged in, the account name is a dropdown menu, linking the user to account-specific content and the log out link.
@@ -74,7 +74,7 @@ When logged in, the account name is a dropdown menu, linking the user to account
Breadcrumbs
===========
-The breadcrumbs are displayed under the header, and provide a trail of links for users to navigate the hierarchy of containing objects, from file to dataset to Dataverse collection. It utilizes a JSF `repeat component `_ to iterate through the breadcrumbs.
+The breadcrumbs are displayed under the header, and provide a trail of links for users to navigate the hierarchy of containing objects, from file to dataset to Dataverse collection. It utilizes a JSF `repeat component `_ to iterate through the breadcrumbs.
.. raw:: html
@@ -108,7 +108,7 @@ The breadcrumbs are displayed under the header, and provide a trail of links for
Tables
======
-Most tables use the `DataTable components `__ from PrimeFaces and are styled using the `Tables component `__ from Bootstrap.
+Most tables use the `DataTable components `__ from PrimeFaces and are styled using the `Tables component `__ from Bootstrap.
.. raw:: html
@@ -187,7 +187,7 @@ Most tables use the `DataTable components `__
-Forms fulfill various functions across the site, but we try to style them consistently. We use the ``.form-horizontal`` layout, which uses ``.form-group`` to create a grid of rows for the labels and inputs. The consistent style of forms is maintained using the `Forms component `__ from Bootstrap. Form elements like the `InputText component `__ from PrimeFaces are kept looking clean and consistent across each page.
+Forms fulfill various functions across the site, but we try to style them consistently. We use the ``.form-horizontal`` layout, which uses ``.form-group`` to create a grid of rows for the labels and inputs. The consistent style of forms is maintained using the `Forms component `__ from Bootstrap. Form elements like the `InputText component `__ from PrimeFaces are kept looking clean and consistent across each page.
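The ``.form-horizontal``/``.form-group`` grid described above looks roughly like this in Bootstrap 3 (the field name and column widths are placeholders):

```html
<form class="form-horizontal">
  <div class="form-group">
    <!-- label and input occupy grid columns in the same row -->
    <label for="title" class="col-sm-3 control-label">Title</label>
    <div class="col-sm-9">
      <input type="text" id="title" class="form-control" />
    </div>
  </div>
</form>
```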
.. raw:: html
@@ -289,7 +289,7 @@ Here are additional form elements that are common across many pages, including r
Buttons
=======
-There are various types of buttons for various actions, so we have many components to use, including the `CommandButton component `__ and `CommandLink component `__ from PrimeFaces, as well as the basic JSF `Link component `__ and `OutputLink component `__. Those are styled using the `Buttons component `__, `Button Groups component `__ and `Buttons Dropdowns component `__ from Bootstrap.
+There are various types of buttons for various actions, so we have many components to use, including the `CommandButton component `__ and `CommandLink component `__ from PrimeFaces, as well as the basic JSF `Link component `__ and `OutputLink component `__. Those are styled using the `Buttons component `__, `Button Groups component `__ and `Buttons Dropdowns component `__ from Bootstrap.
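As a rough illustration (not taken from the Dataverse source), a PrimeFaces command button or link can be given the Bootstrap button classes via ``styleClass``; the action methods here are hypothetical.

```xml
<!-- PrimeFaces button styled with Bootstrap classes -->
<p:commandButton value="Save Changes" styleClass="btn btn-primary"
                 action="#{datasetPage.save}" update="@form"/>
<p:commandLink value="Cancel" styleClass="btn btn-link"
               action="#{datasetPage.cancel}" immediate="true"/>
```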
Action Buttons
--------------
@@ -668,7 +668,7 @@ Another variation of icon-only buttons uses the ``.btn-link`` style class from B
Pagination
==========
-We use the `Pagination component `__ from Bootstrap for paging through search results.
+We use the `Pagination component `__ from Bootstrap for paging through search results.
.. raw:: html
@@ -738,7 +738,7 @@ We use the `Pagination component `__ from Bootstrap is used for publication status (DRAFT, In Review, Unpublished, Deaccessioned), and Dataset version, as well as Tabular Data Tags (Survey, Time Series, Panel, Event, Genomics, Network, Geospatial).
+The `Labels component `__ from Bootstrap is used for publication status (DRAFT, In Review, Unpublished, Deaccessioned), and Dataset version, as well as Tabular Data Tags (Survey, Time Series, Panel, Event, Genomics, Network, Geospatial).
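In Bootstrap 3, these are spans with the ``.label`` class plus a contextual variant; the pairings of status to variant below are illustrative, not necessarily the ones Dataverse uses.

```html
<span class="label label-primary">DRAFT</span>
<span class="label label-success">In Review</span>
<span class="label label-default">Survey</span>
```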
.. raw:: html
@@ -768,7 +768,7 @@ The `Labels component `__ from Boots
Alerts
======
-For our help/information, success, warning, and error message blocks we use a custom built UI component based on the `Alerts component `__ from Bootstrap.
+For our help/information, success, warning, and error message blocks we use a custom built UI component based on the `Alerts component `__ from Bootstrap.
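Underneath, a Bootstrap 3 alert block is a ``div`` with ``.alert`` plus a contextual class; the message text below is invented.

```html
<div class="alert alert-success" role="alert">Success! The dataset has been published.</div>
<div class="alert alert-danger" role="alert">Error. The file could not be uploaded.</div>
```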
.. raw:: html
@@ -859,9 +859,9 @@ Style classes can be added to ``p``, ``div``, ``span`` and other elements to add
Images
======
-For images, we use the `GraphicImage component `__ from PrimeFaces, or the basic JSF `GraphicImage component `__.
+For images, we use the `GraphicImage component `__ from PrimeFaces, or the basic JSF `GraphicImage component `__.
-To display images in a responsive way, they are styled with ``.img-responsive``, an `Images CSS class `__ from Bootstrap.
+To display images in a responsive way, they are styled with ``.img-responsive``, an `Images CSS class `__ from Bootstrap.
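For reference, both approaches render to similar HTML, and ``.img-responsive`` can be applied to either; the image paths below are placeholders.

```xml
<!-- PrimeFaces graphic image with the Bootstrap responsive class -->
<p:graphicImage value="/resources/images/dataverse-logo.png" styleClass="img-responsive"/>
<!-- plain HTML equivalent -->
<img src="/resources/images/dataverse-logo.png" class="img-responsive" alt="Dataverse logo"/>
```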
.. raw:: html
@@ -879,7 +879,7 @@ To display images in a responsive way, they are styled with ``.img-responsive``,
Panels
======
-The most common of our containers, the `Panels component `__ from Bootstrap is used to add a border and padding around sections of content like metadata blocks. Displayed with a header and/or footer, it can also be used with the `Collapse plugin `__ from Bootstrap.
+The most common of our containers, the `Panels component `__ from Bootstrap is used to add a border and padding around sections of content like metadata blocks. Displayed with a header and/or footer, it can also be used with the `Collapse plugin `__ from Bootstrap.
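A collapsible metadata-block panel along these lines (the id, heading, and body text are placeholders):

```html
<div class="panel panel-default">
  <div class="panel-heading">
    <!-- toggles the panel body via the Bootstrap Collapse plugin -->
    <a data-toggle="collapse" href="#citationMetadata">Citation Metadata</a>
  </div>
  <div id="citationMetadata" class="panel-collapse collapse in">
    <div class="panel-body">Title, Author, Description, ...</div>
  </div>
</div>
```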
.. raw:: html
@@ -943,7 +943,7 @@ Tabs
Tabs are used to provide content panes on a page that allow the user to view different sections of content without navigating to a different page.
-We use the `TabView component `__ from PrimeFaces, which is styled using the `Tab component `__ from Bootstrap.
+We use the `TabView component `__ from PrimeFaces, which is styled using the `Tab component `__ from Bootstrap.
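A minimal TabView sketch; the tab titles are illustrative.

```xml
<p:tabView>
    <p:tab title="Files">
        <!-- file listing content pane -->
    </p:tab>
    <p:tab title="Metadata">
        <!-- metadata content pane -->
    </p:tab>
</p:tabView>
```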
.. raw:: html
@@ -989,7 +989,7 @@ Modals are dialog prompts that act as popup overlays, but don't create a new bro
Buttons usually provide the UI prompt. A user clicks the button, which then opens a `Dialog component `__ or `Confirm Dialog component `__ from PrimeFaces that displays the modal with the necessary information and actions to take.
-The modal is styled using the `Modal component `__ from Bootstrap, for a popup window that prompts a user for information, with overlay and a backdrop, then header, content, and buttons. We can use style classes from Bootstrap for large (``.bs-example-modal-lg``) and small (``.bs-example-modal-sm``) width options.
+The modal is styled using the `Modal component `__ from Bootstrap, for a popup window that prompts a user for information, with overlay and a backdrop, then header, content, and buttons. We can use style classes from Bootstrap for large (``.bs-example-modal-lg``) and small (``.bs-example-modal-sm``) width options.
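Putting those pieces together, a confirmation modal might be wired up roughly like this; the widget name, action method, and prompt text are hypothetical.

```xml
<!-- the button that opens the dialog -->
<p:commandButton value="Delete" styleClass="btn btn-default"
                 onclick="PF('deleteConfirmation').show()" type="button"/>

<!-- the modal itself: header, content, and action buttons -->
<p:dialog header="Delete Dataset" widgetVar="deleteConfirmation" modal="true">
    <p>Are you sure you want to delete this dataset?</p>
    <p:commandButton value="Continue" styleClass="btn btn-primary"
                     action="#{datasetPage.delete}"
                     oncomplete="PF('deleteConfirmation').hide()"/>
    <p:commandButton value="Cancel" styleClass="btn btn-default"
                     onclick="PF('deleteConfirmation').hide()" type="button"/>
</p:dialog>
```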
.. raw:: html
diff --git a/doc/sphinx-guides/source/user/account.rst b/doc/sphinx-guides/source/user/account.rst
index 675bae90e5d..81c416bafd1 100755
--- a/doc/sphinx-guides/source/user/account.rst
+++ b/doc/sphinx-guides/source/user/account.rst
@@ -109,7 +109,7 @@ If you are leaving your institution and need to convert your Dataverse installat
ORCID Log In
~~~~~~~~~~~~~
-You can set up your Dataverse installation account to allow you to log in using your ORCID credentials. ORCID® is an independent non-profit effort to provide an open registry of unique researcher identifiers and open services to link research activities and organizations to these identifiers. Learn more at `orcid.org `_.
+You can set up your Dataverse installation account to allow you to log in using your ORCID credentials. ORCID® is an independent non-profit effort to provide an open registry of unique researcher identifiers and open services to link research activities and organizations to these identifiers. Learn more at `orcid.org `_.
Create a Dataverse installation account using ORCID
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
diff --git a/doc/sphinx-guides/source/user/appendix.rst b/doc/sphinx-guides/source/user/appendix.rst
index e36b40a4110..878da96475a 100755
--- a/doc/sphinx-guides/source/user/appendix.rst
+++ b/doc/sphinx-guides/source/user/appendix.rst
@@ -22,13 +22,13 @@ Supported Metadata
Detailed below are what metadata schemas we support for Citation and Domain Specific Metadata in the Dataverse Project:
-- `Citation Metadata `__ (`see .tsv version `__): compliant with `DDI Lite `_, `DDI 2.5 Codebook `__, `DataCite 3.1 `__, and Dublin Core's `DCMI Metadata Terms `__ . Language field uses `ISO 639-1 `__ controlled vocabulary.
-- `Geospatial Metadata `__ (`see .tsv version `__): compliant with DDI Lite, DDI 2.5 Codebook, DataCite, and Dublin Core. Country / Nation field uses `ISO 3166-1 `_ controlled vocabulary.
+- `Citation Metadata `__ (`see .tsv version `__): compliant with `DDI Lite `_, `DDI 2.5 Codebook `__, `DataCite 3.1 `__, and Dublin Core's `DCMI Metadata Terms `__ . Language field uses `ISO 639-1 `__ controlled vocabulary.
+- `Geospatial Metadata `__ (`see .tsv version `__): compliant with DDI Lite, DDI 2.5 Codebook, DataCite, and Dublin Core. Country / Nation field uses `ISO 3166-1 `_ controlled vocabulary.
- `Social Science & Humanities Metadata `__ (`see .tsv version `__): compliant with DDI Lite, DDI 2.5 Codebook, and Dublin Core.
- `Astronomy and Astrophysics Metadata `__ (`see .tsv version `__): These metadata elements can be mapped/exported to the International Virtual Observatory Alliance’s (IVOA)
- `VOResource Schema format `__ and is based on
- `Virtual Observatory (VO) Discovery and Provenance Metadata `__ (`see .tsv version `__).
-- `Life Sciences Metadata `__ (`see .tsv version `__): based on `ISA-Tab Specification `__, along with controlled vocabulary from subsets of the `OBI Ontology `__ and the `NCBI Taxonomy for Organisms `__.
+ `VOResource Schema format `__ and is based on
+ `Virtual Observatory (VO) Discovery and Provenance Metadata `__.
+- `Life Sciences Metadata `__ (`see .tsv version `__): based on `ISA-Tab Specification `__, along with controlled vocabulary from subsets of the `OBI Ontology `__ and the `NCBI Taxonomy for Organisms `__.
- `Journal Metadata `__ (`see .tsv version `__): based on the `Journal Archiving and Interchange Tag Set, version 1.2 `__.
Experimental Metadata
diff --git a/doc/sphinx-guides/source/user/dataset-management.rst b/doc/sphinx-guides/source/user/dataset-management.rst
index 1e8ea897032..ac541bdcee9 100755
--- a/doc/sphinx-guides/source/user/dataset-management.rst
+++ b/doc/sphinx-guides/source/user/dataset-management.rst
@@ -228,9 +228,8 @@ Additional download options available for tabular data (found in the same drop-d
- As tab-delimited data (with the variable names in the first row);
- The original file uploaded by the user;
- Saved as R data (if the original file was not in R format);
-- Variable Metadata (as a `DDI Codebook `_ XML file);
-- Data File Citation (currently in either RIS, EndNote XML, or BibTeX format).
-
+- Variable Metadata (as a `DDI Codebook `_ XML file);
+- Data File Citation (currently in either RIS, EndNote XML, or BibTeX format).
Differentially Private (DP) Metadata can also be accessed for restricted tabular files if the data depositor has created a DP Metadata Release. See :ref:`dp-release-create` for more information.
@@ -337,7 +336,7 @@ You can also search for files within datasets that have been tagged as "Workflow
Astronomy (FITS)
----------------
-Metadata found in the header section of `Flexible Image Transport System (FITS) files `_ are automatically extracted by the Dataverse Software, aggregated and displayed in the Astronomy Domain-Specific Metadata of the Dataset that the file belongs to. This FITS file metadata, is therefore searchable and browsable (facets) at the Dataset-level.
+Metadata found in the header section of `Flexible Image Transport System (FITS) files `_ are automatically extracted by the Dataverse Software, aggregated and displayed in the Astronomy Domain-Specific Metadata of the Dataset that the file belongs to. This FITS file metadata is therefore searchable and browsable (facets) at the Dataset-level.
.. _geojson:
@@ -496,7 +495,7 @@ Choosing a License
------------------
Each Dataverse installation provides a set of license(s) data can be released under, and whether users can specify custom terms instead (see below).
-One of the available licenses (often the `Creative Commons CC0 Public Domain Dedication `_) serves as the default if you do not make an explicit choice.
+One of the available licenses (often the `Creative Commons CC0 Public Domain Dedication `_) serves as the default if you do not make an explicit choice.
If you want to apply one of the other available licenses to your dataset, you can change it on the Terms tab of your Dataset page.
License Selection and Professional Norms
diff --git a/doc/sphinx-guides/source/user/tabulardataingest/ingestprocess.rst b/doc/sphinx-guides/source/user/tabulardataingest/ingestprocess.rst
index f1d5611ede9..33ae9b555e6 100644
--- a/doc/sphinx-guides/source/user/tabulardataingest/ingestprocess.rst
+++ b/doc/sphinx-guides/source/user/tabulardataingest/ingestprocess.rst
@@ -27,7 +27,7 @@ separately, in a relational database, so that it can be accessed
efficiently by the application. For the purposes of archival
preservation it can be exported, in plain text XML files, using a
standardized, open `DDI Codebook
-`_
+`_
format. (more info below)
@@ -53,6 +53,6 @@ Tabular Metadata in the Dataverse Software
The structure of the metadata defining tabular data variables used in
the Dataverse Software was originally based on the `DDI Codebook
-`_ format.
+`_ format.
You can see an example of DDI output under the :ref:`data-variable-metadata-access` section of the :doc:`/api/dataaccess` section of the API Guide.
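As a rough sketch of the shape of that variable-level DDI output (the variable and values are invented; see the API Guide for real output):

```xml
<codeBook xmlns="ddi:codebook:2_5" version="2.5">
  <dataDscr>
    <!-- one var element per tabular data variable -->
    <var ID="v1" name="age" intrvl="discrete">
      <labl>Age of respondent</labl>
      <sumStat type="mean">42.7</sumStat>
    </var>
  </dataDscr>
</codeBook>
```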