Commit
Merge remote-tracking branch 'IQSS/develop' into FakeFilePIDs
qqmyers committed Nov 4, 2024
2 parents 42034a9 + 6a00ce5 commit 933c8e5
Showing 24 changed files with 883 additions and 300 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/guides_build_sphinx.yml
@@ -11,6 +11,6 @@ jobs:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
-   - uses: OdumInstitute/sphinx-action@master
+   - uses: uncch-rdmc/sphinx-action@master
      with:
        docs-folder: "doc/sphinx-guides/"
@@ -0,0 +1 @@
Adds a new endpoint (`PUT /api/dataverses/<identifier>`) for updating an existing Dataverse collection, using a JSON file that follows the same structure as the one used by the creation API.
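
For example (an illustrative call; see the Native API guide for details):

```shell
curl -H "X-Dataverse-key:$API_TOKEN" -X PUT "$SERVER_URL/api/dataverses/$DV_ALIAS" --upload-file dataverse-complete.json
```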
3 changes: 3 additions & 0 deletions doc/release-notes/10914-users-token-api-credentials.md
@@ -0,0 +1,3 @@
Extended the `users/token` GET endpoint to support any authentication mechanism for retrieving token information.

Previously, this endpoint only accepted an API token to retrieve its information. Now, it accepts any authentication mechanism and returns the associated API token information.
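
For example, the following call now works with any supported authentication mechanism, not just an API token (an illustrative call using an API token):

```shell
curl -H "X-Dataverse-key:$API_TOKEN" "$SERVER_URL/api/users/token"
```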
@@ -0,0 +1,11 @@
## Unpublished file bug fix

A bug was fixed so that the most recent major version of a dataset can still be determined when all major versions have been deaccessioned. This fixes files incorrectly showing as "Unpublished" in search results even when they are published.
This fix affects indexing, meaning these datasets must be re-indexed once Dataverse is updated. This can be done manually by calling the index API for each affected dataset.

Example:
```shell
curl "http://localhost:8080/api/admin/index/dataset?persistentId=doi:10.7910/DVN/6X4ZZL"
```
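
To re-index several affected datasets, a simple loop can be used (the PIDs below are placeholders):

```shell
for pid in doi:10.7910/DVN/AAAAAA doi:10.7910/DVN/BBBBBB; do
  curl "http://localhost:8080/api/admin/index/dataset?persistentId=$pid"
done
```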

See also #10947 and #10974.
52 changes: 52 additions & 0 deletions doc/sphinx-guides/source/api/native-api.rst
@@ -74,6 +74,58 @@ The request JSON supports an optional ``metadataBlocks`` object, with the following supported sub-objects:

To obtain an example of how these objects are included in the JSON file, download :download:`dataverse-complete-optional-params.json <../_static/api/dataverse-complete-optional-params.json>` file and modify it to suit your needs.

.. _update-dataverse-api:

Update a Dataverse Collection
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Updates an existing Dataverse collection using a JSON file that follows the same structure as the one used for creation (see :ref:`create-dataverse-api`).

The steps for updating a Dataverse collection are:

- Prepare a JSON file containing the fields for the properties you want to update. You do not need to include all the properties, only the ones you want to update.
- Execute a curl command or equivalent.

As an example, you can download the :download:`dataverse-complete.json <../_static/api/dataverse-complete.json>` file and modify it to suit your needs (a sketch of its structure appears after the list below). The controlled vocabulary for ``dataverseType`` is the following:

- ``DEPARTMENT``
- ``JOURNALS``
- ``LABORATORY``
- ``ORGANIZATIONS_INSTITUTIONS``
- ``RESEARCHERS``
- ``RESEARCH_GROUP``
- ``RESEARCH_PROJECTS``
- ``TEACHING_COURSES``
- ``UNCATEGORIZED``
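
As an illustration, a minimal update JSON might look like this (a sketch; the values are placeholders and the downloadable file may contain additional fields):

.. code-block:: json

  {
    "name": "Scientific Research",
    "alias": "science",
    "dataverseContacts": [
      { "contactEmail": "pi@example.edu" }
    ],
    "affiliation": "Scientific Research University",
    "description": "We do all the science.",
    "dataverseType": "LABORATORY"
  }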

The curl command below assumes you are using the name "dataverse-complete.json" and that this file is in your current working directory.

Next you need to figure out the alias or database id of the Dataverse collection you want to update.

.. code-block:: bash

  export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
  export SERVER_URL=https://demo.dataverse.org
  export DV_ALIAS=dvAlias

  curl -H "X-Dataverse-key:$API_TOKEN" -X PUT "$SERVER_URL/api/dataverses/$DV_ALIAS" --upload-file dataverse-complete.json

The fully expanded example above (without environment variables) looks like this:

.. code-block:: bash

  curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X PUT "https://demo.dataverse.org/api/dataverses/dvAlias" --upload-file dataverse-complete.json

You should expect an HTTP 200 response and JSON beginning with ``"status":"OK"``, followed by a representation of the updated Dataverse collection.
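
An abridged, illustrative response might look like this (the exact fields vary by installation and collection):

.. code-block:: json

  {
    "status": "OK",
    "data": {
      "id": 32,
      "alias": "dvAlias",
      "name": "Scientific Research",
      "dataverseType": "LABORATORY"
    }
  }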

As with :ref:`create-dataverse-api`, the request JSON supports an optional ``metadataBlocks`` object, with the following supported sub-objects:

- ``metadataBlockNames``: The names of the metadata blocks you want to add to the Dataverse collection.
- ``inputLevels``: The names of the fields in each metadata block for which you want to add a custom configuration regarding their inclusion or requirement when creating and editing datasets in the Dataverse collection. Note that if the corresponding metadata block names are not specified in the ``metadataBlockNames`` field, they will be added automatically to the Dataverse collection.
- ``facetIds``: The names of the fields to use as facets for browsing datasets and collections in the Dataverse collection. Note that the order of the facets is defined by their order in the provided JSON array.

To obtain an example of how these objects are included in the JSON file, download :download:`dataverse-complete-optional-params.json <../_static/api/dataverse-complete-optional-params.json>` file and modify it to suit your needs.
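
A sketch of how these sub-objects might appear in the request JSON (the block, field, and facet names below are illustrative):

.. code-block:: json

  {
    "metadataBlocks": {
      "metadataBlockNames": ["citation", "geospatial"],
      "inputLevels": [
        {
          "datasetFieldTypeName": "geographicCoverage",
          "include": true,
          "required": true
        }
      ],
      "facetIds": ["authorName", "subject"]
    }
  }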

.. _view-dataverse:

View a Dataverse Collection
50 changes: 50 additions & 0 deletions doc/sphinx-guides/source/container/dev-usage.rst
@@ -140,6 +140,56 @@ Alternatives:
- If you used Docker Compose for running, you may use ``docker compose -f docker-compose-dev.yml logs <service name>``.
Options are the same.

Accessing Harvesting Log Files
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

\1. Open a terminal and access the Dataverse container.

Run the following command to access the Dataverse container (assuming your container is named dataverse-1):

.. code-block::

  docker exec -it dataverse-1 bash

This command opens an interactive shell within the dataverse-1 container.

\2. Navigate to the log files directory.

Once inside the container, navigate to the directory where Dataverse logs are stored:

.. code-block::

  cd /opt/payara/appserver/glassfish/domains/domain1/logs

This directory contains various log files, including those relevant to harvesting.

\3. Create a directory for copying files.

Create a directory where you'll copy the files you want to access on your local machine:

.. code-block::

  mkdir /dv/filesToCopy

This will create a new folder named ``filesToCopy`` inside ``/dv``.

\4. Copy the files to the new directory.

Copy all files from the current directory to the newly created filesToCopy directory:

.. code-block::

  cp * /dv/filesToCopy

This command copies all files in the logs directory to ``/dv/filesToCopy``.

\5. Access the files on your local machine.

On your local machine, the copied files should appear in the following directory:

.. code-block::

  docker-dev-volumes/app/data/filesToCopy
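
Alternatively, ``docker cp`` can copy the logs directly to your local machine in one step (a sketch, assuming the container is named dataverse-1):

.. code-block::

  docker cp dataverse-1:/opt/payara/appserver/glassfish/domains/domain1/logs ./harvesting-logs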

Redeploying
-----------
8 changes: 4 additions & 4 deletions doc/sphinx-guides/source/developers/version-control.rst
@@ -291,16 +291,16 @@ By default, when a pull request is made from a fork, "Allow edits from maintainers" is checked.

This is a nice feature of GitHub because it means that the core dev team for the Dataverse Project can make small (or even large) changes to a pull request from a contributor to help the pull request along on its way to QA and being merged.

-GitHub documents how to make changes to a fork at https://help.github.com/articles/committing-changes-to-a-pull-request-branch-created-from-a-fork/ but as of this writing the steps involve making a new clone of the repo. This works but you might find it more convenient to add a "remote" to your existing clone. The example below uses the fork at https://github.com/OdumInstitute/dataverse and the branch ``4709-postgresql_96`` but the technique can be applied to any fork and branch:
+GitHub documents how to make changes to a fork at https://help.github.com/articles/committing-changes-to-a-pull-request-branch-created-from-a-fork/ but as of this writing the steps involve making a new clone of the repo. This works but you might find it more convenient to add a "remote" to your existing clone. The example below uses the fork at https://github.com/uncch-rdmc/dataverse and the branch ``4709-postgresql_96`` but the technique can be applied to any fork and branch:

.. code-block:: bash

-  git remote add OdumInstitute git@github.com:OdumInstitute/dataverse.git
-  git fetch OdumInstitute
+  git remote add uncch-rdmc git@github.com:uncch-rdmc/dataverse.git
+  git fetch uncch-rdmc
   git checkout 4709-postgresql_96
   vim path/to/file.txt
   git commit
-  git push OdumInstitute 4709-postgresql_96
+  git push uncch-rdmc 4709-postgresql_96
.. _develop-into-develop:

11 changes: 10 additions & 1 deletion src/main/java/edu/harvard/iq/dataverse/Dataset.java
@@ -483,8 +483,17 @@ public Date getMostRecentMajorVersionReleaseDate() {
        if (this.isHarvested()) {
            return getVersions().get(0).getReleaseTime();
        } else {
+           Long majorVersion = null;
            for (DatasetVersion version : this.getVersions()) {
-               if (version.isReleased() && version.getMinorVersionNumber().equals((long) 0)) {
+               if (version.isReleased()) {
+                   if (version.getMinorVersionNumber().equals((long) 0)) {
+                       return version.getReleaseTime();
+                   } else if (majorVersion == null) {
+                       majorVersion = version.getVersionNumber();
+                   }
+               } else if (version.isDeaccessioned() && majorVersion != null
+                       && majorVersion.longValue() == version.getVersionNumber().longValue()
+                       && version.getMinorVersionNumber().equals((long) 0)) {
                    return version.getReleaseTime();
                }
            }
@@ -891,6 +891,10 @@ public List<DatasetFieldType> findAllDisplayedOnCreateInMetadataBlock(MetadataBlock metadataBlock) {
    }

    public List<DatasetFieldType> findAllInMetadataBlockAndDataverse(MetadataBlock metadataBlock, Dataverse dataverse, boolean onlyDisplayedOnCreate) {
+       if (!dataverse.isMetadataBlockRoot() && dataverse.getOwner() != null) {
+           return findAllInMetadataBlockAndDataverse(metadataBlock, dataverse.getOwner(), onlyDisplayedOnCreate);
+       }
+
        CriteriaBuilder criteriaBuilder = em.getCriteriaBuilder();
        CriteriaQuery<DatasetFieldType> criteriaQuery = criteriaBuilder.createQuery(DatasetFieldType.class);