Harvest: map publisher tag to distributorName #9013

Open · wants to merge 14 commits into develop
1 change: 1 addition & 0 deletions doc/release-notes/8739-publisher-during-harvesting.md
@@ -0,0 +1 @@
The publisher value of harvested datasets is now attributed to the dataset's distributor instead of its producer. This improves the citation associated with these datasets, but the change only affects newly harvested datasets. If you wish to pick up this change for datasets that were harvested previously, they should be re-harvested. For more information, see [the guides](https://dataverse-guide--9013.org.readthedocs.build/en/9013/admin/harvestclients.html#harvesting-client-changelog), #8739, and #9013.
5 changes: 5 additions & 0 deletions doc/sphinx-guides/source/admin/harvestclients.rst
@@ -48,6 +48,11 @@ Each harvesting client run logs a separate file per run to the app server's defa

Note that you'll want to run a minimum of Dataverse Software 4.6, optimally 4.18 or beyond, for the best OAI-PMH interoperability.

Harvesting Client Changelog
---------------------------

- As of Dataverse 6.6, the publisher value of harvested datasets is now attributed to the dataset's distributor instead of its producer. This change affects all newly harvested datasets. For more information, see https://github.com/IQSS/dataverse/pull/9013

Harvesting Non-OAI-PMH
~~~~~~~~~~~~~~~~~~~~~~

1 change: 1 addition & 0 deletions src/main/resources/db/migration/V6.5.0.1.sql
@@ -0,0 +1 @@
update foreignmetadatafieldmapping set datasetfieldname = 'distributorName' where foreignfieldxpath = ':publisher';
src/main/resources/db/migration/afterMigrate__1-7256-upsert-referenceData.sql
@@ -31,7 +31,7 @@ INSERT INTO foreignmetadatafieldmapping (id, foreignfieldxpath, datasetfieldname
(15, 'affiliation', 'authorAffiliation', TRUE, 3, 1 ),
(16, ':contributor', 'contributorName', FALSE, NULL, 1 ),
(17, 'type', 'contributorType', TRUE, 16, 1 ),
- (18, ':publisher', 'producerName', FALSE, NULL, 1 ),
+ (18, ':publisher', 'distributorName', FALSE, NULL, 1 ),
(19, ':language', 'language', FALSE, NULL, 1 )
ON CONFLICT DO NOTHING;

src/test/java/edu/harvard/iq/dataverse/api/HarvestingClientsIT.java
@@ -299,6 +299,11 @@ private void harvestingClientRun(boolean allowHarvestingMissingCVV) throws Inte
}
Member:
Hmm, it's been so long that https://jenkins.dataverse.org/job/IQSS-Dataverse-Develop-PR/job/PR-9013/7/display/redirect shows a failure, but that job is now a 404, so I can't see any details.

After you merge the latest from develop, let's keep an eye on the new Jenkins run.

Member:
I'm still seeing failures. I just kicked off another run. Fingers crossed: https://jenkins.dataverse.org/job/IQSS-Dataverse-Develop-PR/job/PR-9013/12/

Contributor:
Last time, all checks were green except for "continuous-integration/jenkins/pr-merge", which was still pending. I'm not clear on what it is doing.

Member:
Ok! Now that the app is being deployed following the SQL script renaming, I can see that a couple of harvesting-related tests are failing:

  • edu.harvard.iq.dataverse.api.HarvestingClientsIT.testHarvestingClientRun_AllowHarvestingMissingCVV_False(HarvestingClientsIT.java:187)
  • edu.harvard.iq.dataverse.api.HarvestingClientsIT.testHarvestingClientRun_AllowHarvestingMissingCVV_True(HarvestingClientsIT.java:191)

@plecor can you please take a look? Do you need help with how to run these tests?

Contributor:
Hi @pdurbin,

I've re-run the tests from scratch, and I wonder if it could be a chicken-and-egg situation. The new assertion fails if I run the tests from this branch because the SQL migration file it adds hasn't been run yet. If I manually update the database, the tests run fine. Could something similar be going on with Jenkins?
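
For reference, the manual update mentioned above is just the statement added by this PR's V6.5.0.1 migration, run directly against the application database (a sketch, not a separate fix):

```sql
-- Same change as V6.5.0.1.sql: point the harvested ':publisher' mapping at distributorName
update foreignmetadatafieldmapping
   set datasetfieldname = 'distributorName'
 where foreignfieldxpath = ':publisher';
```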

Contributor:
Ok, the issue is that in a new install the corresponding data is initialized by afterMigrate__1-7256-upsert-referenceData.sql, which runs after the other migrations, so the data we're trying to update doesn't exist yet when the new migration runs.

I updated that file with the same change as the SQL migration file.

Member:
Interesting. Thanks.

@poikilotherm judging from https://github.com/IQSS/dataverse/blame/c8499ba9553ac46cf3adc56d1b9e56f0c781d30f/src/main/resources/db/migration/afterMigrate__1-7256-upsert-referenceData.sql you added that "after migrate" file. What do you think? Any risk in changing it? 🤔

Member:
@qqmyers took a look and had this to say:

"that afterMigrate script appears to run after everything else, every time you restart, rather than being one of the scripts tracked by hash value in the flyway_schema_history table. So - it looks OK to me to update it. (Mostly from looking at https://documentation.red-gate.com/fd/callback-concept-184127466.html and related and verifying that I don't see it in the flyway_schema_history table)."

// verify count after collecting global ids
assertEquals(expectedNumberOfSetsHarvested, jsonPath.getInt("data.total_count"));

// ensure the publisher name is present in the harvested dataset citation
Response harvestedDataverse = given().get(ARCHIVE_URL + "/api/dataverses/1");
String harvestedDataverseName = harvestedDataverse.getBody().jsonPath().getString("data.name");
assertTrue(jsonPath.getString("data.items[0].citation").contains(harvestedDataverseName));

// Fail if it hasn't completed in maxWait seconds
assertTrue(i < maxWait);