
170 - Integration files total download size #192

Merged
merged 10 commits from feature/170-integration-files-download-size into develop
Oct 26, 2023

Conversation

MellyGray
Contributor

@MellyGray MellyGray commented Oct 4, 2023

What this PR does / why we need it:

This PR integrates the frontend with the getDatasetFilesTotalDownloadSize use case from js-dataverse to get the total download size of the dataset files. This is used in the UI to display the Zip Download Limit Message to the user when they select all rows and the Files Table is paginated.
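The logic described above can be sketched as follows. This is a hypothetical, simplified version: the `FileRepository` interface, `selectedFilesSize` function, and row shape are assumptions for illustration, standing in for the real js-dataverse `getDatasetFilesTotalDownloadSize` use case.

```typescript
// Sketch: with a paginated Files Table, "Select All" covers files on
// pages that are not loaded, so the visible rows cannot be summed to get
// the total size. Instead, a backend use case (mocked here) reports the
// total download size of ALL files in the dataset.

interface FileRow {
  id: number
  sizeBytes: number
}

// Assumed repository interface, mirroring the js-dataverse
// getDatasetFilesTotalDownloadSize use case (simplified).
interface FileRepository {
  getFilesTotalDownloadSize(persistentId: string): Promise<number>
}

async function selectedFilesSize(
  visibleSelection: FileRow[],
  allRowsSelected: boolean,
  persistentId: string,
  repo: FileRepository
): Promise<number> {
  if (allRowsSelected) {
    // Files on other pages are selected but not loaded in the table,
    // so ask the backend for the dataset-wide total.
    return repo.getFilesTotalDownloadSize(persistentId)
  }
  // Otherwise the selection is fully visible: sum the selected rows.
  return visibleSelection.reduce((sum, f) => sum + f.sizeBytes, 0)
}
```

The ZipDownloadLimitMessage component can then compare this value against the installation's zip download limit, regardless of which page the user is viewing.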

Which issue(s) this PR closes:

Special notes for your reviewer:

Changes

  • The ZipDownloadLimitMessage component now reads the value from the getDatasetFilesTotalDownloadSize use case to correctly display the total size of the selected files when the Select All button is clicked.
  • Integrate js-dataverse use case getDatasetFilesTotalDownloadSize
  • Fix an integration test that gets the deaccessioned files from js-dataverse, now that js-dataverse has been updated to include deaccessioned files.

Suggestions on how to test this:

Step 1: Run the development environment

  1. Run npm i
  2. cd packages/design-system && npm run build
  3. cd ../../
  4. Check that you have a .env file like .env.example, containing the variable VITE_DATAVERSE_BACKEND_URL=http://localhost:8000
  5. cd dev-env
  6. ./run-env.sh 9958-dataset-files-size
  7. To check the environment, go to http://localhost:8000 and verify your local Dataverse installation is running

Step 2: Test Dataset View mode with the implemented changes for the getDatasetFilesTotalDownloadSize

  1. Go to http://localhost:8000
  2. Login as admin using username: dataverseAdmin and password: admin1
  3. Create a new Dataset
  4. Upload more than 10 files so the Files Table has pagination; I suggest around 30 files.
  5. Reduce the zip download limit to the minimum so the message appears: curl -X PUT -d 1 http://localhost:8000/api/admin/settings/:ZipDownloadLimit
  6. From the dataset view mode, copy the search parameters from the URL. Ex.: ?persistentId=doi:10.5072/FK2/LHGRHP&version=DRAFT
  7. Go to http://localhost:8000/spa/datasets and paste the search parameters. Ex.: http://localhost:8000/spa/datasets?persistentId=doi:10.5072/FK2/LHGRHP&version=DRAFT
  8. You should now see the same Dataset in the SPA. Select more than 2 files to see the ZipDownloadLimit message, then click Select All and check that the total size of the selected files is correct, including files on other pages. Play with selecting and deselecting rows and check that everything works as expected.
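The condition being exercised by these manual steps can be sketched as a small predicate. This is an assumption for illustration (the function name and the exact thresholds are hypothetical, not the component's real implementation); setting :ZipDownloadLimit to 1 byte in step 5 means any multi-file selection trips it.

```typescript
// Hypothetical condition for showing the ZipDownloadLimitMessage: the
// message appears when more than one file is selected and the combined
// size of the selection exceeds the installation's :ZipDownloadLimit.
function shouldShowZipDownloadLimitMessage(
  selectedCount: number,
  totalSelectedSizeBytes: number,
  zipDownloadLimitBytes: number
): boolean {
  return selectedCount > 1 && totalSelectedSizeBytes > zipDownloadLimitBytes
}
```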

Does this PR introduce a user interface change? If mockups are available, please link/include them here:

No

Is there a release notes update needed for this change?:

No

Additional documentation:

@MellyGray MellyGray linked an issue Oct 4, 2023 that may be closed by this pull request
@coveralls

coveralls commented Oct 4, 2023

Coverage Status

coverage: 98.367% (-0.03%) from 98.394% when pulling 22ca6af on feature/170-integration-files-download-size into 403f161 on develop.

@MellyGray MellyGray marked this pull request as ready for review October 4, 2023 11:34
@MellyGray MellyGray changed the base branch from develop to feature/129-dataset-page-action-buttons October 4, 2023 11:34
@MellyGray MellyGray added the Size: 3 A percentage of a sprint. 2.1 hours. label Oct 4, 2023
@GPortas GPortas self-assigned this Oct 13, 2023
Base automatically changed from feature/129-dataset-page-action-buttons to develop October 16, 2023 14:05
Contributor

@GPortas GPortas left a comment


I am getting these errors when building the frontend image:

[screenshot: errors_image_building]

    ): Promise<number> {
      return getDatasetFilesTotalDownloadSize
        .execute(datasetPersistentId, datasetVersion.toString(), FileDownloadSizeMode.ARCHIVAL)
        .catch((error: WriteError) => {
Contributor

This should never throw a WriteError, but rather a ReadError, as it is a read operation.

I'm not sure if this happens elsewhere too, could you check?
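The fix the reviewer is asking for can be sketched as follows. The minimal `ReadError`/`WriteError` classes and the wrapper function here are stand-ins (assumptions) mirroring js-dataverse's error naming convention, not its actual implementation.

```typescript
// js-dataverse-style error classes (simplified stand-ins).
class ReadError extends Error {}
class WriteError extends Error {}

// A read operation should surface failures as ReadError, not WriteError.
// fetchSize stands in for the getDatasetFilesTotalDownloadSize call.
async function getTotalDownloadSize(
  fetchSize: () => Promise<number>
): Promise<number> {
  return fetchSize().catch((err: unknown) => {
    // Wrap the failure as a ReadError, since this is a read operation.
    throw new ReadError(`There was an error reading the total download size: ${String(err)}`)
  })
}
```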

Contributor Author

I added 2 commits

  1. I found that the error when running the dev-env occurred because package-lock.json wasn't being copied to the Docker image, so there were conflicts between dependencies because npm was installing the latest versions of the libraries. I added a commit to fix this.
  2. You're right about the error types; I fixed it everywhere and I'll pay more attention next time when adding those types, thanks!

@GPortas GPortas assigned MellyGray and unassigned GPortas Oct 17, 2023
@MellyGray MellyGray assigned GPortas and unassigned MellyGray Oct 23, 2023
Contributor

@GPortas GPortas left a comment


LGTM

[screenshot: fileselection]

@GPortas GPortas removed their assignment Oct 26, 2023
@kcondon kcondon self-assigned this Oct 26, 2023
@kcondon kcondon merged commit ac5c109 into develop Oct 26, 2023
15 of 18 checks passed
@kcondon kcondon deleted the feature/170-integration-files-download-size branch October 26, 2023 23:07
jayanthkomarraju pushed a commit to jayanthkomarraju/dataverse-frontend that referenced this pull request May 31, 2024
…ownload-size

170 - Integration files total download size
Labels
Size: 3 A percentage of a sprint. 2.1 hours.
Successfully merging this pull request may close these issues.

Integration - Get total dataset files size
4 participants