
Summary of major errors in dataset on overview page #53

Open
jpmckinney opened this issue Jul 17, 2020 · 5 comments
Comments

@jpmckinney (Member)

Logging this idea. Not yet prioritized.

@sabahfromlondon

@jpmckinney do you mean a summary of major errors in creating the report, or a new tag for a group of fields that a report creator can use just once?

@jpmckinney (Member Author) commented Oct 22, 2020

This issue is about adding a high-level summary to the overview page in the web frontend. I've changed the issue title to be clearer.

@jpmckinney jpmckinney changed the title Summary of major errors Summary of major errors in dataset on overview page Oct 22, 2020
@jpmckinney (Member Author)

From open-contracting-archive/pelican#20

a clickable list of failing checks - as an analyst, I want to see an overview of how many checks were failed, and the ability to quickly jump to the relevant check information, so that I can get a quick impression of potential problem areas in the data and the ability to delve deeper.

@jpmckinney jpmckinney transferred this issue from open-contracting-archive/pelican Sep 14, 2021
@jpmckinney jpmckinney added the overview Relating to the Overview page label Sep 14, 2021
@jpmckinney jpmckinney added this to the Priority milestone Dec 1, 2021
@jpmckinney (Member Author)

We don't have a prioritized list of checks (#56) yet. Listing all failing checks would be duplicative and very long. Listing the checks that have high error rates might still be long – and might not be relevant. Until we have a clear idea of what "major" errors to highlight, I'll close. (In other words, it's a task for the analyst to decide what's major.)

@jpmckinney (Member Author)

Related to this issue, @allakulov suggested an option to select which quality checks to export (e.g. via a checkbox in each row of the table on the compiled release page), in order to generate a table that can be copied into a report, to serve as a summary of the issues.

(This could be done in JavaScript by putting the relevant HTML on the clipboard as rich text, which should paste correctly into word processors and other applications.)
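The clipboard idea above could be sketched roughly like this. All names here (`buildChecksTable`, the shape of the check objects, the column headings) are hypothetical, not from the actual codebase; the rich-text copy relies on the standard `navigator.clipboard.write` / `ClipboardItem` API, which is browser-only:

```javascript
// Hypothetical sketch: build an HTML table from the checks the user
// selected (via checkboxes), then copy it to the clipboard as text/html
// so that pasting into a report produces a formatted table.

function buildChecksTable(checks) {
  // Keep only the rows the user ticked and render one <tr> per check.
  const rows = checks
    .filter((check) => check.selected)
    .map(
      (check) =>
        `<tr><td>${check.name}</td><td>${check.passed}</td><td>${check.failed}</td></tr>`
    )
    .join("");
  return (
    "<table>" +
    "<tr><th>Check</th><th>Passed</th><th>Failed</th></tr>" +
    rows +
    "</table>"
  );
}

// Browser-only: place the table on the clipboard as rich text.
async function copyChecksTable(checks) {
  const blob = new Blob([buildChecksTable(checks)], { type: "text/html" });
  await navigator.clipboard.write([new ClipboardItem({ "text/html": blob })]);
}
```

One design note: writing the `text/html` flavor (rather than plain text) is what makes the paste appear as a table in rich-text editors; a plain-text fallback could be added as a second MIME type in the same `ClipboardItem`.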
