From da0109409f8bd81e74e49db1aaade191771ae612 Mon Sep 17 00:00:00 2001
From: Yuliya_Prihodko
Date: Mon, 23 Oct 2023 18:13:28 +0300
Subject: [PATCH] Fix links for widgets

---
 .../PossibleDashboardsInReportPortal.mdx      | 24 +++++++++----------
 1 file changed, 12 insertions(+), 12 deletions(-)

diff --git a/docs/dashboards-and-widgets/PossibleDashboardsInReportPortal.mdx b/docs/dashboards-and-widgets/PossibleDashboardsInReportPortal.mdx
index 6384c110c..e17ae6ce4 100644
--- a/docs/dashboards-and-widgets/PossibleDashboardsInReportPortal.mdx
+++ b/docs/dashboards-and-widgets/PossibleDashboardsInReportPortal.mdx
@@ -30,27 +30,27 @@ The goal for this test results dashboard to show the status of the latest test r
 
 You can configure:
 
-[**Passing rate widget**](./PassingRateSummary) that shows a passing rate for a latest launch "API suite'
+[**Passing rate widget**](/dashboards-and-widgets/PassingRateSummary) shows the passing rate for the latest launch "API suite"
 
-[**Most popular pattern**](./MostPopularPatternTableTop20) tracks TOP-20 problems in the last and previous runs of this suite.
+[**Most popular pattern**](/dashboards-and-widgets/MostPopularPatternTableTop20) tracks the TOP-20 problems in the last and previous runs of this suite.
 
 :::note
 For Most popular pattern table, you should you create a set of rules and run Pattern Analysis
 :::
 
-With [**Investigated percentage of launches**](./InvestigatedPercentageOfLaunches) you can find out the status of failure investigations. You will be able to evaluate team performance and consistency of results.
+With [**Investigated percentage of launches**](/dashboards-and-widgets/InvestigatedPercentageOfLaunches) you can find out the status of failure investigations. You will be able to evaluate team performance and consistency of results.
 
-[**Failed cases trend chart**](./FailedCasesTrendChart) shows the history of failures in previous runs.
+[**Failed cases trend chart**](/dashboards-and-widgets/FailedCasesTrendChart) shows the history of failures in previous runs.
 
-[**Duration chart**](./LaunchesDurationChart) will be very helpful for those who track duration KPI and want to increase the speed of tests run.
+[**Duration chart**](/dashboards-and-widgets/LaunchesDurationChart) will be very helpful for those who track the duration KPI and want to speed up test runs.
 
-[**Test growth trend chart**](./TestCasesGrowthTrendChart) shows you the speed of new test cases creation.
+[**Test growth trend chart**](/dashboards-and-widgets/TestCasesGrowthTrendChart) shows you the rate of new test case creation.
 
-Also, you can create [**"Most flaky test cases"**](./FlakyTestCasesTableTop20) and [**"Most failed test case"**](./dashboards-and-widgets/MostFailedTestCasesTableTop50) and find the most unstable items which should be taken into account.
+Also, you can create [**"Most flaky test cases"**](/dashboards-and-widgets/FlakyTestCasesTableTop20) and [**"Most failed test case"**](/dashboards-and-widgets/MostFailedTestCasesTableTop50) and find the most unstable items, which should be taken into account.
 
 Let's assume that you have a lot of test results and a lot of teams.
-You can create [**Overall statistics**](./OverallStatistics) and [**Launches table**](./LaunchesTable), and now a team who is responsible for API suite has no need to go to the test results. It can use only this dashboard which gives enough information for test failure management.
+You can create [**Overall statistics**](/dashboards-and-widgets/OverallStatistics) and [**Launches table**](/dashboards-and-widgets/LaunchesTable), and now the team responsible for the API suite does not need to go to the test results: this dashboard alone gives enough information for test failure management.
 
 ## Build / Release/ Sprint Report (A dashboard for a Team leads, PM, DM)
 
@@ -65,20 +65,20 @@ Also, it is very useful to compare the results of the Regression on the current
 
 On this dashboard you can see different metrics:
 
 - A passing rate for the whole Regression
-- With [**Cumulative trend chart**](./CumulativeTrendChart) you will be able to compare different versions on one chart, to compare different runs for the current regression
+- With [**Cumulative trend chart**](/dashboards-and-widgets/CumulativeTrendChart) you will be able to compare different versions on one chart, as well as different runs of the current regression
-- [**Component Health Check Widget**](./ComponentHealthCheck) would show you product:
+- [**Component Health Check Widget**](/dashboards-and-widgets/ComponentHealthCheck) shows you the product:
   * on different env
   * on a different dimension
   * by business metrics / by features/ by user roles / by etc
 
-Also with a help of [**Component Health Check Widget**](./ComponentHealthCheck) you can create a Test Pyramid.
+Also, with the help of the [**Component Health Check Widget**](/dashboards-and-widgets/ComponentHealthCheck), you can create a Test Pyramid.
 
 :::note
 You need to report test executions with attributes which specified needed metrics or envs
 :::
 
-- [**Unique bugs table**](./UniqueBugsTable) helps you collect and analyze new bugs in the system
+- [**Unique bugs table**](/dashboards-and-widgets/UniqueBugsTable) helps you collect and analyze new bugs in the system.
 
 :::note
 Component Health Check Widget and Cumulative trend chart are very configurable and you can create your own widget based on project needs.