Using Analysis Reports
qTest Insights provides several pre-built, cross-project reports that can be viewed from three perspectives: Quality, Coverage, and Velocity.
To begin, you should set your Global Filter for the criteria you would like to view:
- Select the Global Filter icon to choose one or more values for Projects, Test Cycles, Modules, and Releases to filter your Analysis report data. Any change to the criteria affects the entire report view.
- You can set a global value to automatically refresh data at an interval of 1, 2, 5, 10, or 20 minutes.
- Select the Refresh Now button to immediately reload your reports with the latest data.
- Select the Save Report button to save each Analysis report to the Manage Reports section of qTest Insights. This allows you to manage or schedule the report later.
Read the Saved Reports article to learn how to manage and schedule reports.
Below are some helpful tips for navigating the Quality Analysis report.
From the Analysis menu, select Quality to open the Quality Analysis page.
- There are three quality charts:
- Latest Test Run Results: the results from the latest runs of each test case.
- Test Results by Day: a breakdown of the test run results by day.
- Defect Status: the distribution of defects by status.
- Click on each tab to show data broken down by project and test cycle.
Hover to See Data
Example: When you hover over the green pie slice, a small pop-up displays telling you that there are 30 defects.
Click to Drill-down for Details
Read this article to learn what you can do on a drill-down page.
The Coverage Analysis report helps you determine test case and execution coverage as it relates to requirements and application areas. Below are some helpful tips on navigating the Coverage Analysis report.
From the Analysis menu, select Coverage to open the Coverage Analysis page.
There are two coverage charts:
- Requirements Test Run Coverage: The heat map is set up so that the size of each box is based on the number of test runs, and the color of the box is based on the number of failed runs. Dark green indicates the lowest number of failures, and bright red the highest.
- Requirements Run Results: The results from the latest run of each test case covering the requirements.
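The sizing and coloring rule for the heat map can be sketched as follows. This is a minimal illustration of the described behavior, not qTest's actual rendering code; the function and field names are assumptions.

```python
# Illustrative sketch: box area scales with total test runs, and color
# interpolates from dark green (few failures) to bright red (many).
# Not qTest's implementation; names are hypothetical.

def heat_map_cell(total_runs, failed_runs, max_failed):
    """Return a size and hex color for one requirement's heat-map box."""
    size = total_runs  # box area proportional to number of test runs
    ratio = failed_runs / max_failed if max_failed else 0.0
    red = int(255 * ratio)          # more failures -> more red
    green = int(128 * (1 - ratio))  # fewer failures -> darker green
    return {"size": size, "color": f"#{red:02x}{green:02x}00"}

# A requirement covered by 6 runs, 3 of them failed, 6 being the worst case:
print(heat_map_cell(total_runs=6, failed_runs=3, max_failed=6))
# {'size': 6, 'color': '#7f4000'}
```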
Test Type: Use this to filter manual vs. automated test run data in the analysis data table below.
Analysis Data Table: This table shows the test cases for each requirement along with the testing results and number of defects. The following explains several fields in the table.
- Explorer Sessions: the count of Explorer sessions associated with a specific test case and requirement. (An Explorer session must be linked to a Test Run, which is in turn linked to a Requirement, to be counted.)
- Explorer Time: total time spent in Explorer sessions associated with a specific test case and requirement.
- Open Defects (qTest): the count of qTest defects whose status is New, Reopened, or Assigned.
- Severe Defects (qTest): the count of qTest defects whose severity is Fatal or Major.
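The two defect counts above are simple filters over the defect records. The sketch below illustrates the counting rule; the record field names ("status", "severity") are assumptions for illustration, not qTest API names.

```python
# Illustrative sketch of the Open Defects and Severe Defects counts.
# Field names are hypothetical, not taken from the qTest API.

OPEN_STATUSES = {"New", "Reopened", "Assigned"}
SEVERE_SEVERITIES = {"Fatal", "Major"}

def count_defects(defects):
    """Return (open_count, severe_count) for a list of defect records."""
    open_count = sum(1 for d in defects if d["status"] in OPEN_STATUSES)
    severe_count = sum(1 for d in defects if d["severity"] in SEVERE_SEVERITIES)
    return open_count, severe_count

defects = [
    {"status": "New", "severity": "Major"},
    {"status": "Closed", "severity": "Fatal"},
    {"status": "Assigned", "severity": "Minor"},
]
print(count_defects(defects))  # (2, 2)
```

Note that a defect can appear in both counts (e.g. a New, Major defect), since the two fields are filtered independently.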
Hover to View Data
Example: When you hover over a cell of the Requirements Test Run Coverage heat map, a small pop-up displays telling you that the requirement US_Login has been covered by 6 test runs and 3 of them failed.
Click on Heat Map to Drill-down to Details
Read this article to learn what you can do on a drill-down page.
The Velocity Analysis reports show how fast the team is going and how much work might be left to do. Below are some helpful tips on navigating the Velocity Analysis report.
From the Analysis menu, select Velocity to open the Velocity Analysis page.
There are three velocity charts:
- Tests Planned vs. Executed: cumulative results by day for the number of tests planned and executed, along with execution results. The Date Range Selector displays the start and end dates for the selected Release(s). If multiple Releases are selected in the Global Filter, the date range displayed spans the earliest start date to the latest end date of the selected Releases.
- New Requirements and Test Cases: the number of requirements and test cases (manual and automated) added by day.
- Defects Opened and Closed: the number of defects opened and closed by day.
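The date-range rule for multiple selected Releases reduces to a min/max over the Release dates. A minimal sketch, assuming Releases are given as (start, end) date pairs (the data shape is illustrative, not a qTest structure):

```python
# Illustrative sketch of the displayed date range when multiple Releases
# are selected: earliest start date to latest end date.
from datetime import date

def chart_date_range(releases):
    """releases: list of (start_date, end_date) pairs for the selected Releases."""
    starts = [start for start, _ in releases]
    ends = [end for _, end in releases]
    return min(starts), max(ends)

releases = [
    (date(2016, 8, 1), date(2016, 8, 31)),
    (date(2016, 8, 15), date(2016, 9, 30)),
]
print(chart_date_range(releases))  # 2016-08-01 through 2016-09-30
```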
Analysis Data Table: shows activities and results broken down by tester.
- Estimated Time Remaining: calculated based on the measured time spent by the tester per test step and the number of test steps remaining.
- Forecasted Defects Remaining: calculated based on the number of defects found by the tester per test run and the number of test runs remaining.
- Explorer Sessions: the count of Explorer sessions conducted by the tester.
- Explorer Time: total time of Explorer sessions conducted by the tester.
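The two forecast fields above amount to a per-tester rate multiplied by the remaining work. A minimal sketch of those formulas as inferred from the field descriptions (not taken from qTest documentation):

```python
# Illustrative formulas for the two forecast metrics; inferred from the
# field descriptions above, not from the actual qTest implementation.

def estimated_time_remaining(time_spent, steps_executed, steps_remaining):
    """Average time per executed test step times the steps left."""
    if steps_executed == 0:
        return 0.0
    return (time_spent / steps_executed) * steps_remaining

def forecasted_defects_remaining(defects_found, runs_executed, runs_remaining):
    """Average defects per executed test run times the runs left."""
    if runs_executed == 0:
        return 0.0
    return (defects_found / runs_executed) * runs_remaining

# 120 minutes over 40 steps = 3 min/step; 20 steps left -> 60 minutes
print(estimated_time_remaining(120, 40, 20))    # 60.0
# 6 defects over 30 runs = 0.2 defects/run; 10 runs left -> 2 defects
print(forecasted_defects_remaining(6, 30, 10))  # 2.0
```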
Hover Over Charts to See Data
Example: When you hover over a point on the Tests Planned vs. Executed chart, a small pop-up displays telling you that there are 27 passed test runs on 8/25/2016.
Click Data Points of Charts to Drill-Down
Read this article to learn what you can do on a drill-down page.