Performance Report
This report documents the results of the performance benchmark for Go2Group synapseRT, including the test environment, test scenarios, test data, and test conclusions.
Purpose
Run the test to identify performance bottlenecks in the product and to record results that serve as a benchmark for comparison with the next version.
Audience
- synapseRT product team
- synapseRT users
Test Environment
Hardware and System
Item | Specification |
---|---|
OS | Windows 10 Pro 64-bit |
Processor | Intel(R) Xeon(R) CPU E3-1231 v3 @ 3.40GHz |
CPU Cores | 4 |
Memory | 32.0 GB |
Software
Item | Version |
---|---|
JIRA Server | JIRA Core 7.2.3 |
synapseRT NextGen | V8.5.1 |
Database | Oracle 12c |
Browser | Firefox 47.0.2 |
Test Data
Testing_Data_JIRA7.2.3_v8.4.3.1_0104(3000TCs&1000REQs&100TPs_REQHierarchyDone_TC&REQLinkageDone).zip
Test Scenarios
As JIRA is a web-based application, our performance testing focuses on page loading time with a large amount of data.
Open the pages listed below with a large amount of data and record the page loading time:
- Navigate to the Requirements page
- Navigate to the Test Suites page
- Navigate to the Test Plans page
- Navigate to the Traceability page
- Navigate to the synapseRT Reports page
- Add a test cycle to a test plan with a large amount of test cases
- Open a test plan issue with a large amount of data
- Open a test cycle page with a large amount of test cases
- Start a test cycle with a large amount of test cases
- Initiate a "Bulk Operation" with a large amount of test cases
- Expand a test plan from the Test Plans page (in the Unresolved Plans tab)
- Link a large amount of test cases to a test suite
- Close a test run dialog box to refresh the Test Cycle page
- Gadget: choose a test plan to load its test cycle in the Edit Gadget page
- Continue to import test cases to a JIRA project
- Expand requirement hierarchy from the Requirements page with requirement hierarchy setup and test case associations
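The report does not specify how the loading times for these scenarios were captured. As an illustration only, a minimal timing harness along these lines could average repeated page-load measurements; the `load_page` callable is a hypothetical stand-in for whatever performs one full page load (for example, a scripted browser navigation):

```python
import time
from statistics import mean

def time_page_load(load_page, runs=3):
    """Run load_page() `runs` times and return the average
    wall-clock duration in seconds. load_page is any callable
    that performs one complete page load."""
    durations = []
    for _ in range(runs):
        start = time.perf_counter()
        load_page()  # e.g. a scripted browser fetching the Requirements page
        durations.append(time.perf_counter() - start)
    return mean(durations)

# Stand-in for a real page load, just to show the call shape:
avg = time_page_load(lambda: time.sleep(0.01), runs=3)
print(f"average load time: {avg:.3f}s")
```

Averaging over several runs, as the report's "Average page loading time" column implies, smooths out one-off variance such as caching effects on the first load.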
Test Results
We ran this benchmark test several times, and the results of each run were nearly identical. One representative set of results is listed below:
Test scenario | Test data | Average page loading time |
---|---|---|
Navigate to the Requirements page | | 4.64s |
Navigate to the Test Suites page | | 2.27s |
Navigate to the Test Plans page | | 13.22s |
Navigate to the Traceability page | Click the Traceability menu in the project sidebar | 1.43s |
Navigate to the synapseRT Reports page | | 1.22s |
Add a test cycle to a test plan with a large amount of test cases | PTA-8100 | 18s |
Open a test plan issue with a large amount of data | PTA-8100 | 5.78s |
Open a test cycle page with a large amount of test cases | PTA-8100 / test cycle one (3000 TCs) | 19.18s |
Start a test cycle with a large amount of test cases | PTA-8100 / test cycle one (3000 TCs) | 50s |
Initiate a "Bulk Operation" with a large amount of test cases | PTA-8100 / test cycle one (3000 TCs) | 8s |
Expand a test plan from the Test Plans page (in the Unresolved Plans tab) | PTA-8100 | 5.76s |
Link a large amount of test cases to a test suite | [Performance testing] test suite eight (link) | 12s |
Close a test run dialog box to refresh the Test Cycle page | PTA-8100 / test cycle one (3000 TCs) | <1s |
Gadget: choose a test plan to load its test cycle in the Edit Gadget page | PTA-8100 | <1s |
Continue to import test cases to a JIRA project | | 1) 2m 50s; 2) 3m 10s; 3) N/A |
Expand the requirement hierarchy from the Requirements page with requirement hierarchy setup and test case associations | | 1) Open the Requirements page: 4.44s; 2) Expand level-two requirements: 434ms |
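The recorded times mix several formats (18s, 2m 50s, 434ms, <1s). When comparing this benchmark against a future version's run, it may help to normalize everything to seconds. The helper below is a sketch; the `to_seconds` name and the choice to treat "<1s" as its 1-second upper bound are our own assumptions, not part of the report:

```python
import re

def to_seconds(value: str) -> float:
    """Normalize durations like '2m 50s', '434ms', '18s', or '<1s'
    to seconds. A '<' prefix is dropped, so '<1s' is read as its
    1-second upper bound."""
    value = value.lstrip("<").strip()
    total = 0.0
    # 'ms' must precede 'm' and 's' in the alternation so that
    # '434ms' is not parsed as 434 minutes.
    for amount, unit in re.findall(r"([\d.]+)\s*(ms|m|s)", value):
        total += float(amount) * {"m": 60.0, "s": 1.0, "ms": 0.001}[unit]
    return total

print(to_seconds("2m 50s"))  # 170.0
```

With every figure in seconds, a regression check between releases reduces to a simple numeric comparison per scenario.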
Test Conclusion
- The data above come from a single test run. Repeated runs produced nearly identical results, which validates them.
- Next step: re-run the test with the same environment and scripts, and use this data as a baseline for comparison with the next version.
- The data show that importing a large amount of test cases is time-consuming, and product performance in this area should be improved in a future release. However, given that the data load we used exceeds the average data load, the wait is still acceptable to the user.