IS480 Team wiki: 2012T2 box.us Test Plans
Revision as of 15:49, 21 April 2013
User Acceptance Test
User Acceptance: 5 April 2013
User Acceptance Final Verification: 12 April 2013
User Acceptance Test Plan
- Test Cases (5 April 2013):
- Test Cases (12 April 2013):
- Scenario-based Instructions and Exceptional cases
- UAT Issue List
Managing Defects
- Note: BA refers to the facilitators in the UAT test
- All defects are recorded in a separate issue list.
- The list is collated and sent to Empact, who prioritises the issues.
- Bugs are fixed immediately.
Recording Issues
Issues are recorded in a separate issue list on Google Docs
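The issue-recording and collation flow above can be sketched in code. This is a minimal illustration only, assuming hypothetical field names (`issue_id`, `reported_by`, `priority`); the actual Google Docs issue list may use different columns.

```python
from dataclasses import dataclass

# A minimal sketch of one row in the UAT issue list.
# Field names are assumptions for illustration; the actual
# Google Docs sheet may use different columns.
@dataclass
class UatIssue:
    issue_id: int
    description: str
    reported_by: str          # tester or BA (facilitator) who logged it
    priority: str = "Unset"   # set later by Empact during prioritisation

def collate(issues):
    """Order the issue list so it can be sent to Empact for prioritisation."""
    return sorted(issues, key=lambda i: i.issue_id)

issues = [
    UatIssue(2, "Submit button unresponsive on volunteer sign-up", "BA-1"),
    UatIssue(1, "Date picker shows wrong month", "BA-2"),
]
for issue in collate(issues):
    print(issue.issue_id, issue.description)
```

Keeping each defect as a structured record rather than free text makes the collation step a simple sort-and-send.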
Resources
Pre-UAT Testing
Pre-UAT Testing 1
Pre-UAT Testing 2
Pre-UAT Testing 3
Deployment Testing
Our system has been deployed to the live server since 11 March 2013!
Date | Description | Features Released |
---|---|---|
11/03/2013 | Release 0.1 | |
18/03/2013 | Release 0.2 | |
25/03/2013 | Release 0.3 | |
04/04/2013 | Release 0.4 | |
11/04/2013 | Release 0.5 | |
Deployment testing 1
Deployment testing 2
Deployment testing 3
Deployment testing 4
Deployment testing 5
Integration Testing
Testing is done at the end of every iteration using test cases. Any bugs found during testing are logged in the Bug Log.
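The iteration-end flow described above, running test cases and logging failures to a bug log, can be sketched as follows. The names (`run_test_case`, `bug_log`) and the sample cases are illustrative assumptions, not the team's actual tooling.

```python
# A minimal sketch: run each test case at the end of an iteration,
# and record any failure in the bug log. Case IDs and descriptions
# below are invented for illustration.

bug_log = []

def run_test_case(case_id, description, check):
    """Run one test case; append failures to the bug log."""
    try:
        assert check(), description
        return True
    except AssertionError:
        bug_log.append({"case": case_id, "description": description})
        return False

# Example test cases for one iteration
run_test_case("IT5-01", "Volunteer view loads event list", lambda: True)
run_test_case("IT5-02", "Empact view shows NPO details", lambda: False)

print(bug_log)  # failed cases end up here
```

Separating the check (pass/fail) from the logging keeps the bug log as a single append-only record per iteration.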
Iteration 4
Iteration 5
- Integration Testing for Iteration 5 - Volunteer View
- Integration Testing for Iteration 5 - Empact View
Iteration 6
- Integration Testing for Iteration 6 - Volunteer View
- Integration Testing for Iteration 6 - Empact View
Iteration 7
- Integration Testing for Iteration 7 - Volunteer View
- Integration Testing for Iteration 7 - Empact View
Iteration 8
- Integration Testing for Iteration 8 - Volunteer View
- Integration Testing for Iteration 8 - Empact View
Pre User Testing 1 Integration Testing (Iteration 9)
- Integration Testing before User Testing 1 - Volunteer View
- Integration Testing before User Testing 1 - Empact View
Iteration 9
- Integration Testing for Iteration 9 - Volunteer View
- Integration Testing for Iteration 9 - Empact View
Iteration 10
- Integration Testing for Iteration 10 - Volunteer View
- Integration Testing for Iteration 10 - Empact View
- Integration Testing for Iteration 10 - NPO View
Iterations 11-14: Testing is done as part of deployment testing
Bug Log
Heuristic Evaluation
We will be inviting our SIS experts to help us critique the usability issues in our web application!
Heuristic evaluations will follow the How to Conduct a Heuristic Evaluation and Ten Usability Heuristics readings by Jakob Nielsen.
Using Nielsen's ten heuristics, evaluators will list as many usability issues as possible. It is less important that the evaluators worry about exactly which heuristic the usability issues occur in.
The heuristics are there to help evaluators find a variety of different types of usability issues. For each usability issue found, the evaluator should capture a description of the issue written as concisely and as specifically as possible.
We are also using Nielsen's Severity Ratings for Usability Problems in our Heuristic Evaluation.
Heuristics:
1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
(Plus a category for issues not related to any heuristic)

Severity Ratings:
0 = I don't agree that this is a usability problem
1 = Cosmetic problem only
2 = Minor usability problem
3 = Major usability problem
4 = Usability catastrophe
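Combining the two scales above, each finding pairs a heuristic with a severity, and fixes are ordered by severity. A minimal sketch, with invented sample findings (not actual evaluation results):

```python
# Nielsen's 0-4 severity scale, as listed above.
SEVERITY = {
    0: "Not a usability problem",
    1: "Cosmetic problem only",
    2: "Minor usability problem",
    3: "Major usability problem",
    4: "Usability catastrophe",
}

# Sample findings for illustration only: each records which heuristic
# was violated, a concise description, and an assigned severity.
findings = [
    {"heuristic": 1, "issue": "No loading indicator on search", "severity": 3},
    {"heuristic": 8, "issue": "Inconsistent button colours", "severity": 1},
    {"heuristic": 5, "issue": "Form allows past event dates", "severity": 4},
]

# Fix the worst problems first: sort by severity, highest first.
for f in sorted(findings, key=lambda f: f["severity"], reverse=True):
    print(f"[{f['severity']} - {SEVERITY[f['severity']]}] {f['issue']}")
```

As the text notes, the exact heuristic matters less than capturing each issue concisely; the severity score is what drives prioritisation.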
Heuristic Evaluation 1
We conducted our 1st heuristic evaluation, where our SIS experts (those who have taken or are currently taking Interaction Design and Prototyping (IS306)) critiqued and evaluated the usability issues of our web application.

Users' Feedback
For Further Actions
Documentation for Heuristic Evaluation
Documentation | Results | Followup Actions |
---|---|---|
User Testing
We conduct user testing to see how our stakeholders respond to the application we have built.
User Test 2
We conducted our 2nd user testing to try out our newly added features and to see how users would respond to them.

Interesting Findings
Actions Taken
Documentation for User Test 2
For User Test 2, we made use of templates from Usability.gov, which provides good information on user testing.
Documentation | Results | Followup Actions |
---|---|---|
User Test 1
We conducted our 1st user testing to try out the new features and, at the same time, to test how users would respond to the system. Until then, we had no information about how the different users would react to the system.

Interesting Findings
Actions Taken
Documentation for User Test 1
Documentation | Results | Followup Actions |
---|---|---|