IS480 Team wiki: 2012T2 box.us Test Plans
Expected User Acceptance: 5 April 2013
User Acceptance Test Plan
- Test Cases
- Briefing Script
- UAT Issue List
Our system has been deployed to the live server since 11 March 2013!
- Deployment Testing 1
- Deployment Testing 2
- Deployment Testing 3
- Deployment Testing 4
- Deployment Testing 5
To verify that the project has met its goals, Box.us will also conduct web testing to collect data about the usage of the deployed application, following a plan of what data will be collected.
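As a rough illustration of how such usage data might be gathered (a minimal sketch only; the logged fields, file format, and page names below are assumptions, not the actual Box.us collection plan), a small server-side helper could record each page visit and tally views per page:

```python
import csv
import datetime

def log_page_view(log_path, user_role, page):
    # Append one row per visit: timestamp, who visited, and which page.
    with open(log_path, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.datetime.now().isoformat(), user_role, page]
        )

def page_view_counts(log_path):
    # Tally total views per page from the accumulated log.
    counts = {}
    with open(log_path) as f:
        for _timestamp, _role, page in csv.reader(f):
            counts[page] = counts.get(page, 0) + 1
    return counts

# Example: a volunteer visits the (hypothetical) matching page.
log_page_view("usage_log.csv", "volunteer", "/matching")
print(page_view_counts("usage_log.csv"))  # {'/matching': 1}
```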
Testing will be done at the end of every iteration using test cases. Any bugs found during testing will be logged in the Bug Log.
- Integration Testing for Iteration 5 - Volunteer View
- Integration Testing for Iteration 5 - Empact View
- Integration Testing for Iteration 6 - Volunteer View
- Integration Testing for Iteration 6 - Empact View
- Integration Testing for Iteration 7 - Volunteer View
- Integration Testing for Iteration 7 - Empact View
- Integration Testing for Iteration 8 - Volunteer View
- Integration Testing for Iteration 8 - Empact View
Pre-User Testing 1 Integration Testing (Iteration 9)
- Integration Testing before User Testing 1 - Volunteer View
- Integration Testing before User Testing 1 - Empact View
- Integration Testing for Iteration 9 - Volunteer View
- Integration Testing for Iteration 9 - Empact View
- Integration Testing for Iteration 10 - Volunteer View
- Integration Testing for Iteration 10 - Empact View
- Integration Testing for Iteration 10 - NPO View
We will be inviting our SIS experts to help us critique the usability of our web application!
Using Nielsen's ten heuristics, evaluators will list as many usability issues as possible. It is less important for evaluators to worry about exactly which heuristic an issue falls under; the heuristics are there to help them find a variety of different types of usability issues. For each usability issue found, the evaluator should write a description that is as concise and as specific as possible (see the worked example after the severity scale below).
1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
Not related to any heuristic (for issues that do not fall under any of the above)
We are also using Nielsen's Severity Ratings for Usability Problems in our Heuristic Evaluation:
0 = I don't agree that this is a usability problem
1 = Cosmetic problem only
2 = Minor usability problem
3 = Major usability problem
4 = Usability catastrophe
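For example, an evaluator might record an issue (this one is hypothetical, for illustration) as: "No feedback is shown while matching results are loading", tagged under heuristic 1 (Visibility of system status) with severity 2 (minor usability problem).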
Heuristic Evaluation 1
We conducted our 1st Heuristic Evaluation, in which we got our SIS experts, that is, those who have taken or are currently taking Interaction Design and Prototyping (IS306), to critique and evaluate the usability of our web application.
For Further Actions
Documentation for Heuristic Evaluation
We conduct user testing to see how our stakeholders respond to the application we have built.
User Test 2
We conducted our 2nd User Test with the future users of the Volunteer Matching system.
For Further Actions
Documentation for User Test 2
For User Test 2, we made use of templates taken from Usability.gov.
The web provides plenty of good information on user testing!
User Test 1
We conducted our 1st User Test to try out the new features and, at the same time, to see how users would respond to the system. Until then, we had no information about how the different types of users would react to the system.
Documentation for User Test 1