IS480 Team wiki: 2012T2 box.us Test Plans

Expected User Acceptance: 5 April 2013

User Acceptance Test Plan

  • Test Cases:
  1. Test Cases for NPO users
  2. Test Cases for Volunteer users
  3. Test Cases for Empact users
  • Briefing Script
  • UAT Issue List
  1. UAT Issue List Google Spreadsheet (http://www.bit.ly/UATIssueList)
  2. UAT Pilot Test Issue List


Resources

Our system has been deployed to the live server since 11 March 2013!

Date | Description | Features Released

11/03/2013

Release 0.1

  • Task Management
  • User Management
  • Questions
  • Feedback Form
  • Dashboard (NPO)

18/03/2013

Release 0.2

  • Task Management
  • User Management (partial)
  • Questions
  • Dashboard
  • Statistics
  • Improvements from Release 0.1

25/03/2013

Release 0.3

  • Task Management
  • User Management (partial)
  • Questions
  • Notifications
  • Improvements from Release 0.2

04/04/2013

Release 0.4

  • All features released
  • Improvements from Release 0.3

04/04/2013

Release 0.5

  • Final version for Final Verification
  • Improvements from Release 0.4

  • Deployment testing 1
  • Deployment testing 2
  • Deployment testing 3
  • Deployment testing 4
  • Deployment testing 5

Testing will be done at the end of every iteration using test cases. Any bugs found during testing will be logged in the Bug Log (a sketch of what a logged entry could look like follows the Bug Log link below).

  • Iteration 4
  • Iteration 5
  • Iteration 6
  • Iteration 7
  • Iteration 8
  • Pre User Testing 1 Integration Testing (Iteration 9)
  • Iteration 9
  • Iteration 10
  • Iterations 11-14: testing is done as deployment testing

Bug Log
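For illustration, here is a minimal sketch (in Python) of what a single logged bug could look like before it is transferred to the Bug Log spreadsheet; the field names and the sample entry are assumptions made for this example, not the actual columns or contents of our Bug Log.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class BugLogEntry:
    iteration: int        # iteration in which the bug was found
    test_case_id: str     # test case that exposed the bug
    description: str      # concise description of the observed behaviour
    severity: str         # e.g. "Low", "Medium", "High"
    status: str = "Open"  # stays "Open" until the fix is re-tested
    logged_on: date = field(default_factory=date.today)

# Invented example: a bug found while running a test case in Iteration 9
bug = BugLogEntry(
    iteration=9,
    test_case_id="NPO-TC-03",
    description="Task matching results do not refresh after a new task is created",
    severity="Medium",
)
print(bug)

Keeping the severity and status fields separate makes it easy to filter for open bugs before each release.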

We will be inviting our SIS experts to help us identify and critique usability issues in our web application!


Heuristic evaluations will follow the How to Conduct a Heuristic Evaluation and Ten Usability Heuristics readings by Jakob Nielsen.

Using Nielsen's ten heuristics, evaluators will list as many usability issues as possible; it is less important for them to worry about exactly which heuristic each issue falls under.

The heuristics are there to help evaluators find a variety of different types of usability issues. For each usability issue found, the evaluator should capture a description of the issue written as concisely and as specifically as possible.


We are also using Nielsen's Severity Ratings for Usability Problems in our Heuristic Evaluation.


Heuristics

  1. Visibility of system status
  2. Match between system and the real world
  3. User control and freedom
  4. Consistency and standards
  5. Error prevention
  6. Recognition rather than recall
  7. Flexibility and efficiency of use
  8. Aesthetic and minimalist design
  9. Help users recognize, diagnose, and recover from errors
  10. Help and documentation

(An issue may also be marked as not related to any heuristic.)

Severity Ratings

  0 = I don't agree that this is a usability problem
  1 = Cosmetic problem only
  2 = Minor usability problem
  3 = Major usability problem
  4 = Usability catastrophe
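As a rough illustration of how the recorded issues could be tallied once an evaluation round is over, the Python sketch below assumes each issue is captured as a (heuristic number, severity rating, short description) record; the sample issues are invented for illustration and are not actual findings from our evaluation.

from collections import Counter

SEVERITY_LABELS = {
    0: "I don't agree that this is a usability problem",
    1: "Cosmetic problem only",
    2: "Minor usability problem",
    3: "Major usability problem",
    4: "Usability catastrophe",
}

# (heuristic number, severity rating, short description) -- invented examples
issues = [
    (1, 2, "No loading indicator while task matching results are computed"),
    (5, 3, "Feedback form can be submitted with required fields left empty"),
    (8, 1, "Button colours are inconsistent across the NPO dashboard"),
]

# Count how many issues were logged at each severity level
by_severity = Counter(severity for _, severity, _ in issues)
for severity in sorted(by_severity, reverse=True):
    print(f"{by_severity[severity]} issue(s): {SEVERITY_LABELS[severity]}")

# Flag issues rated 3 (major) or 4 (catastrophe) for immediate follow-up
for heuristic, severity, description in issues:
    if severity >= 3:
        print(f"Heuristic {heuristic}, severity {severity}: {description}")

Sorting by severity first means the usability catastrophes and major problems surface at the top of the follow-up list.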


Heuristic Evaluation 1

We conducted our 1st Heuristic Evaluation, where we got our SIS experts (those who have taken or are currently taking Interaction Design and Prototyping, IS306) to critique and evaluate the usability issues of our web application.

Users' feedback

  1. "Log in was easy"
  2. "More confirmation message needed"
  3. "Question UI is good"
  4. "It can be even more intuitive"

For Further Actions

  1. Based on the UT2 Follow Up Actions



Documentation for Heuristic Evaluation

Documentation | Results | Follow-up Actions


  • As our UT2 was held a few days after our midterm presentation, we decided to tackle all the usability issues in our UT2 Follow Up Actions.

We conduct user testing to see how our stakeholders respond to the application that we have built.

User Test 2

UT 2 Photo

We conducted our 2nd User Testing with the future users of the Volunteer Matching system.

Users' feedback

  1. "Generally easy to use"
  2. "Can be more user-centric instead of Empact-centric"
  3. "Clean interface"

For Further Actions

  1. Refer to the UT2 Follow Up Actions.





Documentation for User Test 2

For User Test 2, we made use of templates taken from Usability.gov. The site provides us with good information regarding user testing!

Documentation | Results | Follow-up Actions


User Test 1

We conducted our 1st user testing to try out the new features and, at the same time, to see how users would respond to the system. Until then, we had no information about how the different user groups would react to the system.

Interesting Findings

  1. Questions module was not intuitive enough
  2. Task matching results could be displayed in a more interactive way

Actions Taken

  1. Touched up Question module
  2. Finalized a user interface template
  3. Revamped the entire structure of how the Task module was being managed.



Documentation for User Test 1

Documentation | Results | Follow-up Actions