IS480 Team wiki: 2012T2 box.us Test Plans

Expected User Acceptance: 26 March 2013

User Acceptance Test Plan

Resources


Our field test will begin on 11 March 2013 and will be conducted on our live server.

Details of the field test will be added soon, stay tuned!

Testing will be done at the end of every iteration using test cases. Any bugs found during testing will be logged in the Bug Log.
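As a sketch of what a logged bug might look like (the fields below are an assumed layout for illustration, not our actual template):

  Bug ID: 12
  Found in: Iteration 8 test cases
  Module: Task
  Description: Deleting a task does not remove it from the matching results
  Severity: Major
  Status: Open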

Iteration 4


Iteration 5


Iteration 6


Iteration 7


Iteration 8


Pre-User Testing 1 Integration Testing (Iteration 9)


Iteration 9


Iteration 10

  • Coming Soon!


Bug Log

We will be inviting our SIS experts to help us critique the usability issues in our web application!


Heuristic evaluations will follow the How to Conduct a Heuristic Evaluation and Ten Usability Heuristics readings by Jakob Nielsen.

Using Nielsen's ten heuristics, evaluators will list as many usability issues as possible. It is less important for evaluators to worry about exactly which heuristic each issue falls under.

The heuristics are there to help evaluators find a variety of different types of usability issues. For each usability issue found, the evaluator should capture a description of the issue written as concisely and as specifically as possible.


We are also using Nielsen's Severity Ratings for Usability Problems in our Heuristic Evaluation.


Heuristics

1. Visibility of system status

2. Match between system and the real world

3. User control and freedom

4. Consistency and standards

5. Error prevention

6. Recognition rather than recall

7. Flexibility and efficiency of use

8. Aesthetic and minimalist design

9. Help users recognize, diagnose, and recover from errors

10. Help and documentation

Not related to any heuristic

Severity Ratings

0 = I don't agree that this is a usability problem

1 = Cosmetic problem only

2 = Minor usability problem

3 = Major usability problem

4 = Usability catastrophe
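As a hypothetical example of how the two scales work together: an issue such as "no confirmation message is shown after deleting a task" could be filed under heuristic 1 (Visibility of system status) and given a severity of 2 (minor usability problem), depending on the evaluator's judgment.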


Heuristic Evaluation 1

We conducted our first Heuristic Evaluation, in which our SIS experts, students who have taken or are currently taking Interaction Design and Prototyping (IS306), critiqued and evaluated the usability issues of our web application.

Users' Feedback

  1. "Log in was easy"
  2. "More confirmation message needed"
  3. "Question UI is good"
  4. "It can be even more intuitive"

For Further Actions

  1. No actions taken yet!
  2. Actions will be taken after our mid-terms presentation



Documentation for Heuristic Evaluation

Documentation | Results | Follow-up Actions


  • To resolve heuristic problems after mid-terms!

We conduct user testing to see how our stakeholders respond to the application we have built.

User Test 2

Date: 26 February 2013

Location: The Hub


Documentation for User Test 2

For User Test 2, we made use of templates taken from Usability.gov.

The site provides us with good information regarding user testing!

Documentation | Results | Follow-up Actions


User Test 1

We conducted our first user test to try out the new features and, at the same time, to see how users would respond to the system. Until then, we had no information about how different users would react to the system.

Interesting Findings

  1. Questions module was not intuitive enough
  2. Task matching results could be displayed in a more interactive way

Actions Taken

  1. Touched up the Questions module
  2. Finalized a user interface template
  3. Revamped the entire structure of how the Task module is managed



Documentation for User Test 1

Documentation | Results | Follow-up Actions