
IS480 Team wiki: 2012T2 box.us Test Plans


User Acceptance: 5 April 2013
User Acceptance Final Verification: 12 April 2013

User Acceptance Test Plan

  • Test Cases (5 April 2013):
  1. Test Cases - NPO
  2. Test Cases - Volunteer
  3. Test Cases - Empact
  • Test Cases (12 April 2013):
  1. Test Cases for UAT final verification
  • Scenario-based Instructions and Exceptional cases
  1. Registration
  2. Task
  3. Question
  4. Exceptional Cases
  • UAT Issue List
  1. UAT Issue List Google Spreadsheet (http://www.bit.ly/UATIssueList), Version 2


Managing Defects

UAT Defect Lifecycle
  • Note: BA refers to the facilitators in the UAT test.
  • All defects are recorded in a separate issue list.
  • The list is collated and sent to Empact for prioritisation.
  • Bugs are fixed immediately.

Recording Issues

Issues are recorded in a separate issue list on Google Docs
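To make the defect lifecycle above concrete, here is a minimal sketch in Python of how an issue entry might be tracked from recording to collation. The field names, statuses, and placeholder entries are illustrative assumptions only; they do not reflect the actual columns of our UAT Issue List spreadsheet.

from dataclasses import dataclass

@dataclass
class Defect:
    defect_id: str            # hypothetical ID format, e.g. "UAT-001"
    module: str               # e.g. "Registration", "Task", "Question"
    description: str          # concise, specific description captured by the facilitator (BA)
    status: str = "Recorded"  # Recorded -> Collated -> Prioritised -> Fixed

def collate(defects):
    """Mark recorded defects as collated before the list is sent to Empact."""
    for d in defects:
        if d.status == "Recorded":
            d.status = "Collated"
    return defects

# Illustrative usage with placeholder entries (not real UAT findings):
issues = [
    Defect("UAT-001", "Registration", "placeholder description"),
    Defect("UAT-002", "Task", "placeholder description"),
]
collate(issues)
print([(d.defect_id, d.status) for d in issues])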


Resources

Pre-UAT Testing

Pre-UAT Testing 1

Pre-UAT Testing 2

Pre-UAT Testing 3

Deployment Plan
Our system has been deployed to the live server since 11 March 2013!

Releases (date, description, and features released):

11/03/2013 – Release 0.1

  • Task Management
  • User Management
  • Questions
  • Feedback Form
  • Dashboard (NPO)

18/03/2013 – Release 0.2

  • Task Management
  • User Management (partial)
  • Questions
  • Dashboard
  • Statistics
  • Improvements from Release 0.1

25/03/2013 – Release 0.3

  • Task Management
  • User Management (partial)
  • Questions
  • Notifications
  • Improvements from Release 0.2

02/04/2013 – Release 0.4

  • All features released
  • Improvements from Release 0.3

11/04/2013 – Release 0.5

  • Final version for Final Verification
  • Improvements from Release 0.4
Collated Issues From All Deployments: Collated Deployment Testing Issues

Deployment testing 1

Deployment testing 2

Deployment testing 3

Deployment testing 4

Deployment testing 5

Testing is done at the end of every iteration using test cases. Any bugs found during testing are logged in the Bug Log.

Iteration 4


Iteration 5


Iteration 6


Iteration 7


Iteration 8


Pre-User Testing 1 / Integration Testing (Iteration 9)


Iteration 9


Iteration 10

Iterations 11-14: Testing is carried out as deployment testing

Bug Log

We will be inviting our SIS experts to help us critique the usability issues in our Web application!


Heuristic evaluations will follow the How to Conduct a Heuristic Evaluation and Ten Usability Heuristics readings by Jakob Nielsen.

Using Nielsen's ten heuristics, evaluators will list as many usability issues as possible. It is less important for evaluators to worry about exactly which heuristic each usability issue falls under.

The heuristics are there to help evaluators find a variety of different types of usability issues. For each usability issue found, the evaluator should capture a description of the issue written as concisely and as specifically as possible.


We are also using Nielsen's Severity Ratings for Usability Problems in our Heuristic Evaluation.


Heuristics

1. Visibility of system status

2. Match between system and the real world

3. User control and freedom

4. Consistency and standards

5. Error prevention

6. Recognition rather than recall

7. Flexibility and efficiency of use

8. Aesthetic and minimalist design

9. Help users recognize, diagnose, and recover from errors

10. Help and documentation

Not related to any heuristic

Severity Ratings

0 = I don't agree that this is a usability problem

1 = Cosmetic problem only

2 = Minor usability problem

3 = Major usability problem

4 = Usability catastrophe
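To prioritise fixes across evaluators, severity scores for the same issue are typically combined, for example by taking the average rating per issue, in line with Nielsen's advice to aggregate ratings from several evaluators. The sketch below shows that aggregation in Python; the issue names and scores are placeholders, not findings from our evaluation.

from statistics import mean

# issue -> severity scores (0-4) from individual evaluators (placeholder values)
ratings = {
    "Issue A (placeholder)": [2, 3, 3],
    "Issue B (placeholder)": [1, 1, 2],
}

# Average each issue's scores and list the most severe issues first.
averaged = {issue: mean(scores) for issue, scores in ratings.items()}
for issue, severity in sorted(averaged.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{issue}: average severity {severity:.1f}")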


Heuristic Evaluation 1

We conducted our 1st Heuristic Evaluation, inviting our SIS experts (students who have taken or are currently taking Interaction Design and Prototyping (IS306)) to critique and evaluate the usability issues of our web application.

Users' Feedback

  1. "Log in was easy"
  2. "More confirmation message needed"
  3. "Question UI is good"
  4. "It can be even more intuitive"

For Further Actions

  1. Based on UT2 Follow Up Actions



Documentation for Heuristic Evaluation

  • Documentation
  • Results
  • Follow-up Actions


  • As our UT2 was held a few days after our midterm presentation, we decided to tackle all the usability issues together in our UT2 Follow Up Actions.

We conduct user testing to see how our stakeholders respond to the application we have built.

User Test 2

We conducted our 2nd user testing to try out our newly added features and to see how users would respond to them.

Interesting Findings

  1. Users expected the options to change their email and password to be inside the profile page

Actions Taken

  1. Made success messages more descriptive
  2. Used clearer images for buttons
  3. Moved the login button nearer to the sign-up button
  4. Refer to UT2 Follow Up Actions




Documentation for User Test 2

For User Test 2, we made use of templates taken from Usability.gov.

The website provides us with good information regarding user testing!

  • Documentation
  • Results
  • Follow-up Actions


User Test 1

We conducted our 1st user testing to try out the new features and, at the same time, to test how users would respond to the system. Until then, we had no information about how the different users would react to the system.

Interesting Findings

  1. Questions module was not intuitive enough
  2. Task matching results could be displayed in a more interactive way

Actions Taken

  1. Touched up Question module
  2. Finalized a user interface template
  3. Revamped the entire structure of how the Task module was being managed.



Documentation for User Test 1

  • Documentation
  • Results
  • Follow-up Actions