
IS480 Team wiki: 2015T2 The C-Suite User Test 4



User Test 4

Test Plan

Number of Users: 8 Testers (4 Heads and 4 others)
Venue: Queenstown FSC
Date: 25th March 2015
Duration: 1 hour planned
Objectives:

  • Let the client get a feel for the application and an understanding of how to use it.
  • Ensure that we are on the right track after redesigning the application.
  • Find out usability problems with the application.
  • Get a sense of how difficult the application is to use.
  • See which functions are more difficult than others.
  • Receive feedback on how to further customize the application to the client's liking.

Scope of Testing:

  • Adding, Updating, Viewing and Removing Functions for Members
  • Adding, Updating, Viewing and Removing Functions for Activities
  • Taking and Checking Attendance
  • Generating Statistics/Reports
  • Creating, Editing and Deleting Accounts
  • Login/Logout
  • Inactive Member Notification

Testing Design

Now that live deployment is up and most of the functions have been exposed to the staff, we wanted the 4 Programme Heads (Early Learning Programme, Care Connection, We Are Youths and A Mother A Woman) as well as 4 other staff members to try the new, friendlier user interface so that we could see how they have been using the application. The main aim was to see whether the 4 Heads had changed their behaviour when using the application, and to observe how the newer users were using it. Again, we wanted to see if their requirements were met and up to their standard, as well as to introduce the new Inactive Member notification function.

As per our previous User Tests, we also looked out for feedback regarding the design and layout of the application to ensure that we were able to provide the best user experience for the 8 users.

What was different about this User Test was that we had 8 users to simulate a more realistic environment and to stress test the system. We focused more on the 4 new testers to see how they used the application, noting any interesting occurrences so that we could look into further customizing the application for Care Corner. We also had each of the 4 new users do the test case for their own programme (1 user for each programme), whom we term the 2nd tester (e.g. Children 2, Youth 2).

Scoring

Difficulty

What is new is the grading criterion based on how difficult each task is, with 1 being very easy and 10 being very difficult.

  • If any task was rated above 3, we would note it as something to address in order to make the function simpler to use.
  • Any rating above 5 would be brought to the attention of the front-end and back-end developers to resolve.

User Experience

Each task could be given a maximum of 10 marks for User Experience, which was defined as the ease of completing the task and the usability of the system while completing it.

  • If any task was rated below 7, it would be noted down as a usability issue.
  • Any rating below 5 would be brought to the attention of the front-end and back-end developers to resolve (a sketch of these flagging rules follows below).
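
The two sets of thresholds above amount to simple flagging rules. The following is a minimal sketch of how per-task scores could be screened against them; the function name and data layout are illustrative assumptions, not part of our actual tooling.

  # Hypothetical helper for screening per-task ratings against our thresholds.
  # Difficulty: above 3 is noted for simplification, above 5 is escalated.
  # User Experience: below 7 is a usability issue, below 5 is escalated.
  def screen_ratings(task, difficulty, user_experience):
      issues = []
      if difficulty > 5:
          issues.append((task, "difficulty", "escalate to front-end/back-end developers"))
      elif difficulty > 3:
          issues.append((task, "difficulty", "note for simplification"))
      if user_experience < 5:
          issues.append((task, "user experience", "escalate to front-end/back-end developers"))
      elif user_experience < 7:
          issues.append((task, "user experience", "note as usability issue"))
      return issues

  # Example: a task rated 4 for difficulty and 6 for user experience trips both rules.
  print(screen_ratings("Add Member", 4, 6))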

Other Sections

Comment sections were also included after every section to capture any thoughts that were not voiced or shown through actions. We also asked for an Overall Rating for the application.

Results

Our results showed that, overall, there were no problems with the scores from the user test. The average User Experience rating was above our set mark, and the difficulty ratings were below the set mark as well. In fact, there was an increase in overall User Experience as compared to User Test 3.
However, we noted that many people were a little confused over the Inactive Member function and what it was meant to do, which affected the difficulty rating. This could also be attributed to the new testers, who are relatively new to the system. Furthermore, Tester (Children 2) had an average User Experience score of 6.90, which was below the mark we projected.

Quantitative Feedback

Difficulty Rating 4.png


User Experience 4.png


The following are the User Experience and Difficulty ratings for the tasks in the different programmes; a short note on how the averages are computed follows the overall scores.

(Children) Early Learning Programme

  • Tester 1
    • Difficulty (1 = Very Easy) - Average Per Task 1.33/10
    • User Experience (10 = Best) - Average Per Task 9.62/10
  • Tester 2
    • Difficulty (1 = Very Easy) - Average Per Task 1.90/10
    • User Experience (10 = Best) - Average Per Task 6.90/10

(Youth) We Are Youths

  • Tester 1
    • Difficulty (1 = Very Easy) - Average Per Task 1.86/10
    • User Experience (10 = Best) - Average Per Task 8.95/10
  • Tester 2
    • Difficulty (1 = Very Easy) - Average Per Task 1/10
    • User Experience (10 = Best) - Average Per Task 10/10

(Parent) A Mother A Woman

  • Tester 1
    • Difficulty (1 = Very Easy) - Average Per Task 1.19/10
    • User Experience (10 = Best) - Average Per Task 9.67/10
  • Tester 2
    • Difficulty (1 = Very Easy) - Average Per Task 2.38/10
    • User Experience (10 = Best) - Average Per Task 8.38/10

(Elderly) Care Connection

  • Tester 1
    • Difficulty (1 = Very Easy) - Average Per Task 1.33/10
    • User Experience (10 = Best) - Average Per Task 9.52/10
  • Tester 2
    • Difficulty (1 = Very Easy) - Average Per Task 1/10
    • User Experience (10 = Best) - Average Per Task 9.81/10


Overall Scores.png


Overall

  • Difficulty (1 = Very Easy) - Average Per Task 1.50/10
  • User Experience (10 = Best) - Average Per Task 9.11/10
  • Overall Rating - Average 8.63/10
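
The average-per-task figures above are simple means over a tester's individual task ratings, and the overall figures average across all 8 testers. A minimal sketch using made-up numbers purely for illustration (the real scores came from the test sheets):

  # Hypothetical per-task ratings for one tester, each out of 10.
  difficulty_ratings = [1, 2, 1, 3, 1, 2, 1]
  experience_ratings = [10, 9, 10, 9, 10, 10, 9]

  avg_difficulty = sum(difficulty_ratings) / len(difficulty_ratings)
  avg_experience = sum(experience_ratings) / len(experience_ratings)

  print(f"Difficulty (1 = Very Easy) - Average Per Task {avg_difficulty:.2f}/10")
  print(f"User Experience (10 = Best) - Average Per Task {avg_experience:.2f}/10")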

Qualitative Feedback

  • General feedback was that the application is good and easy to use.
  • Some bugs with regard to date formatting.
  • Some minor cosmetic issues (edited immediately after the user test).

Verbal Feedback

  • Make instructions a little clearer.

Review

Cosmetic Bugs

There were hardly any bugs, apart from some minor cosmetic ones.

Slow Server

The testers experienced quite a bit of lag. This did not occur during our concurrency test.

Inactive Member

As this function was introduced by our main sponsor, many testers did not know about it or its purpose. This led to confusion, and some were unable to complete the task until we explained the idea behind it.

Guidance

We had to help some users understand parts of the instructions as they had never done a user test before, so their scores might have been affected. We may create a user guide to address this.

Perfect Score

We received a perfect score from one new participant, who felt that the application was easy to use.

Pictures

Image5.JPG


Image6.JPG


Image8.JPG


Test Documentations

External Testing