IS480 Team wiki: 2015T2 The C-Suite User Test 3

User Test 3

Test Plan

Number of Users: 5 (the main users of the system)
Venue: Queenstown FSC
Dates: 24th February 2015 and 27th February 2015
Duration: 1 hour (planned)
Objectives:

  • Let the client get a feel for and an understanding of the application and how to use it.
  • Ensure that we are on the right track after redesigning the application.
  • Find out usability problems with the application.
  • Get a sense of how difficult the application is to use.
  • See which functions are more difficult to use than others.
  • Receive feedback on how to further customize the application to the client's liking.

Scope of Testing:

  • Adding, Updating, Viewing and Removing Members
  • Adding, Updating, Viewing and Removing Activities
  • Taking and Checking Attendance
  • Generating Statistics/Reports
  • Creating, Editing and Deleting Accounts
  • Login/Logout

Testing Design

As we were going to push for live deployment, we wanted to find out one last time whether the application suited Care Corner's needs and to look out for areas where we could still improve it. The purpose of this test was to expose the 4 Programme Heads (Early Learning Programme, Care Connection, We Are Youths and A Mother A Woman) as well as the Admin Head to the application, to check that their requirements were met to their standard, and to introduce the new Generate Statistics and Generate Report functions.

Again, in line with our aim of creating a very simple and easy to use application, one of our goals was to see whether the application was easy to understand and use. We also looked out for feedback on the design and layout of the application to ensure that we provided the best user experience for the 5 users.

As with the last user test, we chose to keep the number of participants as small as possible. We focused on just these 5 users because they are the ones in charge of overseeing the different programmes, and the small group allowed us to observe them closely as they went about the tasks. Any interesting occurrences were noted down so that we could look further into customizing the application for Care Corner.

Differences

Testing went back to a function-based approach to find out which functions were more difficult to use, so that we could improve those specific functions.

We also added a new rating system for the difficulty of performing each task, where 1 was Very Easy and 10 was Very Difficult.

Scoring

Difficulty

The new grading criterion is based on how difficult each task is, with 1 being very easy and 10 being very difficult.

  • If the rating for any task was 3 or above, we would note it as something to address in order to make the function simpler to use.
  • Any rating above 5 would be brought to the attention of the front-end and back-end developers to resolve.

User Experience

Each task could be given a maximum of 10 marks for User Experience, defined as the ease of completing the task and the usability of the system in completing it.

  • If the rating for any task was 7 or below, it would be noted down as a usability issue.
  • Any rating below 5 would be brought to the attention of the front-end and back-end developers to resolve (both sets of thresholds are illustrated in the sketch after this list).
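
To make the two sets of thresholds concrete, here is a minimal sketch (in Python; not part of our actual test kit, and the function and label names are purely illustrative) of how a task's two scores map to the follow-up actions described above.

<pre>
def triage_task(difficulty, user_experience):
    """Map a task's Difficulty score (1-10, 1 = very easy) and User Experience
    score (1-10, 10 = best) to follow-up actions, using the thresholds above.
    Hypothetical helper for illustration only."""
    actions = []
    # Difficulty: 3 and above -> try to simplify; above 5 -> escalate to developers.
    if difficulty >= 3:
        actions.append("simplify the function")
    if difficulty > 5:
        actions.append("raise difficulty with front-end/back-end developers")
    # User Experience: 7 and below -> usability issue; below 5 -> escalate to developers.
    if user_experience <= 7:
        actions.append("note as usability issue")
    if user_experience < 5:
        actions.append("raise usability with front-end/back-end developers")
    return actions

# Example: a task rated Difficulty 4 and User Experience 6 would be flagged
# for simplification and noted as a usability issue.
print(triage_task(4, 6))
</pre>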

Other Sections

A comment section was also included after every section to capture any thoughts that were not expressed verbally or through actions.

Results

Our results showed that, overall, there were no problems with the scores from the user test: the average User Experience rating was above our set mark of 7, and the average Difficulty ratings were below our set mark of 3.

Quantitative Feedback

[[Image:Difficulty Rating.png]]
[[Image:User Experience Rating.png]]

The following are the User Experience and Difficulty ratings for the tasks in the different programmes (a quick consistency check of the overall averages follows after the listings).
(Youth) We Are Youths

  • Difficulty (1 = Very Easy) - Average Per Task 1.32/10
  • User Experience (10 = Best) - Average Per Task 8.63/10

(Children) Early Learning Programme

  • Difficulty (1 = Very Easy) - Average Per Task 1.42/10
  • User Experience (10 = Best) - Average Per Task 7.79/10

(Elderly) Care Connection

  • Difficulty (1 = Very Easy) - Average Per Task 1.16/10
  • User Experience (10 = Best) - Average Per Task 9.74/10

(Parent) A Mother A Woman

  • Difficulty (1 = Very Easy) - Average Per Task 1.16/10
  • User Experience (10 = Best) - Average Per Task 9.53/10

(Volunteer) Admin Head

  • Difficulty (1 = Very Easy) - Average Per Task 1.11/10
  • User Experience (10 = Best) - Average Per Task 8.74/10
[[Image:Average Ratings.png]]

Overall

  • Difficulty (1 = Very Easy) - Average Per Task 1.23/10
  • User Experience (10 = Best) - Average Per Task 8.88/10
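
As a rough consistency check, the overall figures above line up with taking the simple mean of the five per-programme averages (they may equally have been computed over all individual task scores; this sketch, in Python, assumes the per-programme mean).

<pre>
# Per-programme averages taken from the listings above: (Difficulty, User Experience).
programme_averages = {
    "We Are Youths": (1.32, 8.63),
    "Early Learning Programme": (1.42, 7.79),
    "Care Connection": (1.16, 9.74),
    "A Mother A Woman": (1.16, 9.53),
    "Admin Head": (1.11, 8.74),
}

difficulty = [d for d, _ in programme_averages.values()]
experience = [e for _, e in programme_averages.values()]

# Simple means across the five programmes.
print(round(sum(difficulty) / len(difficulty), 2))   # 1.23 (matches the overall figure)
print(round(sum(experience) / len(experience), 2))   # 8.89 (the listing shows 8.88, so the
                                                     # overall may use the raw task scores)
</pre>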

Qualitative Feedback

  • General feedback was that the application was good and easy to use.
  • Some bugs with date formatting.
  • Some minor cosmetic issues (fixed immediately after the user test).

Verbal Feedback

  • Make instructions a little clearer.

Review

Many Tasks

As there were many tasks to complete, some users were unprepared for the long duration of the test, which might have made them feel restless.

Minor Bugs

Despite the overall success of this user test, we found that some minor bugs still existed, and we would have to fix them.

Difficulty Rating

The new rating system was not well phrased and confused some users, so we had to explain it to them.

General Chatter

We found it tough to keep participants from talking to one another during the user test. However, we feel this better simulated their working environment, where they frequently discuss things within their small office.

Unclear Instruction

Task 19 saw 4 users initially unable to complete the task. Both the tester and the users contributed to this, as instructions were included in the test case but were not clearly worded. This resulted in relatively lower scores for the last task.

Pictures

[[Image:Ut3-1.JPG|500px|center]]

[[Image:Ut3-2.JPG|500px|center]]


Test Documentations

External Testing