
IS480 Team wiki: 2015T2 The C-Suite User Test 2



User Test 2

Test Plan

Number of Users: 5 (the main users of the system)
Venue: Queenstown FSC
Date: 21st January 2015
Duration: 30 minutes planned; 1 hour actual
Objectives:

  • Let the client get a feel for the application and an understanding of how to use it.
  • Ensure that we are on the right track after redesigning the application.
  • Uncover usability problems in the application.
  • Receive feedback on how to further customize the application to the client's liking.

Scope of Testing:

  • Adding, Updating and Removing Members
  • Adding, Updating and Removing Activities
  • Taking and Checking Attendance

Testing Design

Following the arrival of the 4 Programme Heads from Care Corner and their new input, we carried out a major overhaul and redesigned the application according to the new user requirements. The purpose of this test was to expose the 4 Programme Heads (Early Learning Program, Care Connection, We Are Youths and A Mother A Woman) as well as the Admin Head to the redesigned application, and to see whether the new requirements were met and up to their standard.

In line with our aim of creating a simple and easy-to-use application, one of our goals was to see whether the application was easy to understand and use. We also looked out for feedback on the design and layout of the application, to ensure that we could provide the best user experience for the 5 users.

As with the last user test, we kept the number of participants small. We focused on just these 5 users, both because they would be in charge of overseeing the different programmes and because a small group allowed us to observe them closely as they went about the tasks. Any interesting occurrence was noted down so that we could further explore customizing the application for Care Corner.

The test was designed with 3 tasks simulating a realistic scenario that could occur at work. The scenario took the user through Adding, Updating and Removing a Member; Adding, Updating and Removing an Activity; and Taking and Checking Attendance.

  • Task 1 focuses on Adding Member and Activity, Searching and Viewing Member and Activity.
  • Task 2 focuses on Updating both Member and Activity, Searching and Viewing Member and Activity.
  • Task 3 focuses on Taking and Checking Attendance, Removing Activity and Member as well as Searching and Viewing Member and Activity.

Each task was scored out of a maximum of 10 marks for User Experience, defined as the ease of completing the task and the usability of the system in doing so.

  • Any task rated 7 or below was noted down as a usability issue, because we wanted the application to be as easy to use as possible.
  • Any rating below 5 was brought to the attention of the front-end and back-end developers to resolve.

A comment section was also provided after every section to capture thoughts that were not expressed verbally or through actions.
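The scoring rules above can be sketched in code. This is a hypothetical illustration of the thresholds described in this section (8 per task, 24/30 overall, usability-issue and escalation cut-offs); the `evaluate` helper is ours and not part of the actual application.

```python
PASS_PER_TASK = 8   # target score for each task
PASS_OVERALL = 24   # target overall score (out of 30)

def evaluate(task_scores):
    """Classify a participant's three task ratings (each out of 10)."""
    # Tasks rated 7 or below are noted as usability issues.
    issues = [i + 1 for i, s in enumerate(task_scores) if s <= 7]
    # Ratings below 5 are escalated to the developers.
    escalate = [i + 1 for i, s in enumerate(task_scores) if s < 5]
    passed = (all(s >= PASS_PER_TASK for s in task_scores)
              and sum(task_scores) >= PASS_OVERALL)
    return {"total": sum(task_scores),
            "usability_issues": issues,
            "escalate": escalate,
            "passed": passed}

print(evaluate([9, 8, 8]))   # no issues, total 25, passed
print(evaluate([8, 4, 9]))   # task 2 is both a usability issue and escalated
```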

Results

4 out of 5 users were unable to complete the tasks successfully within the given time frame, due to a concurrency problem in the application. We tried to fix the problem on the spot but only succeeded in allowing 1 more user to complete the tasks.

However, the remaining 3 users were able to complete most, but not all, of the tasks.

Ut2total.png

With a score of 8 needed per task and an overall score of 24/30 as our target, only 1 out of 5 users passed the test.

Quantitative Feedback

Ut2results.png

The following are the User Experience ratings for each task across the different programmes.

  • We Are Youths Programme - 23/30
    • Task 1 - 10/10, Task 2 - 5/10, Task 3 - 8/10
  • Early Learning Programme - 23/30
    • Task 1 - 8/10, Task 2 - 8/10, Task 3 - 7/10
  • Care Connection - 22/30
    • Task 1 - 8/10, Task 2 - 8/10, Task 3 - 6/10
  • A Mother A Woman - 25/30
    • Task 1 - 10/10, Task 2 - 5/10, Task 3 - 10/10
  • Admin Head - 23/30
    • Task 1 - 9/10, Task 2 - 7/10, Task 3 - 7/10

Qualitative Feedback

Member Functions

We Are Youths

  • 2 repeated fields for contact number.
  • Additional fields for an extra parent needed.
  • Confirmation box needed for deleting members.
  • Easy to add member.

Early Learning Program

  • Additional fields for an extra parent needed.
  • Very straightforward to add member.
  • Easy and straightforward to update member.
  • Easy to remove member.

Care Connection

  • 2 repeated fields for contact number.
  • A member's contact number can be more than 8 digits.
  • Confirmation box needed for deleting members.

A Mother A Woman

  • Only one field for a Child; a problem arises when there are more children.
  • Adding a member was easy.

Activity Functions

We Are Youths

  • Couldn't update activity.
  • Confirmation box needed for deleting activities.

Early Learning Program

  • Very straightforward to add activity.
  • Couldn't update activity.
  • Easy to delete activity.

Care Connection

  • Confirmation box needed for deleting activities.

A Mother A Woman

  • Couldn't update activity.

Attendance Function

Early Learning Program

  • Couldn't view attendance page.

Care Connection

  • Couldn't view attendance page.
  • Default number of guests should be "0" and not "1".


Verbal Feedback

  • Overall usability is simple and easy to use.
  • Will show it to colleagues.
  • High likelihood that more fields would be needed.

Review

Concurrency

The concurrency issue severely affected the results of this test. User Experience suffered and scores automatically dropped, which undermined the credibility of our findings on how easy the application is to use.
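The kind of failure we hit can be illustrated with a small, hypothetical sketch (not our actual application code): when several users update shared state at the same time, an unsynchronized read-modify-write can silently lose updates, so the operation has to be serialized, for example with a lock.

```python
import threading

# Hypothetical illustration of a concurrency problem: 5 "users" each mark
# attendance 10,000 times. Without the lock, the read-modify-write below
# can interleave and lose updates; with the lock, every update is counted.

attendance_count = 0
lock = threading.Lock()

def mark_attendance(times):
    global attendance_count
    for _ in range(times):
        with lock:  # serialize the read-modify-write
            current = attendance_count
            attendance_count = current + 1

threads = [threading.Thread(target=mark_attendance, args=(10_000,))
           for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(attendance_count)  # 50000 with the lock; often less without it
```

In a web application the same idea usually takes the form of database transactions or row-level locking rather than an in-process lock, but the principle is identical.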

Scenario Base Testing

Many of the users were confused by the storyline, with most saying that they are more used to things happening in reality than on paper. This also affected their user experience with the application. In addition, merging tasks together into a bigger scenario made it hard to analyze exactly which specific tasks were genuinely difficult.

Score Design

With the aforementioned problems clearly showing that User Experience is very subjective, we need to separate the ease of using the application from the experience of the test itself.

Interesting Occurrences

We overheard one user mention the idea of adding not just activities but whole new programmes. We brought this up with the users after the test and discovered that we would have to redesign the application to accommodate these new requirements.

Pictures

UT2-1.jpg


UT2-2.jpg


UT2-3.jpg


UT2-4.jpg


UT2-5.jpg


Test Documentation

External Testing