IS480 Team wiki: 2012T1 Bumblebee Project Documentation User Acceptance Test

User Testing 1 (on 17 September 2012)

Objectives

The goals and objectives of this usability testing are to:

• Record and document general feedback and first impressions.

• Identify any potential concerns to address regarding application usability, presentation, and navigation.

• Get feedback on the usefulness and accuracy of the functions developed.

• Confirm that the system developed matches client expectations.


Methodology

User Testing Environment

Computer platform : Intel Pentium processor

Screen resolution : 1024 x 768

Operating System : Windows XP

Set-up required : Computer date format set to English (Australia), d/MM/yyyy
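
To illustrate why the date-format setting matters, below is a minimal Java sketch of parsing a d/MM/yyyy date string strictly. It is for illustration only and is not taken from the Bumblebee code; the class name and the sample date are assumptions.

    import java.text.ParseException;
    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class DateSetupCheck {
        public static void main(String[] args) throws ParseException {
            // Assumed behavior: dates are read in d/MM/yyyy (English (Australia)) format.
            SimpleDateFormat format = new SimpleDateFormat("d/MM/yyyy");
            format.setLenient(false); // reject dates that are not valid in this format
            Date testDate = format.parse("17/9/2012");
            System.out.println("Parsed test date: " + testDate);
        }
    }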

Participants

The participants will attempt to complete a set of scenarios presented to them and provide feedback regarding the usability and acceptability of the application.

1. Kevin Choy, SATS Airline Relations Manager - the person in charge of this project.

2. Goh Wei Xuan, SATS Airline Relations Manager

Procedure

Instructions

These instructions were given to our clients:

1. Each user will be accompanied by 1 facilitator.

2. Users are encouraged to verbalize their actions, purpose, and problems as they work.

3. Facilitators will record the mistakes made and questions asked by users during testing.

4. To start the test, click on the file named “START.bat” found in the folder named “SATS_Bumblebee_Beta_v5”.

5. All sample files needed for testing are found in: SATS_Bumblebee_Beta_v5/data

6. The database used to store imported data is also found in the ROOT folder.

7. Users are allowed to change their input(s) to verify data validity.

8. Users are to complete the tasks stated below. After completing each task, users have to answer the test questions pertaining to the specific task.

Tasks

These are the task descriptions given to our clients. Users are to complete each of the tasks below.

1. Bootstrap/import files(s)

This task has the user import data from Excel files such as Flight Schedule Departure, Flight Schedule Arrival, Staff Records, etc. into the application. The application then uses these data for simulation in a later step. (An illustrative import sketch is given after this task list.)

2. Add staff costs

This task is to record the various costs of hiring staff in the application.

3. Add uncertainties

This task is to record the mean and standard deviation of the different uncertainties that will affect the initial schedule prepared by the application. Simulation period = 7 days (the number of days of data to be generated). (An illustrative uncertainty-sampling sketch is given after this task list.)

4. Run simulation

Run the simulation to start assigning staff to the different job assignments. (A toy assignment sketch is given after this task list.)

5. View staff schedule (in Gantt Chart)

This allows the user to view and compare a staff member's planned and actual working times.

6. Add airline requirements

Airlines have several different requirements on the number of CSAs and CSOs needed.

This task is to record the individual airline requirements in the database. The input data will be used for simulation in a later step.

7. Generate result

This task is to view the results generated in PDF format. (An illustrative report-generation sketch is given after this task list.)
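
For task 1, the sketch below shows one way an Excel file could be read in Java using the Apache POI library. This is an assumption for illustration only: the library, file name, and column layout are not taken from the actual Bumblebee code.

    import java.io.File;
    import org.apache.poi.ss.usermodel.Row;
    import org.apache.poi.ss.usermodel.Sheet;
    import org.apache.poi.ss.usermodel.Workbook;
    import org.apache.poi.ss.usermodel.WorkbookFactory;

    public class FlightScheduleImport {
        public static void main(String[] args) throws Exception {
            // Hypothetical file name; the real sample files are in SATS_Bumblebee_Beta_v5/data.
            Workbook workbook = WorkbookFactory.create(new File("FlightScheduleArrival.xls"));
            Sheet sheet = workbook.getSheetAt(0);
            for (Row row : sheet) {
                if (row.getRowNum() == 0) {
                    continue; // skip the header row
                }
                // Assumed columns: 0 = flight number, 1 = arrival time.
                String flightNumber = row.getCell(0).getStringCellValue();
                String arrivalTime = row.getCell(1).toString();
                System.out.println(flightNumber + " arrives at " + arrivalTime);
                // The real application would store these rows in its database here.
            }
            workbook.close();
        }
    }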
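
For task 3, an uncertainty defined by a mean and a standard deviation can be turned into simulated values by sampling from a normal distribution. The sketch below is a minimal Java example with assumed figures; it is not the team's simulation code.

    import java.util.Random;

    public class UncertaintySampler {
        public static void main(String[] args) {
            // Assumed example: a flight-delay uncertainty with mean 10 minutes
            // and standard deviation 5 minutes, simulated over 7 days.
            double meanMinutes = 10.0;
            double stdDevMinutes = 5.0;
            int simulationPeriodDays = 7;

            Random random = new Random();
            for (int day = 1; day <= simulationPeriodDays; day++) {
                // nextGaussian() draws from N(0, 1); scale and shift to N(mean, stdDev^2).
                double delay = meanMinutes + stdDevMinutes * random.nextGaussian();
                System.out.printf("Day %d: simulated delay = %.1f minutes%n", day, delay);
            }
        }
    }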
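
For task 4, the sketch below shows a toy assignment rule (give each job to the staff member with the fewest assigned hours so far). It is only a placeholder to illustrate the idea of assigning staff to jobs; the team's actual scheduling logic is not documented on this page, and all names and jobs are hypothetical.

    import java.util.ArrayList;
    import java.util.List;

    public class SimpleAssignment {
        static class Staff {
            String name;
            double assignedHours;
            Staff(String name) { this.name = name; }
        }

        public static void main(String[] args) {
            // Hypothetical staff pool and job list; the real simulation reads these
            // from the imported flight schedules and staff records.
            List<Staff> pool = new ArrayList<Staff>();
            pool.add(new Staff("Alice"));
            pool.add(new Staff("Bob"));
            String[] jobs = { "Flight A check-in", "Flight B boarding", "Flight C arrival" };
            double hoursPerJob = 2.0;

            for (String job : jobs) {
                // Greedy rule: pick the staff member with the fewest assigned hours.
                Staff least = pool.get(0);
                for (Staff s : pool) {
                    if (s.assignedHours < least.assignedHours) {
                        least = s;
                    }
                }
                least.assignedHours += hoursPerJob;
                System.out.println(job + " -> " + least.name);
            }
        }
    }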
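
For task 7, a PDF report could be produced with a library such as iText. The sketch below is an assumption for illustration: the output file name, the text, and the use of iText are not taken from the actual Bumblebee code.

    import java.io.FileOutputStream;
    import com.itextpdf.text.Document;
    import com.itextpdf.text.Paragraph;
    import com.itextpdf.text.pdf.PdfWriter;

    public class ResultReport {
        public static void main(String[] args) throws Exception {
            // Hypothetical output file and contents.
            Document document = new Document();
            PdfWriter.getInstance(document, new FileOutputStream("simulation_result.pdf"));
            document.open();
            document.add(new Paragraph("Simulation result - User Testing 1"));
            document.add(new Paragraph("Placeholder line for the generated figures."));
            document.close();
        }
    }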

Team Roles

Overall in-charge (Yosin Anggusti)

- Provides a training overview prior to the usability testing

- Explains usability and the purpose of usability testing to the participants

Facilitators (Glorya Marie, Suriyanti)

- Evaluate the application and the user's interaction with it, rather than evaluating the user

- Observe and record user behavior and comments

- Respond to participants' requests for assistance

Test Observers (Yosin Anggusti)

- Silent observers

- Assists the data logger in identifying problems, concerns, coding bugs, and procedural errors



Reporting Results

Usability Metrics

Critical Errors

Critical errors are deviations of the obtained results from the expected results; these errors cause the task to fail. Facilitators are to record these critical errors.

Non-Critical Errors

Non-critical errors are usually procedural: the participant does not complete a task in the most efficient manner (e.g. taking excessive steps, initially selecting the wrong function, attempting to edit an un-editable field). These errors may not be noticed by the participant, so facilitators have to record them independently.

Scenario Completion Time

The time to complete each scenario, not including subjective evaluation durations, will be recorded.



Reporting Results

tba



Subjective Evaluations

Subjective evaluations regarding ease of use and satisfaction were collected via questionnaires. With two participants, each response contributes 50% to the totals, and results from both participants are combined or averaged where necessary. For example, if one participant agrees with a statement and the other is neutral, the statement scores 50% Agree and 50% Neutral. Not all sections were answered by both participants, so not every question sums to 100%.

Navigation Impression (Agree / Neutral / Disagree)

- It is easy to find my way around the system: 50%
- It is easy to remember where to find things: 50%
- The system is well-suited to first-time users: 50%

Comment(s):

- The ‘Back’ button at the simulation step is missing.

- Provide a clearer flow; it is not always clear which button to select.

- More instructions are needed for first-time users.

Look and Feel (Agree / Neutral / Disagree)

- The interface design is simple: 100%
- The size and layout of the application is optimal: 100%

Functions (Agree / Neutral / Disagree)

- Each function has a clear purpose: 50% / 50%

Comment(s):

- The function descriptions must be more explicit.



Reporting Conclusions

• The client was satisfied with the system; however, the presentation and navigation of the application can still be improved.

• The client representatives, especially the second participant, gained a better and clearer understanding of what the application delivers after testing it.

• There are critical errors in the logic/formulas for the staff utilization rate and staff working hours; correcting them will increase the accuracy of the calculations. (One common definition of the utilization rate is sketched below.)

• Non-critical errors will also be fixed.
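
The team's actual utilization formula is not documented on this page; for reference only, the Java sketch below uses one common definition, actual working hours divided by planned (rostered) working hours, with hypothetical figures.

    public class UtilizationRate {
        // Assumed definition (not necessarily the team's formula):
        // utilization = actual working hours / planned (rostered) working hours.
        static double utilization(double actualHours, double plannedHours) {
            if (plannedHours <= 0) {
                return 0.0; // no planned hours, so the rate is undefined; report 0
            }
            return actualHours / plannedHours;
        }

        public static void main(String[] args) {
            // Hypothetical example: 6.5 hours worked out of an 8-hour planned shift.
            double rate = utilization(6.5, 8.0);
            System.out.printf("Utilization rate: %.1f%%%n", rate * 100);
        }
    }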
