IS480 Team wiki: 2012T1 Bumblebee Project Documentation User Acceptance Test


Revision as of 18:24, 9 October 2012



User Testing 1 (17 September 2012)


The goals and objectives of usability testing:

• Record and document general feedback and first impressions.

• Identify any potential concerns regarding application usability, presentation, and navigation.

• Gather feedback on the usefulness and accuracy of the functions developed.

• Verify that the developed system matches client expectations.




User Testing Environment

Computer platform : Intel Pentium Processor

Screen resolution : 1024 x 768

Operating System : Windows XP

Set-up required : Computer date format set to English (Australia), d/MM/YYYY


The participants will attempt to complete a set of scenarios presented to them and provide feedback on the usability and acceptability of the application.

1. Kevin Choy, SATS Airline Relations Manager - Person-in-charge for this project.

2. Goh Wei Xuan, SATS Airline Relations Manager



These instructions were given to our clients:

1. Each user will be accompanied by 1 facilitator.

2. Users are encouraged to verbalize their movements, purpose, and problems.

3. Facilitators will record mistakes made and questions asked by users during testing.

4. To start the test, click on the file named “START.bat” found in folder named “SATS_Bumblebee_Beta_v5”.

5. All sample files needed for testing are found in: SATS_Bumblebee_Beta_v5/data

6. The database used to store imported data is also found in the ROOT folder.

7. Users are allowed to change their input(s) to verify data validity.

8. Users are to complete the tasks stated below. After completing each task, users have to answer the test questions pertaining to the specific task.


These are the task descriptions given to the clients. Users are to complete the tasks below.

1. Bootstrap/import file(s)

This task has the user import data from Excel files, such as Flight Schedule Departure, Flight Schedule Arrival, and Staff Records, into the application. The application then uses these data for simulation in a later step.
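The actual application imports Excel workbooks; the sketch below uses CSV and the standard library to show the same import-and-validate flow. The column names are hypothetical, since the real workbook layout is not documented here.

```python
import csv
import io

# Hypothetical required columns; the real Flight Schedule layout is not shown here.
REQUIRED_COLUMNS = ["flight_no", "scheduled_time", "staff_needed"]

def import_rows(fh):
    """Read rows from an open file-like object and validate the header."""
    reader = csv.DictReader(fh)
    missing = [c for c in REQUIRED_COLUMNS if c not in (reader.fieldnames or [])]
    if missing:
        raise ValueError(f"missing columns: {missing}")
    return list(reader)

# A one-row sample file, in place of a real Excel export
sample = io.StringIO("flight_no,scheduled_time,staff_needed\nSQ318,08:00,4\n")
rows = import_rows(sample)
```

Rejecting files with missing columns up front matches the testers' later recommendation to make the import step more transparent to the user.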

2. Add staff costs

This task is to record the various costs of hiring staff in the application.

3. Add uncertainties

This task is to record the mean and standard deviation of the different uncertainties that affect the initial schedule prepared by the application. Simulation period = 7 days (the number of days for which data is to be generated).
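A minimal sketch of how a mean and standard deviation can drive a 7-day simulation, assuming the uncertainties are modelled as normally distributed (the report does not state the actual distribution used):

```python
import random

def sample_uncertainty(mean, std_dev, days=7, seed=1):
    """Draw one value per simulated day from a normal distribution
    defined by the recorded mean and standard deviation."""
    rng = random.Random(seed)  # fixed seed keeps runs reproducible
    return [rng.gauss(mean, std_dev) for _ in range(days)]

# e.g. a hypothetical flight-delay uncertainty: mean 10 min, std dev 3 min
delays = sample_uncertainty(10.0, 3.0)
```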

4. Run simulation

Run the simulation to start assigning staff to different job assignments.
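The report does not describe the assignment algorithm, so the following is only an illustrative greedy sketch: jobs sorted by start time each take the first free staff member.

```python
def assign_staff(jobs, staff):
    """Greedy sketch: jobs are (start, end, name) tuples; each job,
    in start-time order, is given the first available staff member."""
    free = list(staff)
    busy = {}   # staff member -> end time of their current job
    plan = {}   # job name -> assigned staff member (or None)
    for start, end, job in sorted(jobs):
        # release staff whose previous job has ended
        for s, t in list(busy.items()):
            if t <= start:
                free.append(s)
                del busy[s]
        if free:
            s = free.pop(0)
            plan[job] = s
            busy[s] = end
        else:
            plan[job] = None  # no one available: job is left unstaffed
    return plan

plan = assign_staff([(0, 2, "A"), (1, 3, "B"), (2, 4, "C")], ["s1", "s2"])
```

Here "s1" finishes job A at time 2 and is reused for job C, so two staff members cover three overlapping jobs.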

5. View staff schedule (in Gantt Chart)

This allows the user to view and compare a staff member's planned and actual working times.
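The planned-versus-actual comparison behind the Gantt chart can be sketched as a simple delta of start and end times; the minutes-since-midnight representation is an assumption for illustration:

```python
def shift_deviation(planned, actual):
    """Compare a staff member's planned and actual working times.
    planned/actual: (start, end) in minutes since midnight.
    Positive deltas mean the actual time was later than planned."""
    start_delta = actual[0] - planned[0]
    end_delta = actual[1] - planned[1]
    return start_delta, end_delta

# planned 08:00-16:00, actually worked 08:10-16:15
deviation = shift_deviation((480, 960), (490, 975))
```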

6. Add airline requirements

Airlines have different requirements on the number of CSA and CSO staff needed.

This task is to record the individual requirements in the database. The input data will be used for simulation in a later step.

7. Generate result

This task is to view the generated result in PDF format.

Team Roles

Overall in-charge (Yosin Anggusti)

- Provides a training overview prior to usability testing

- Defines usability and the purpose of usability testing to participants

Facilitators (Glorya Marie, Suriyanti)

- Evaluate the application and the user's interaction with it, rather than evaluating the user

- Observe and record user behavior and comments

- Respond to participants' requests for assistance

Test Observers (Yosin Anggusti)

- Observe silently

- Assist the data logger in identifying problems, concerns, coding bugs, and procedural errors


Reporting Results

Usability Metrics

Critical Errors

Critical errors are deviations of the obtained result from the expected result. These errors cause the task to fail. Facilitators are to record these critical errors.

Non-Critical Errors

Non-critical errors are usually procedural: the participant does not complete a task by the most efficient means (e.g. taking excessive steps, initially selecting the wrong function, attempting to edit an un-editable field). Users may not notice these errors themselves, so facilitators have to record them independently.

Scenario Completion Time

The time to complete each scenario, not including subjective evaluation durations, will be recorded.
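A minimal sketch of how scenario completion time can be captured with a monotonic clock; the scenario itself is represented by an arbitrary callable:

```python
import time

def timed_scenario(task):
    """Run one scenario callable and return (result, completion_seconds).
    perf_counter is monotonic, so wall-clock adjustments cannot skew the timing."""
    start = time.perf_counter()
    result = task()
    return result, time.perf_counter() - start

result, seconds = timed_scenario(lambda: "done")
```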


Reporting Results

Task 1: Bootstrap/Import File(s)

Critical Errors

- Concerns: --
- Recommendations: --

Non-Critical Errors

- Concerns:
  - The window used to collect bootstrap data blocked the "Bootstrap Data" page.
  - Text fields are not cleared after the bootstrap is started.
  - The progress bar is not implemented (it stays at 1%).
- Recommendations:
  - On the homepage, show the system name instead of "Welcome...".
  - Change "Bootstrap" to "Import".
  - Show the whole process in one view (e.g. a breadcrumb).
  - Disable all Browse and Bootstrap buttons while importing.

Scenario Completion Time: 4 minutes


Subjective Evaluations

Subjective evaluations regarding ease of use and satisfaction were collected via questionnaires. With two participants, results are combined or averaged where necessary: each participant's answer carries a 50% weight. Not all sections were answered, so some questions total less than 100%.
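The 50%-per-participant weighting can be made concrete with a small aggregation sketch; skipped answers are modelled as None, which is why some rows below sum to less than 100%:

```python
def aggregate(answers):
    """Combine per-question answers from the two participants.
    Each participant carries a 50% weight; None means the participant
    skipped the question, so totals can fall short of 100%."""
    weight = 1.0 / 2  # two participants, equal weight
    totals = {"agree": 0.0, "neutral": 0.0, "disagree": 0.0}
    for answer in answers:
        if answer is not None:
            totals[answer] += weight
    return totals

# one participant disagreed, the other skipped the question
row = aggregate(["disagree", None])
```

This reproduces rows like "It is easy to find my way around the system: -- / -- / 50%".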

Navigation Impression Agree Neutral Disagree
It is easy to find my way around the system -- -- 50%
It is easy to remember where to find things -- 50% --
The system is well-suited to first-time users -- -- 50%


- The 'Back' button on the simulation page is missing.

- Provide a clear flow; users were unsure which button to select.

- More instructions are needed for first-time users.

Look and Feel Agree Neutral Disagree
The interface design is simple 100% -- --
The size and layout of the application is optimal -- 100% --
Functions Agree Neutral Disagree
Each function has a clear purpose 50% 50% --


- Function descriptions must be more explicit.

Bootstrap/import file(s) Agree Neutral Disagree
The function works well 100% -- --
The function takes a reasonable amount of time 100% -- --
The function provides the right amount of information 100% -- --
The result/outcome of the function is correct 100% -- --


- "Bootstrap" should be changed to "Import".

- Disable the 'Browse' button while bootstrapping.


Reporting Conclusions

• The client was satisfied with the system. However, the presentation and navigation of the application can still be improved.

• The clients, especially the second participant, gained a clearer understanding of what the application delivers after testing it.

• There are critical errors in the logic/formulae for the staff utilization rate and staff working hours; fixing them will increase the accuracy of the calculations.

• Non-critical errors will also be addressed.
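The report does not show the utilization formula that was flagged as erroneous, so the sketch below is only one plausible definition (worked time as a fraction of rostered time), useful as a baseline when correcting the real calculation:

```python
def utilization_rate(worked_hours, rostered_hours):
    """One plausible definition of staff utilization: the fraction of
    rostered time actually worked. Illustrative only; the application's
    actual (erroneous) formula is not reproduced in this report."""
    if rostered_hours <= 0:
        return 0.0  # guard against empty rosters and division by zero
    return worked_hours / rostered_hours

rate = utilization_rate(6.0, 8.0)
```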
