IS480 Team wiki: 2012T1 Bumblebee Project Documentation User Acceptance Test
- 1 Objectives
- 2 Methodology
- 3 Reporting Results
- 4 Subjective Evaluations
- 4.1 Navigation Impression
- 4.2 Look and Feel
- 4.3 Functions
- 4.4 Bootstrap/import file(s)
- 4.5 Add staff costs [Manage Simulation Parameters]
- 4.6 Add uncertainties [Manage Sim. Parameters]
- 4.7 Run Simulation
- 4.8 View staff schedule [in Gantt Chart]
- 4.9 Add airline requirements
- 4.10 Generate Result
- 4.11 Overall Impression
- 5 Reporting Conclusions
User Testing 1 (17 September 2012)
The goals and objectives of usability testing:
• Record and document general feedback and first impressions.
• Identify potential concerns to address regarding application usability, presentation, and navigation.
• Gather feedback on the usefulness and accuracy of the functions developed.
• Verify that the system developed matches client expectations.
User Testing Environment
Computer platform : Intel Pentium Processor
Screen resolution : 1024 x 768
Operating System : Windows XP
Set-up required : Computer date format (English (Australia)) of d/MM/YYYY
The participants will attempt to complete a set of scenarios presented to them and provide feedback regarding the usability and acceptability of the application. The participants were:
1. Kevin Choy, SATS Airline Relations Manager - Person-in-charge for this project.
2. Goh Wei Xuan, SATS Airline Relations Manager
These instructions were given to our clients:
1. Each user will be accompanied by 1 facilitator.
2. Users are encouraged to verbalize their movements, purpose, and problems.
3. Facilitators will record mistakes and questions made by users during testing.
4. To start the test, click on the file named “START.bat” found in folder named “SATS_Bumblebee_Beta_v5”.
5. All sample files needed for testing are found in: SATS_Bumblebee_Beta_v5/data
6. The database used to store imported data is also found in the root folder.
7. Users are allowed to change their input(s) to verify data validity.
8. Users are to complete the tasks stated below. After completing each task, users have to answer the test questions pertaining to the specific task.
These are the task descriptions given to the clients; users are to complete each task below.
1. Bootstrap/import file(s)
This task is for the user to import data from Excel files, such as Flight Schedule Departure, Flight Schedule Arrival, and Staff Records, into the application. The application then uses this data for simulation in a later step.
2. Add staff costs
This task is to record various costs in hiring staff into the application.
3. Add uncertainties
This task is to record the mean and standard deviation of the different uncertainties that affect the initial schedule prepared by the application. Simulation period = 7 days (the number of days of data to be generated).
4. Run simulation
Run the simulation to assign staff to the different job assignments.
5. View staff schedule (in Gantt Chart)
This allows the user to view and compare a staff member's planned and actual working times.
6. Add airline requirements
Airlines have several different requirements on the number of CSAs and CSOs needed.
This task is to record the individual requirements in the database. The input data is used for simulation in a later step.
7. Generate result
This task is to view the result generated in PDF format.
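Tasks 3 and 4 above amount to a Monte Carlo step: for each day of the 7-day simulation period, a delay is drawn from the recorded mean and standard deviation of each uncertainty. The sketch below illustrates the idea only; it is not the actual Bumblebee implementation, and the function name, flight numbers, and parameter values are hypothetical.

```python
import random

def simulate_delays(flights, mean_delay, std_delay, days=7, seed=42):
    """Sample a delay (in minutes) for each flight on each simulated day."""
    rng = random.Random(seed)
    schedule = {}
    for day in range(1, days + 1):
        for flight in flights:
            # Negative draws are clamped to zero: in this simple model a
            # flight cannot finish earlier than its planned handling window.
            delay = max(0.0, rng.gauss(mean_delay, std_delay))
            schedule[(day, flight)] = round(delay, 1)
    return schedule

# Two illustrative flights over the 7-day simulation period.
delays = simulate_delays(["SQ318", "QF2"], mean_delay=12.0, std_delay=5.0)
```

The sampled delays would then be applied to the initial schedule before staff are assigned, which is what makes the planned and actual working times in the Gantt chart (task 5) differ.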
Overall in-charge (Yosin Anggusti)
- Provide a training overview prior to usability testing
- Define usability and the purpose of usability testing to participants
Facilitators (Glorya Marie, Suriyanti)
- Evaluate the application and the user's interaction with it, rather than evaluating the user
- Observe and record user behaviour and user comments
- Respond to participants' requests for assistance
Test Observers (Yosin Anggusti)
- Silent observers
- Assist the data logger in identifying problems, concerns, coding bugs, and procedural errors
Critical errors are deviations of the output from the expected result. These errors cause the task to fail. Facilitators are to record these critical errors.
Non-critical errors are usually procedural, in which the participant does not complete a task by the most optimal means (e.g. excessive steps, initially selecting the wrong function, attempting to edit an un-editable field). These errors may not be noticed by the user, so facilitators have to record them independently.
Scenario Completion Time
The time to complete each scenario, not including subjective evaluation durations, will be recorded.
Task 1: Bootstrap/Import File(s)
Subjective evaluations regarding ease of use and satisfaction were collected via questionnaires. There were 2 participants, so results from both participants are combined or averaged where necessary.
Each of the two participants contributes 50% of the weight for a question. Not all sections were answered, so not all questions carry a total of 100% weight.
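The weighting scheme above can be made concrete with a small sketch (a hypothetical helper, not part of the application): each answered response carries 50% weight, so a question skipped by one participant represents only 50% of the total.

```python
def combine_responses(answers):
    """Average the responses that were actually given.

    Each participant carries equal (50%) weight; if a question was
    skipped, the answered weights no longer sum to 100%.
    """
    given = [a for a in answers if a is not None]
    weight = len(given) * 50  # total weight represented, in percent
    average = sum(given) / len(given) if given else None
    return average, weight

# Participant 1 rated the question 4; participant 2 skipped it.
avg, weight = combine_responses([4, None])
```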
Navigation Impression
- 'Back' button at simulation is missing
- The flow is unclear; users were not sure which button to select
- More instructions are needed for first-time users
Look and Feel
- Descriptions must be more explicit
- “Bootstrap” should be changed to “import”
- Disable ‘browse’ button when bootstrapping
Add staff costs [Manage Simulation Parameters]
Add uncertainties [Manage Sim. Parameters]
- Change the unit of measurement to "hrs + mins"
- Allow shortcut keys (e.g. [Alt + S] to start simulation)
- Inconsistent textbox format
Run Simulation
- Progress bar is not showing
- Exception handling: a null pointer exception was encountered
View staff schedule [in Gantt Chart]
- Staff schedule is incorrect
- Please add ‘flight number’ in Gantt chart
Add airline requirements
- Success message is not clearly visible
- Could include a guideline
Generate Result
- Cannot delete a PDF record
- How does this differ from "Run Simulation"?
Overall Impression
1. "What did you like best about this system?"
- Cost calculation is beneficial.
2. "What did you like least about this system?"
3. "If you could make changes to this system, what change would you make?"
4. "Do you have any questions or comments about the system or your experiences with it?"
- It could be a good tool.
• The client was satisfied with the system. However, there is more to improve in the presentation and navigation of the application.
• The clients, especially the second participant, gained a clearer understanding of what the application delivers after testing it.
• There are critical errors in the logic/formulas for the staff utilization rate and staff working hours that can be corrected to increase the accuracy of the calculations.
• Non-critical errors will also be resolved.
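For context on the utilization-rate error noted above, a common definition of staff utilization, assumed here for illustration since the application's exact formula is not documented on this page, is assigned working hours divided by rostered hours:

```python
def utilization_rate(assigned_hours, rostered_hours):
    """Fraction of a staff member's rostered time spent on job assignments."""
    if rostered_hours <= 0:
        raise ValueError("rostered_hours must be positive")
    # Cap at 1.0 so overtime does not report a rate above 100%.
    return min(assigned_hours / rostered_hours, 1.0)

# A staff member assigned 6 of a rostered 8 hours is 75% utilized.
rate = utilization_rate(assigned_hours=6.0, rostered_hours=8.0)
```

Whatever formula the application actually uses, the fix would be verified against hand-computed cases like this one during the next round of testing.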