IS480 Team wiki: 2012T1 Bumblebee Project Documentation User Acceptance Test
Revision as of 22:25, 9 October 2012
Contents
- 1 Objectives
- 2 Methodology
- 3 Usability Metrics
- 4 Reporting Results
- 5 Subjective Evaluations
  - 5.1 Navigation Impression
  - 5.2 Look and Feel
  - 5.3 Functions
  - 5.4 Bootstrap/import file(s)
  - 5.5 Add staff costs [Manage Simulation Parameters]
  - 5.6 Add uncertainties [Manage Sim. Parameters]
  - 5.7 Run Simulation
  - 5.8 View staff schedule [in Gantt Chart]
  - 5.9 Add airline requirements
  - 5.10 Generate Result
  - 5.11 Overall Impression
- 6 Reporting Conclusions
Objectives
User Testing 1 (17 September 2012)
The goals and objectives of usability testing:
- Record and document general feedback and first impressions
- Identify any potential concerns regarding application usability, presentation, and navigation
- Gather feedback on the usefulness and accuracy of the functions developed
- Confirm that the developed system matches client expectations
Methodology
User Testing Environment
- Computer platform: Intel Pentium processor
- Screen resolution: 1024 x 768
- Operating System: Windows XP
- Set-up required: computer date format (English (Australia)) of d/MM/YYYY

Participants
The participants will attempt to complete a set of scenarios presented to them and provide feedback regarding the usability and acceptability of the application.
1. Kevin Choy, SATS Airline Relations Manager (person-in-charge for this project)
2. Goh Wei Xuan, SATS Airline Relations Manager

Procedure

Instructions
These instructions were given to our clients:
1. Each user will be accompanied by one facilitator.
2. Users are encouraged to verbalize their movements, purpose, and problems.
3. Facilitators will record mistakes made and questions asked by users during testing.
4. To start the test, click on the file named "START.bat" in the folder named "SATS_Bumblebee_Beta_v5".
5. All sample files needed for testing are found in SATS_Bumblebee_Beta_v5/data.
6. The database used to store imported data is also found in the ROOT folder.
7. Users are allowed to change their input(s) to verify data validity.
8. Users are to complete the tasks stated below. After completing each task, users answer the test questions pertaining to that task.

Tasks
These are the task descriptions given to clients:
1. Bootstrap/import file(s): Import data from Excel files such as Flight Schedule Departure, Flight Schedule Arrival, and Staff Records into the application. The application then uses this data for simulation in a later step.
2. Add staff costs: Record the various costs of hiring staff into the application.
3. Add uncertainties: Record the mean and standard deviation of the different uncertainties that will affect the initial schedule prepared by the application. Simulation period = 7 days (the number of days of data to be generated).
4. Run simulation: Run the simulation to start assigning staff to different job assignments.
5. View staff schedule (in Gantt chart): View and compare a staff member's planned and actual working time.
6. Add airline requirements: Airlines have different requirements on the number of CSA and CSO needed. Record the individual requirements into the database; the input data is used for simulation in a later step.
7. Generate result: View the generated result in PDF format.

Team Roles
Overall in-charge (Yosin Anggusti)
- Provides a training overview prior to usability testing
- Defines usability and the purpose of usability testing to participants
Facilitators (Glorya Marie, Suriyanti)
- Evaluate the application and the user's interaction with it, rather than evaluating the user
- Observe and record user behavior and comments
- Respond to participants' requests for assistance
Test observers (Yosin Anggusti)
- Silent observers
- Assist the data logger in identifying problems, concerns, coding bugs, and procedural errors
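The bootstrap/import task above loads flight and staff data from spreadsheet files into the application's bundled database. The sketch below illustrates the idea with Python's standard library, using CSV input and SQLite; the real application imports Excel workbooks from its data folder, and the table name and columns used here are illustrative assumptions, not the application's actual schema.

```python
import csv
import io
import sqlite3

def bootstrap_staff_records(conn, csv_text):
    """Load staff records into a local database.

    Hypothetical sketch: the actual Bumblebee import reads Excel files
    (Flight Schedule Departure/Arrival, Staff Records, etc.); the 'staff'
    table and its columns are assumptions for illustration only.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staff "
        "(staff_id TEXT PRIMARY KEY, name TEXT, role TEXT)"
    )
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    # Named placeholders pull values straight from each DictReader row.
    conn.executemany(
        "INSERT OR REPLACE INTO staff (staff_id, name, role) "
        "VALUES (:staff_id, :name, :role)",
        rows,
    )
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
sample = "staff_id,name,role\nS001,Alice,CSA\nS002,Bob,CSO\n"
count = bootstrap_staff_records(conn, sample)
```

Keeping the import idempotent (`INSERT OR REPLACE`) lets testers re-run the bootstrap step without duplicating rows, which matches instruction 7 above (users may change inputs to verify data validity).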
Usability Metrics

Critical Errors
Critical errors are deviations of results from the expected result; these errors cause the task to fail. Facilitators are to record these critical errors.

Non-Critical Errors
Non-critical errors are usually procedural: the participant does not complete a task by the most optimal means (e.g. excessive steps, initially selecting the wrong function, attempting to edit a non-editable field). These errors may not be detected by the user, so facilitators must record them independently.

Scenario Completion Time
The time to complete each scenario, excluding subjective evaluation durations, will be recorded.
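The three metrics above can be tallied directly from a facilitator's log. This is a minimal sketch of such a tally; the log format and field names are assumptions for illustration, since the team's actual recording sheets are not reproduced in this report.

```python
def summarize(log_entries):
    """Return (critical, non_critical, completion_seconds) for one scenario.

    Hypothetical log format: each entry is a dict with a 'kind'
    ('step', 'critical', or 'non-critical') and a 'seconds' duration.
    """
    critical = sum(1 for e in log_entries if e["kind"] == "critical")
    non_critical = sum(1 for e in log_entries if e["kind"] == "non-critical")
    # Completion time counts only task steps, excluding the
    # subjective-evaluation questionnaire, per the metric definition.
    completion = sum(e["seconds"] for e in log_entries if e["kind"] == "step")
    return critical, non_critical, completion

scenario_log = [
    {"kind": "step", "seconds": 40},
    {"kind": "non-critical", "seconds": 0},  # wrong button first, recovered
    {"kind": "step", "seconds": 25},
    {"kind": "critical", "seconds": 0},      # result deviated from expected
]
crit, noncrit, secs = summarize(scenario_log)
```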
Reporting Results

Results were recorded per task:
Task 1: Bootstrap/Import File(s)
Task 2: Add Staff Costs
Task 3: Add Uncertainties
Task 4: Run Simulation
Task 5: Generate Result
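Task 3 records a mean and standard deviation for each uncertainty, which the simulator then uses to perturb the planned schedule over the 7-day simulation period. The sketch below shows one plausible way such parameters could drive a simulation, using normally distributed delays; exactly how the real simulator applies them is not documented here, and all names are illustrative.

```python
import random

def simulate_task_minutes(planned, mean_delay, std_delay, days=7, rng=None):
    """Perturb a planned task duration with a normally distributed delay,
    once per simulated day.

    'mean_delay' and 'std_delay' stand in for the mean and standard
    deviation entered in the Add Uncertainties step (an assumption about
    how the application uses them). Durations are clamped at zero.
    """
    rng = rng or random.Random()
    return [
        max(0.0, planned + rng.gauss(mean_delay, std_delay))
        for _ in range(days)
    ]

rng = random.Random(42)  # fixed seed so a test run is reproducible
week = simulate_task_minutes(planned=45.0, mean_delay=5.0, std_delay=2.0, rng=rng)
```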
Subjective Evaluations
Subjective evaluations regarding ease of use and satisfaction were collected via questionnaires. There are two participants, so results from both are combined or averaged where necessary; each participant's answers carry a 50% weight. Not all sections were answered, so not every question totals 100% weight.
Navigation Impression
Comment(s):
- 'Back' button at simulation is missing
- Have a flow; not sure which button to select
- Need more instructions for first-time users

Look and Feel
Comment(s): NA

Functions
Comment(s):
- Must be more explicit in descriptions

Bootstrap/import file(s)
Comment(s):
- "Bootstrap" should be changed to "import"
- Disable 'browse' button while bootstrapping

Add staff costs [Manage Simulation Parameters]
Comment(s): NA

Add uncertainties [Manage Sim. Parameters]
Comment(s):
- Change unit of measurement to "hrs + mins"
- Allow shortcut keys (e.g. [Alt + S] to start simulation)

Run Simulation
Comment(s):
- Inconsistent textbox format
- Progress bar is not showing
- Exception handling: a null pointer exception was encountered

View staff schedule [in Gantt Chart]
Comment(s):
- Staff schedule is incorrect
- Please add 'flight number' in the Gantt chart

Add airline requirements
Comment(s):
- Success message not displayed properly
- Could provide a guideline

Generate Result
Comment(s):
- Cannot delete PDF record
- How does this differ from "run simulation"?

Overall Impression
Comment(s): NA
1. What did you like best about this system? - Cost calculation is beneficial.
2. What did you like least about this system? - NA
3. If you could make changes to this system, what change would you make? - NA
4. Do you have any questions or comments about the system or your experiences with it? - It could be a good tool.
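Two recurring feedback items above are that the Browse/Bootstrap buttons stay active during an import and that the progress bar never updates. The actual application is desktop Java, so this framework-agnostic Python sketch only illustrates the pattern being asked for: lock the controls while the import runs, report progress as it advances, and always re-enable the controls afterwards. All class and attribute names are illustrative.

```python
class ImportController:
    """Hypothetical controller modelling the requested import behavior."""

    def __init__(self):
        self.buttons_enabled = True  # Browse/Bootstrap controls
        self.progress = 0            # percent, 0..100
        self.status = ""

    def run_import(self, files):
        self.buttons_enabled = False        # lock controls during import
        try:
            for i, name in enumerate(files, start=1):
                self.status = f"Importing {name}..."
                self.progress = round(100 * i / len(files))
        finally:
            self.buttons_enabled = True     # re-enable even if import fails
            self.status = "Import complete"

ctrl = ImportController()
ctrl.run_import(["departures.xls", "arrivals.xls", "staff.xls"])
```

The `try/finally` is the key point: the fix for "disable all browse and bootstrap buttons while importing" is only safe if the controls are re-enabled on every exit path, including the null-pointer-style failures reported during testing.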
Reporting Conclusions
- The client was satisfied with the system; however, the presentation and navigation of the application can be further improved.
- The clients, especially the second participant, gained a clearer understanding of what the application delivers after testing it.
- There are critical errors in the logic/formulas for staff utilization rate and staff working hours that should be corrected to improve the accuracy of the calculations.
- Non-critical errors will also be resolved.