Difference between revisions of "IS480 Team wiki: 2012T1 Bumblebee Final Wiki"
Revision as of 18:16, 26 November 2012
Contents
- 1 Midterm Presentation Slides
- 2 Project Overview
- 3 Project Scope
- 4 User Testing
- 4.1 Objectives
- 4.2 Methodology
- 4.3 Usability Metrics
- 4.4 Reporting Results
- 4.5 Subjective Evaluations
- 4.5.1 Navigation Impression
- 4.5.2 Look and Feel
- 4.5.3 Functions
- 4.5.4 Bootstrap/import file(s)
- 4.5.5 Add staff costs [Manage Simulation Parameters]
- 4.5.6 Add uncertainties [Manage Sim. Parameters]
- 4.5.7 Run Simulation
- 4.5.8 View staff schedule [in Gantt Chart]
- 4.5.9 Add airline requirements
- 4.5.10 Generate Result
- 4.5.11 Overall Impression
- 4.6 Reporting Conclusions
- 5 Project Management
Midterm Presentation Slides
Slides
Click here to download our midterm presentation slides.
Project Overview
Stakeholders
The Bees
Sponsors
Project Description
Singapore Airport Terminal Services (SATS) is the region's leading provider of gateway services and food solutions. It has a staff of 800 handling more than 35 airlines. Every year there are two major airline flight schedule changes: summer (April) and winter (November). Each change can be drastic and has a great impact on the staff roster. One of SATS' goals for its staff roster is to meet all of its airlines' requirements. However, there are many uncertainties, such as staff members falling sick, staff resigning and flight delays, that can render the planned roster ineffective. The Duty Manager at SATS therefore often has to make last-minute changes to the rosters and incur costs such as Staff Recall Cost, Over-Time (OT) Cost and Meal Allowance Compensation (MAC).

To help evaluate the various costs and consequences of a planned staff roster, our project aims to create a Staff Deployment Simulation Software (SDSS) which will first deploy staff based on the flight schedule, flight requirements, staff records and staff roster, and then simulate the roster plan by taking into account the various uncertainties and forecasting the costs that management would have to incur. Understanding the cost and consequences of a given roster, SATS can then make the necessary adjustments to avoid high expenses to the company. The diagram below explains the flow of the software. To predict the uncertainties, we take in the mean and standard deviation of the listed simulation parameters in the diagram above (MC Rate, Flight Delay Rate, Staff Resign Rate, etc.). With those values, our software generates a normal distribution for each uncertainty. We then apply the uncertainties to the planned roster, for example by removing a staff member from the roster plan if he or she is taking MC.

The software will also try to mimic the actions of the Duty Manager, who makes the necessary adjustments to the staff roster to fulfil the airlines' requirements, i.e. it imitates the human decision-making process to arrive at an optimal staff roster. At the end of the day, our software generates a management report which reveals the cost and consequences of implementing a given roster. The outputs are Total Staff Working Hours, Flight Demand Coverage, Meal Allowance Compensation (MAC), Over-Time (OT) Cost, OT Hours, Staff Utilization Rate, Unproductive Hours, Recall Cost and Recall Hours. Understanding the cost and consequences of a given roster, SATS can then make the necessary adjustments to avoid high expenses and thus increase cost savings to the company.
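The uncertainty step described above can be sketched as follows. This is an illustrative sketch only: the parameter values, roster format and function names are assumptions for this page, not the actual SDSS implementation.

```python
import random

# Hypothetical simulation parameters: mean and standard deviation of each
# uncertainty, expressed here as daily probabilities (illustrative values).
PARAMS = {
    "mc_rate":     (0.03, 0.01),    # probability a staff member takes MC
    "resign_rate": (0.001, 0.0005)  # probability a staff member resigns
}

def sample_rate(name):
    """Draw one day's rate from the normal distribution, clamped to [0, 1]."""
    mean, std = PARAMS[name]
    return min(max(random.gauss(mean, std), 0.0), 1.0)

def apply_uncertainties(roster, day, rng=random):
    """Remove staff who are simulated as unavailable on the given day.

    roster: dict mapping staff id -> list of assigned days (hypothetical format).
    Returns the set of staff removed for that day.
    """
    removed = set()
    mc_rate = sample_rate("mc_rate")
    for staff_id in list(roster):
        if day in roster[staff_id] and rng.random() < mc_rate:
            roster[staff_id].remove(day)
            removed.add(staff_id)
    return removed

random.seed(42)
roster = {"S1": [1, 2, 3], "S2": [1, 3], "S3": [2]}
removed = apply_uncertainties(roster, day=1)
print(sorted(removed))
```

A full simulation would repeat this for every uncertainty and every day of the simulation period, then re-run the scheduler on the perturbed roster to measure the resulting costs.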
X Factor
Efficiency
Our system is able to handle a highly complex scheduling problem with a huge domain size efficiently, in comparison to SATS' current manual scheduling process.

Algorithmic Complexity
- Optimal Scheduling: We provide our client an optimised staff schedule to effectively reduce the client's overheads from non-optimal scheduling practices. In essence, we developed an optimal scheduling algorithm, written based on greedy-algorithm concepts, to maximise SATS' resources to meet all airline requirements.
- Probabilistic Simulation: Our system gives our client a clear perspective of the cost-loss profile arising from human-resource-planning uncertainties (flight delays, staff resignations, etc.), and we provide recommendations for minimising the additional cost effectively. Specifically, we use the normal distribution and the Kolmogorov-Smirnov test to forecast the uncertainties SATS could face, after which we run a simulation to plot our client's cost profile.

Aviation Industry Knowledge
Our system complies with numerous aviation-specific business rules. To develop this system, our group had to understand aviation industry practices deeply. For this, we did a two-month weekly attachment with SATS' Duty Manager (DM) and Customer Service Officer (CSO).
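The greedy scheduling idea mentioned above can be sketched as below. The shift model, data format and tie-breaking rule are hypothetical illustrations for this page, not the team's actual algorithm.

```python
# Illustrative greedy assignment: for each flight requirement (sorted by
# start time), assign the available staff member with the fewest hours
# worked so far. This mirrors "greedy algorithm concepts" in spirit only;
# the data model here is invented.
def greedy_assign(requirements, staff_hours):
    """requirements: list of (flight, start_hour, duration).
    staff_hours: dict staff_id -> hours already worked.
    Returns a list of (flight, staff_id) assignments."""
    assignments = []
    busy_until = {s: 0 for s in staff_hours}
    for flight, start, duration in sorted(requirements, key=lambda r: r[1]):
        # Pick the least-loaded staff member who is free at `start`.
        free = [s for s in staff_hours if busy_until[s] <= start]
        if not free:
            continue  # requirement unmet; a real system would flag this
        chosen = min(free, key=lambda s: staff_hours[s])
        staff_hours[chosen] += duration
        busy_until[chosen] = start + duration
        assignments.append((flight, chosen))
    return assignments

reqs = [("SQ318", 8, 3), ("QF1", 9, 2), ("BA12", 12, 4)]
hours = {"CSO1": 0, "CSO2": 5}
plan = greedy_assign(reqs, hours)
print(plan)  # [('SQ318', 'CSO1'), ('QF1', 'CSO2'), ('BA12', 'CSO1')]
```

A greedy pass like this is fast but not guaranteed optimal, which is one reason the simulation step matters: it exposes the cost profile of the schedule actually produced.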
Motivation
There are two main motivations for working on this project:
Project Scope
Use Case
Deliverables and Scope
Please refer to the use case description for greater detail.
Graphical User Interface
User Testing
Objectives
User Testing 1 (17 September 2012)
The goals and objectives of usability testing were to:
- Record and document general feedback and first impressions.
- Identify any potential concerns to address regarding application usability, presentation, and navigation.
- Gather feedback on the usefulness and accuracy of the functions developed.
- Match client expectations of the system developed.
Methodology
User Testing Environment
- Computer platform: Intel Pentium processor
- Screen resolution: 1024 x 768
- Operating system: Windows XP
- Set-up required: computer date format (English (Australia)) of d/MM/YYYY

Participants
The participants attempted to complete a set of scenarios presented to them and provided feedback regarding the usability and acceptability of the application.
1. Kevin Choy, SATS Airline Relations Manager (person-in-charge for this project)
2. Goh Wei Xuan, SATS Airline Relations Manager

Procedure

Instructions
These instructions were given to our clients:
1. Each user will be accompanied by one facilitator.
2. Users are encouraged to verbalise their movements, purpose, and problems.
3. Facilitators will record mistakes made and questions asked by users during testing.
4. To start the test, click on the file named "START.bat" found in the folder named "SATS_Bumblebee_Beta_v5".
5. All sample files needed for testing are found in: SATS_Bumblebee_Beta_v5/data
6. The database used to store imported data is also found in the root folder.
7. Users are allowed to change their input(s) to verify data validity.
8. Users are to complete the tasks stated below. After completing each task, users answer the test questions pertaining to that task.

Tasks
These are the task descriptions given to clients:
1. Bootstrap/import file(s): import data from Excel files such as Flight Schedule Departure, Flight Schedule Arrival, Staff Records, etc. into the application. The application will use these data for simulation in a later step.
2. Add staff costs: record the various costs of hiring staff into the application.
3. Add uncertainties: record the mean and standard deviation of the different uncertainties that will affect the initial schedule prepared by the application. Simulation period = 7 days (the number of days of data to be generated).
4. Run simulation: run the simulation to start assigning staff to different job assignments.
5. View staff schedule (in Gantt chart): view and compare a staff member's planned and actual working time.
6. Add airline requirements: airlines have several different requirements on the number of CSA and CSO needed. Record the individual requirements into the database; the input data will be used for simulation in a later step.
7. Generate result: view the result generated in PDF format.

Team Roles
Overall in-charge (Yosin Anggusti)
- Provides a training overview prior to usability testing
- Defines usability and the purpose of usability testing to participants
Facilitators (Glorya Marie, Suriyanti)
- Evaluate the application and the user's interaction with the application, rather than evaluating the user
- Observe and record user behaviour and user comments
- Respond to participants' requests for assistance
Test Observer (Yosin Anggusti)
- Silent observer
- Assists the data logger in identifying problems, concerns, coding bugs, and procedural errors
Usability Metrics
Critical Errors
Critical errors are deviations of the recorded result from the expected result; these errors cause the task to fail. Facilitators are to record these critical errors.

Non-Critical Errors
Non-critical errors are usually procedural: the participant completes the task, but not by the most optimal means (e.g. excessive steps, initially selecting the wrong function, attempting to edit an un-editable field). These errors may not be detected by the users themselves, so facilitators have to record them independently.

Scenario Completion Time
The time to complete each scenario, not including subjective evaluation durations, is recorded.
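A minimal sketch of how the three metrics above could be tallied per task. The observation log and figures here are hypothetical, not the team's actual test data.

```python
from statistics import mean

# Hypothetical observation log: per participant, per task, the facilitator
# records critical errors, non-critical errors, and completion time (seconds).
observations = [
    {"task": "Run Simulation", "critical": 1, "non_critical": 2, "seconds": 210},
    {"task": "Run Simulation", "critical": 0, "non_critical": 1, "seconds": 180},
]

def summarise(task, logs):
    """Aggregate one task: a run with zero critical errors counts as completed."""
    rows = [o for o in logs if o["task"] == task]
    return {
        "completion_rate": sum(o["critical"] == 0 for o in rows) / len(rows),
        "critical_errors": sum(o["critical"] for o in rows),
        "non_critical_errors": sum(o["non_critical"] for o in rows),
        "mean_seconds": mean(o["seconds"] for o in rows),
    }

summary = summarise("Run Simulation", observations)
print(summary)
```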
Reporting Results
Task 1: Bootstrap/ Import File(s)
Task 2: Add Staff Costs
Task 3: Add Uncertainties
Task 4: Run Simulation
Task 5: View staff schedule (in Gantt chart)
Task 6: Add airline requirements
Task 7: Generate result
Subjective Evaluations
Subjective evaluations regarding ease of use and satisfaction were collected via questionnaires. There were two participants, so results from both are combined or averaged where necessary, with each participant's answers carrying a 50% weight. Not all sections were answered by both participants, so not every question carries the full 100% weight.
Navigation Impression
Comment(s):
- 'Back' button at simulation is missing
- Needs a clearer flow; not sure which button to select
- Need more instructions for first-time users

Look and Feel
Comment(s): NA

Functions
Comment(s):
- Descriptions must be more explicit

Bootstrap/import file(s)
Comment(s):
- "Bootstrap" should be changed to "import"
- Disable the 'Browse' button while bootstrapping

Add staff costs [Manage Simulation Parameters]
Comment(s): NA

Add uncertainties [Manage Sim. Parameters]
Comment(s):
- Change the unit of measurement to "hrs + mins"
- Allow shortcut keys (e.g. [Alt + S] to start simulation)

Run Simulation
Comment(s):
- Inconsistent textbox format
- Progress bar is not showing
- Exception handling: encountered a null-pointer exception

View staff schedule [in Gantt Chart]
Comment(s):
- Staff schedule is incorrect
- Please add 'flight number' to the Gantt chart

Add airline requirements
Comment(s):
- Success message not displayed properly
- Could provide a guideline

Generate Result
Comment(s):
- Cannot delete PDF record
- How does this differ from "Run Simulation"?

Overall Impression
Comment(s): NA
1. "What did you like best about this system?" - Cost calculation is beneficial.
2. "What did you like least about this system?" - NA
3. "If you could make changes to this system, what change would you make?" - NA
4. "Do you have any questions or comments about the system or your experiences with it?" - It could be a good tool.
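The 50/50 weighting and partial-weight behaviour described at the start of this section can be sketched as follows. The question names and scores are hypothetical, not the actual questionnaire results.

```python
# Each participant answers on a 1-5 scale; unanswered questions are None.
# A question's combined score averages only the answers actually given,
# which is why not every question carries the full 100% weight.
answers = {
    "ease_of_navigation": [4, 2],
    "look_and_feel":      [5, None],  # only one participant answered
}

def combined(scores):
    """Return (average of given answers, fraction of total weight answered)."""
    given = [s for s in scores if s is not None]
    weight = len(given) / len(scores)
    return (sum(given) / len(given), weight)

print(combined(answers["ease_of_navigation"]))  # (3.0, 1.0)
print(combined(answers["look_and_feel"]))       # (5.0, 0.5)
```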
Reporting Conclusions
- The client was satisfied with the system; however, the presentation and navigation of the application can still be improved.
- The clients, especially the second participant, gained a better and clearer understanding of what the application delivers after testing it.
- There are critical errors in the logic/formulas for staff utilization rate and staff working hours that can be corrected to increase the accuracy of the calculations.
- Non-critical errors will also be resolved.
Project Management
Time Management
Milestones
Iterations
Parallel Plan
Schedule Metric
Schedule Tracking
Click here to see our Schedule Tracking.
Quality Management
Bug Metric
Bug Tracking
Click here to see our Bug Log.
Risk Management
Risk Metric
Risk Table