
IS480 Team wiki: 2012T2 Team Prime MidTerm Wiki


<< MAIN WIKI

PrimeLogo.png


Project Progress Summary

Project Overview

Swimix | Presentation Slides

  1. The team is currently at Sprint 5 (31 Jan 2013 to 26 Feb 2013). View our sprint progress.
  2. Our first user test was conducted in Sprint 4. View our user test results.
  3. Our second user test will be conducted in Sprint 6 (tentatively 17 Mar 2013).
  4. View our Interview with Tao Li, a national swimmer who is endorsing our platform.

The team is confident of completing the project within the schedule as shown below.

Prime ProjectMilestones WeAreHere V0.1.png

Project Highlights


Highlight 1: Confirmation of endorsement from Tao Li who has represented Singapore in many international swimming competitions including the Beijing and London Olympics.
Highlight 2: Manage Reviews and Manage Users’ Reviews were replaced with Manage Evaluations.
Highlight 3: In Sprint 3, there was a spike in the number of bugs found because more rigorous testing was conducted on the application to prepare for User Test 1 in Sprint 4.

Project Management

Schedule (Planned Vs Actual)

Prime Schedule BeforeAfter.png

Note: No significant changes have been made to the schedule with regard to the proposal and acceptance milestones.
However, significant changes have been made to the project schedule in the subsequent four milestones as shown below.

Changes Made

  1. New user stories were added to the Manage Search feature in the form of Manage Search (Instructor).
  2. Manage Users' Reviews and Manage Reviews features were replaced with Manage Evaluations.
  3. Manage Class Registration feature was shifted to Midterm.
  4. Manage Instructor Schedule and Manage Instructor Account web mobile features were shifted to User Test 2.
  5. Manage Notifications, Manage Instructors, Manage Users, Manage Swim Schools, Manage Recruitment, Manage Data Analytics and Manage Enquiry features were shifted to Finals.
  6. Manage Students' Progress and Manage Advertisements features were removed from the IS480 scope and shifted to the Good-to-Have category.


Refer to the PROJECT TIMELINE for a full view of the current project schedule.


Scope (Planned Vs Actual)

Prime Scope BeforeAfter.png

Version 1 (Original)

  • Features are categorized into priority circles:
  • Core, Secondary and Tertiary Features: features which will be developed by Prime during IS480
  • Good-to-Have Features: lower-priority features which can be implemented in the future, beyond this project

Version 2

  • Manage Students’ Progress was re-prioritised as a Tertiary feature since it is not the main focus of the IS480 project.
  • It was further re-prioritised as a Good-to-Have feature to keep the project scope manageable.

Version 3 (Latest)

  • Manage Reviews and Manage Users’ Reviews were replaced with Manage Evaluations.

Project Metrics

Schedule Metric

The team has completed 5 sprints thus far, as shown in the burn down charts below.

BURNDOWN CHARTS
Sprints1to4.png

Note: Burndown chart for Sprint 5 to be uploaded.

SCHEDULE RATIO CHARTS
ScheduleRatios Sprints1234.png

Note: Schedule ratio chart for Sprint 5 to be uploaded.

KEY ISSUES

Sprint 3:

  1. During December, the team met up every Monday to Thursday to complete the assigned tasks. On 21 Dec, the calculated schedule ratio indicated that the team was behind schedule. We found that the Manage Search (Instructor) feature needed more time to complete. To get back on schedule, the team decided to shift the Manage Reviews feature to Sprint 4.
  2. On 27 Dec, the logic for the Manage Search (Instructor) feature was working, but the feature had many alignment issues. The team also realised that we were behind schedule. To get back on track, we shifted the Update feature for Manage Class Registration to Sprint 4.

Sprint 4:

  1. On 10 Jan, the team met Dr. Miles Gilman (Entrepreneur-in-Residence, SMU Institute of Innovation and Entrepreneurship) for advice on improving our business plan for the ACE Startup Grant application.
  2. On 11 Jan, we consolidated all the changes to be made and realized that the list was substantial. The burndown chart also indicated that we were behind schedule, so for the next 4 days we dedicated much time and effort to getting back on track. We decided to focus on the ACE submission, which was due in two weeks, and therefore shifted two features (Manage Recruitment and Manage Swim Schools) to future sprints.
  3. Progress on the remaining features was slow, and on 17 Jan the burndown chart indicated that we were again running behind schedule. The team decided to shift another two features (Manage Instructors and Manage Users) to future sprints.
  4. On 19 Jan, the burndown chart once again indicated that we were running behind schedule. The team decided to shift the Manage Notifications feature to future sprints.
  5. On 23 Jan, the team met with our supervisor, Prof Lin Mei. After taking her feedback into consideration, we decided to replace the Manage Reviews and Manage User Reviews features with Manage Evaluations, to be developed in a later sprint.

Sprint 5:

  1. From User Test 1, the team collected a great deal of valuable feedback from the participants and wanted to implement as much of it as possible, so we focused on this at the start of the sprint.
  2. On 11 Feb, the team finalized the storyboarding based on the user test feedback and realized that we needed more time to make the changes. From that point, the team started moving away from the ideal progress line.
  3. On 17 Feb, the burndown chart indicated that the team was behind schedule. The team held an urgent meeting on 18 Feb to discuss our options. The outcome was to prioritize the changes arising from the user test feedback and to temper our expectations for the remaining features due before Midterm.


For more details:
1. Schedule Metric Calculation
2. Schedule Ratio Documentation: Sprint 1 Sprint 2 Sprint 3 Sprint 4 Sprint 5
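
As an illustration of how the schedule ratio drives the decisions described above, here is a minimal Python sketch. It assumes the ratio is computed as estimated days divided by actual days spent, with values below 1.0 flagging a behind-schedule sprint; the team's exact formula is in the Schedule Metric Calculation document linked above.

    # Hypothetical schedule-ratio check; the formula is an assumption,
    # not the team's documented calculation.
    def schedule_ratio(estimated_days: float, actual_days: float) -> float:
        """Ratio of planned effort to actual effort for completed tasks."""
        return estimated_days / actual_days

    def schedule_status(ratio: float) -> str:
        """Map a schedule ratio to the team's response."""
        if ratio < 1.0:
            return "Behind schedule: consider shifting features to a later sprint"
        return "On or ahead of schedule"

    # Example: tasks estimated at 10 days actually took 12.5 days.
    ratio = schedule_ratio(estimated_days=10, actual_days=12.5)
    print(f"ratio = {ratio:.2f} -> {schedule_status(ratio)}")  # ratio = 0.80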

Bug Metric

Prime NoOfBugsFound V2.png

Number of Bugs Found

  1. The chart above shows the number of bugs found in each sprint.
  2. In Sprint 3, there was a spike in the number of bugs found because more rigorous testing was conducted on the application to prepare for User Test 1 in Sprint 4.


Prime BugMetricSeverityChart V2.png

Total Bug Score

  1. The chart above shows the bug severity score corresponding to the number of bugs found in each sprint.
  2. The bug severity score in Sprint 3 was the highest of all sprints due to the more rigorous testing before User Test 1, as mentioned above.
Total Bug Score | Action to be Taken
< 5             | Developers resolve issues within the sprint.
5 - 9           | Resolve the bugs during the planned debugging time.
≥ 10            | Stop current development and resolve the bugs immediately.
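
As a rough illustration, the sketch below shows how a total bug score could be computed and mapped to the actions in the table. The severity weights are assumptions made for the sketch; the team's actual weights are in the Bug Metric Calculation document linked below.

    # Assumed severity weights for illustration only (not the documented values).
    SEVERITY_WEIGHTS = {"low": 1, "medium": 3, "high": 5}

    def total_bug_score(severities: list[str]) -> int:
        """Sum the severity weights of all bugs found in a sprint."""
        return sum(SEVERITY_WEIGHTS[s] for s in severities)

    def action_for(score: int) -> str:
        """Map a total bug score to the action in the table above."""
        if score < 5:
            return "Developers resolve issues within the sprint."
        if score <= 9:
            return "Resolve the bugs during the planned debugging time."
        return "Stop current development and resolve the bugs immediately."

    # Example: two low-severity bugs and one high-severity bug.
    score = total_bug_score(["low", "low", "high"])
    print(score, "->", action_for(score))  # 7 -> planned debugging time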


For more details:
1. Bug Metric Calculation
2. Bug Log

Project Risks

The top 3 risks are prioritised as follows:

Risk | Probability | Impact | Mitigation
Many issues might be raised during usability tests, and time is required to rectify them | High | Medium | A Response Plan document is created to help us decide whether to implement a change based on priority, complexity and the time needed to rectify the issue.
The project contains a large amount of documentation in different versions, making access to a particular document inefficient | Low | Medium | Use collaborative file management software (e.g. Google Documents, Dropbox) to organise the project folders, with consensus amongst team members to adhere to proper version labelling.
Extra workload outside the project scope (e.g. ACE submission, interview with Tao Li) | High | Medium | Prioritise the most important task at that point in time and put in extra working time if necessary.

View the full list of risks here.

Technical Complexity


Quality of Product

Intermediate Deliverables

Stage | Specification
Project Management | Minutes, Metrics
Requirements | Product Backlog, Metrics
Analysis | Use Case Diagram, Process Flow Diagram
Design | ER Diagram, System Architecture Diagram
Testing | User Test Plan

Deployment


Area | Description
Development Environment | Web application running on Google Chrome and Safari, on both desktop and smartphone
Database | Database hosted on Vodien
Web Links | View the Swimix application

Testing

Objectives


  1. To obtain feedback from our users regarding the features in the application
  2. To improve the usability of the application


Scope

The table below shows a list of features that were tested for our first user test. The features target parents (representing registered users of Swimix) and swimming instructors.

No. | Feature | Reg. User | Instructor
1 | Register / Log in / Log out | |
2 | Change Password | |
3 | Update User Profile | |
4 | Update Instructor Profile | |
5 | Search for Class | |
6 | Search for Instructor | |
7 | Create and Remove Lesson Slot | |
8 | Create and Remove Student Details | |


THE SESSION
User Test 1 was conducted successfully on 27 Jan 2013 at Yishun Swimming Complex.

  1. A total of 8 parents participated in the user test in the role of a Registered User.
  2. A total of 2 swimming instructors participated in the user test in the role of an Instructor.


View the Supporting Documents for the user test HERE.


Testing Methodology

Collection of Qualitative Metrics

  1. Participants are encouraged to think aloud as they perform each task. For example, we encourage them to say whatever they are looking at, thinking, doing and feeling as they go about the task. This enables us, as observers, to better understand each participant's thought process and to see first-hand how he or she completes the task.
  2. Facilitators will observe for usability issues during the testing procedure by recording what participants say. The test sessions will also be video-recorded with ScreenFlow so that we can go back and review what participants did and how they reacted.
  3. In addition, after participants have completed the user test, they will be asked to complete a satisfaction survey that aids the collection of qualitative metrics.

Collection of Quantitative Metrics

  1. The amount of time taken to complete each task
  2. The number of clicks taken for each task
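
As a sketch of how these two metrics could be summarised per task, the snippet below averages the time taken and the clicks across participants. The task names and figures are illustrative, not actual User Test 1 data.

    from statistics import mean

    # (task, seconds_taken, clicks) recorded per participant; values are made up.
    observations = [
        ("Search for Class", 42.0, 5),
        ("Search for Class", 58.0, 7),
        ("Change Password", 31.0, 4),
    ]

    # Average the two quantitative metrics for each distinct task.
    for task in sorted({t for t, _, _ in observations}):
        times = [s for t, s, _ in observations if t == task]
        clicks = [c for t, _, c in observations if t == task]
        print(f"{task}: avg {mean(times):.0f}s, avg {mean(clicks):.1f} clicks")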


Registered Users

MOST COMMON FEEDBACK:

UserFeedback1.png

Solution: Place Login and Register in the same area and allow switching by tabs.

UserFeedback2.png

Solution: Change the View link to a button so users know that they can click on it to view the instructor’s profile.



POST-TEST SURVEY RESULTS:


SN | Function | Very Unlikely | Unlikely | Undecided | Likely | Very Likely
1 | Search for Class/Instructor | 0 | 0 | 0 | 5 | 3
2 | Online Class Registration | 0 | 1 | 2 | 3 | 2
3 | Online Payment | 0 | 1 | 0 | 4 | 3
4 | Instructor Review | 0 | 2 | 1 | 4 | 1
5 | Instructor Rating | 0 | 0 | 2 | 5 | 1
6 | Receive Notification | 0 | 0 | 1 | 3 | 4


Conclusion

Based on the user feedback, we found out that users are mostly receptive to the idea of a swim-related search portal.

The top 2 favourite functions identified by the users were Manage Search and Manage Notifications. They commented that the search function was easy to use and could be very useful to them. They also felt that the notification function is lacking in the industry today, having made wasted trips to the swimming complex on lesson days only to find that the lesson had been cancelled.

The function that the largest number of users indicated they were unlikely to use was Instructor Review. They explained that they would not want to go through the trouble of registering an account just to write a review for an instructor. However, they would not mind writing one if given a simpler and more convenient alternative.

A user also commented that he preferred registering for classes with the instructor in person rather than online, because meeting in person let him gauge the instructor’s character and personality. A possible solution is to include a short introduction video clip of each instructor so that users can gauge the instructor for themselves through the video.

In conclusion, many users expressed that the portal is user-friendly and would be very useful to them.


Instructors

MOST COMMON FEEDBACK:
InstructorFeedback1.png

Solution:
Use radio buttons instead of dropdown list.

InstructorFeedback2.png

Solution:
Display only the student name and contact number.
Instructors can choose to click on the student's name to view the rest of their information.


POST-TEST SURVEY RESULTS:


SN | Function | Very Unlikely | Unlikely | Undecided | Likely | Very Likely
1 | Adding lesson slots to calendar | 0 | 0 | 1 | 1 | 0
2 | Selling available class slots | 0 | 0 | 0 | 1 | 1
3 | Sending mass notification | 0 | 0 | 0 | 0 | 2
4 | Online payment system | 0 | 0 | 2 | 0 | 0


Conclusion

It was hard for us to get instructors to test our system because they are usually busy the whole time they are at the pool: they are either teaching a class or on duty as security guards. Thus, we only managed to get 2 instructor testers, during their lunch break when it was too hot to conduct a swim class. Both instructors found the portal user-friendly and did not face any problems using it. They gave us mostly aesthetic feedback on the portal, e.g. the student list was too cluttered and the login button too small. One important point they made was that they are always on the go, so they would prefer to use the system on their smartphones rather than in front of a computer.


Reflections

Member Reflections
Prime XC.png
Shen Xiaochuan
Prime HQ.png
Lim Hui Qing
Prime JO.png
Josephine Heng
Prime LR.png
Larry Ho