2011T2 Bazinga Final Wiki2
Contents
- 1 Project Progress Summary
- 2 Project Management
- 3 User Testing 2
- 4 Beta Testing
- 5 Reflections
Project Progress Summary
Summary Overview
Since 23 February, we have gone through 7 iterations, with the assigned functionality completed on time in every iteration. During these 1.5 months, our team accomplished the following:
- Designed and planned the blueprint for our Kinect application
- Delivered a working Kinect application, growing from 5 initial functionalities to 7
- Conducted User Testing 2 with a total of 35 testers, which allowed us to gather very useful, high-quality feedback
Project Highlights
While the team put in much effort in planning the execution of the project, there were some unexpected events that forced the team to take corrective measures, as listed below:
1. Improving overall User Experience

Through our survey results and interaction with testers from our first User Test, we realised that most of them had no prior experience using the Kinect. As a result, it was difficult for users to navigate our Kinect application easily. We therefore placed a high priority on improving the overall user experience, with good tutorial videos and an improved user interface to guide them through the application. Over the past two months, our group went through several iterations to create better tutorial videos for the different features. We studied the best practices of other Kinect applications, such as Kinect Sports Season One, to understand how their tutorial videos work. In the two weeks before User Test 2, we kept reviewing the videos internally to ensure the tutorials were as easy as possible to comprehend and follow. Even during the second User Test itself, we continued improving the videos based on feedback from our testers.
2. Beta Testing

Beta Testing was the first time our team collaborated with a real client. It was a great experience to work with the cafe Glassroom, located at the SMU School of Information Systems. We created an application that allows their customers to browse images and select the one they prefer for a free virtual photo shoot. We started by showing Glassroom the capabilities and features of our application and gathered feedback on what interested them most. They showed a strong interest in the Augmented Reality function, so we decided to incorporate both Augmented Reality and image browsing into their application, working from the menu and marketing collaterals they gave us. Many customers came up with their friends and tried out the different photos available.

After User Test 2, we immediately started on Beta Testing and completed the application over a period of 5 days. However, there were some unforeseen delays, as we required more time than expected to finish the augmented reality feature, and we had to push back the application launch at Glassroom by 2 days. We informed them of the issue early, and they had no issues with the delay.
3. User Testing 2

Despite our busy and hectic schedule, we managed to get the functions, video tutorials, test instructions, procedures and survey questions ready for User Test 2. A total of 35 testers each spent 15 to 25 minutes with the application. Unlike our first User Test, we did not offer any help: each tester was given a 'Quest', a set of tasks to complete based on the instructions we provided. If testers needed help, they asked the Kinect itself, via either a voice command or the help gesture. The application has a total of xx tutorials, covering the different functions.
Project Management
Project Schedule
Project Status
Task/function/features | Status | Complexity Level (0-10) | Developer(s) |
---|---|---|---|
Login with Facial Recognition using Face.com | Function to be kept in view - fully developed but removed for the time being | 9 | Kiwie |
Kinect with SAP integration: Pick Items | Broadened scope instead of limiting to SAP - removed function for the time being | 6 | Kiwie |
Kinect: Apply WPF Skin/Theme | Complete with constant updates in each iteration, UAT1 done | 5 | Carmen & Joy |
Kinect: Listbox scrolling with Kinect | Fully deployed and tested 100%, UAT1 done | 7 | Kiwie |
Kinect Gestures: Recognize Left and Right Swipes | Fully deployed and tested 100%, UAT1 done | 9 | Kiwie |
Kinect: Implement ElementFlow | Fully deployed and tested 100%, UAT1 done | 8 | Kiwie |
Kinect Gestures: Single Hand Scroll for Element Flow | Fully deployed and tested 100%, UAT1 done | 7 | Kiwie |
Application Paging (get Root Page, page and MenuOptions) | Fully deployed and tested 100%, UAT1 done | Kinect: 3, Backend: 7 | Kiwie and Alex |
Kinect Gestures: Recognize Up Swipes | Fully deployed and tested 100%, UAT1 done | 5 | Kiwie |
Kinect: KinectListMenu (Template and Display Selected MenuOption Preview) | Fully deployed and tested 100%, UAT1 done | 5 | Kiwie |
Kinect: HalfCircleMenu (Template and Databind Menu Options to HalfCirclePanel) | Fully deployed and tested 100%, UAT1 done | 6 | Kiwie |
Kinect Gestures: Vertical Scrolling for HalfCirclePanel | Fully deployed and tested 100%, UAT1 done | 8 | Kiwie |
Kinect: ElementFlow Product Browsing | Fully deployed and tested 100%, UAT1 done | 8 | Kiwie |
Kinect Gestures: 3D Two-Handed Scroll for Element Flow | Fully deployed and tested 100%, UAT1 done | 9 | Kiwie |
Kinect: ElementFlowMenu (Template and Databind MenuOptions to ElementFlow) | Fully deployed and tested 100%, UAT1 done | 7 | Kiwie |
View and Display Elements (tabs, prices, colors and features) | Fully deployed and tested 100%, UAT1 done | Kinect: 3, Backend: 7 | Kiwie and Alex |
View and Display Retailer-Generated Content | Fully deployed and tested 100%, UAT1 done | Kinect: 3, Backend: 6 | Kiwie and Alex |
Web app: Manage catalog | 70% completed | 9 | Alex |
Kinect: Instructional Video | 90% completed | 8 | Bevan |
Project Schedule (Plan vs Actual)
Comparing our project plan at acceptance with the actual work done, our team feels that almost all tasks/functions were completed as planned. We made some minor adjustments to the project schedule to accommodate an additional functionality, added to justify the benefits of a Kinect app over a touchscreen app.
Iteration | Planned Task | Start | End | Actual Task | Start | End | Comment |
---|---|---|---|---|---|---|---|
1 | Create WPF App Project for Kinect Front-end | 23/12/11 | 24/12/11 | Create WPF App Project for Kinect Front-end | 23/12/11 | 24/12/11 | |
Login with Facial Recognition | 24/12/11 | 27/12/11 | Login with Facial Recognition | 24/12/11 | 27/12/11 | ||
Main Menu Navigation (Listbox) | 28/12/11 | 01/01/12 | Main Menu Navigation (Listbox) | 28/12/11 | 01/01/12 | ||
Pick Items (with SAP) | 01/01/12 | 05/01/12 | Pick Items (with SAP) | 01/01/12 | 04/01/12 | ||
Create ASP.NET Web App for back-end | 05/01/12 | 06/01/12 | Create ASP.NET Web App for back-end | 05/01/12 | 06/01/12 | ||
2 | Review Project Scope and Storyboards | 10/01/12 | 19/01/12 | Review Project Scope and Storyboards | 10/01/12 | 21/01/12 | Went through more revisions than expected |
Kinect Gestures: Recognize Left/Right Swipes | 12/01/12 | 13/01/12 | Kinect Gestures: Recognize Left/Right Swipes | 12/01/12 | 13/01/12 | ||
Kinect: ListBox Scrolling by Kinect | 13/01/12 | 14/01/12 | Kinect: ListBox Scrolling by Kinect | 13/01/12 | 14/01/12 | ||
Kinect: Implement ElementFlow | 14/01/12 | 16/01/12 | Kinect: Implement ElementFlow | 14/01/12 | 16/01/12 | ||
Kinect Gestures: Single Hand Scroll for ElementFlow | 16/01/12 | 17/01/12 | Kinect Gestures: Single Hand Scroll for ElementFlow | 16/01/12 | 17/01/12 | ||
3 | Application Paging | 19/01/12 | 24/01/12 | Application Paging | 21/01/12 | 28/01/12 | Member's laptop malfunctioned |
Kinect: View KinectListMenu | 19/01/12 | 20/01/12 | Kinect: View KinectListMenu | 19/01/12 | 20/01/12 | ||
Kinect Gestures: Recognize Up Swipes | 20/01/12 | 21/01/12 | Kinect Gestures: Recognize Up Swipes | 20/01/12 | 21/01/12 | ||
Kinect: Create HalfCircleMenu template | 20/01/12 | 21/01/12 | Kinect: Create HalfCircleMenu template | 20/01/12 | 21/01/12 | ||
Kinect Gestures: Vertical Scrolling for HalfCirclePanel | 20/01/12 | 21/01/12 | Kinect Gestures: Vertical Scrolling for HalfCirclePanel | 20/01/12 | 21/01/12 | ||
Kinect: Create ElementFlow Product Browsing template | 20/01/12 | 25/01/12 | Kinect: Create ElementFlow Product Browsing template | 21/01/12 | 25/01/12 | ||
4 | Kinect Gestures: 3D Two-Handed Scroll for Element Flow | 27/01/12 | 28/01/12 | Kinect Gestures: 3D Two-Handed Scroll for Element Flow | 25/01/12 | 26/01/12 | Kinect development going faster than planned |
Kinect: Create ElementFlowMenu layout | 28/01/12 | 30/01/12 | Kinect: Create ElementFlowMenu layout | 28/01/12 | 30/01/12 | ||
View and Display Elements | 27/01/12 | 03/02/12 | View and Display Elements | 27/01/12 | 03/02/12 | ||
View Retailer-Generated Content | 29/01/12 | 04/02/12 | View Retailer-Generated Content | 29/01/12 | 04/02/12 | ||
Web app: Manage Catalog | 01/02/12 | 04/02/12 | Web app: Manage Catalog | 01/02/12 | 06/02/12 | Still need to touch up on UI |
| | | Kinect: Special Function | 06/02/12 | 07/02/12 | Added a special function to justify use of Kinect |
Project Metrics
The following is a summary analysis of our project metrics for the first 4 iterations. For more details of each metric, please click here.
Project Management Metric
Schedule Metric
Apart from Iterations 2 and 3, the team managed to finish the planned tasks within the estimated period allocated for each iteration. Iteration 2 took 2 days longer as the team went through more revisions to the storyboards than planned; this allowed us to make sure our scope was agreeable to all our stakeholders. Iteration 3 took 4 more days than estimated, and the team's course of action was to extend the iteration. However, only the back-end web app was affected; the Kinect front-end remained on schedule.
So far, the schedule metric has served our team well by giving us a better grasp of the actual duration needed to develop a function, so that we can improve our estimates for similar functions in future iterations. This helped our Project Manager when revising the project schedule at the end of each iteration.

For example, during the first iteration, the team's estimates for the assigned tasks were inaccurate for most tasks, because the Project Manager initially based them on his experience with Windows Phone 7 development. However, the schedule metric results collected during the first few iterations helped the team revise the estimated durations for Iterations 4 to 7 to better reflect the effort actually required.
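As an illustration of how the schedule metric can feed back into estimation, a minimal sketch follows. The task records and numbers below are illustrative, not the team's recorded figures:

```python
# Hypothetical plan-vs-actual records: (task, planned_days, actual_days).
tasks = [
    ("Review Project Scope and Storyboards", 9, 11),
    ("Application Paging", 5, 7),
    ("3D Two-Handed Scroll", 1, 1),
]

# Schedule metric: ratio of actual to planned duration per task.
ratios = [actual / planned for _, planned, actual in tasks]
bias = sum(ratios) / len(ratios)  # > 1 means we tend to underestimate

def revised_estimate(planned_days: float) -> float:
    """Scale a fresh estimate by the overrun bias observed in past iterations."""
    return planned_days * bias

print(round(bias, 2))                  # average overrun factor
print(round(revised_estimate(4), 1))   # adjusted estimate for a 4-day task
```

A multiplicative correction like this is the simplest choice; a real tool could track bias per task category (front-end vs. back-end) instead of one global factor.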
Code Metrics
Maintainability Index Metric
The Maintainability Index is the only metric that has colour-coded ratings. It is used to quickly identify trouble spots in code. Together with the cyclomatic complexity and lines-of-code metrics, it can reveal code smells: symptoms in a program's source code that may indicate a deeper problem.
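A sketch of how such an index is commonly computed. This follows the formula Visual Studio documents for its code-metrics Maintainability Index; the input values below are illustrative, not measurements from our codebase:

```python
import math

def maintainability_index(halstead_volume: float, cyclomatic: int, loc: int) -> int:
    """Visual Studio-style Maintainability Index, rescaled to 0-100."""
    raw = (171
           - 5.2 * math.log(halstead_volume)
           - 0.23 * cyclomatic
           - 16.2 * math.log(loc))
    return max(0, int(raw * 100 / 171))

def rating(mi: int) -> str:
    # Visual Studio's colour bands: red 0-9, yellow 10-19, green 20-100.
    if mi < 10:
        return "red"
    if mi < 20:
        return "yellow"
    return "green"

mi = maintainability_index(halstead_volume=1500.0, cyclomatic=12, loc=80)
print(mi, rating(mi))
```

The logarithms mean the index punishes the first growth in size and complexity heavily, then flattens out, which is why a very long method can still sit just above the green threshold.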
Code Metrics Analysis
For example, logJointStatus() is a candidate for refactoring: it has a relatively low maintainability index (though still above 20) and a high cyclomatic complexity relative to other methods, and it takes up almost half of the total lines of code in its parent class.
Project Risks
Summary
Throughout the project, we have learnt that the impact and likelihood of certain risks presented during the project acceptance presentation did not turn out the way our team expected. For example, we used to be quite worried about integrating the Bump API into the application, but it turned out that the unstable Bump connection (due to Bump's Android API currently being in beta) was a bigger cause for concern. Next, the frequent disruptions to our server's uptime during Iteration 1 did not recur in subsequent iterations, so the risk arising from our reliance on the iLAB infrastructure was downgraded. Similarly, the Bump service downtime observed in Dec 2010 has not recurred since, so the likelihood of the Bump service being down for maintenance was also downgraded.

Lastly, when the semester began to kick into full swing, our team started to feel the heavy workload of juggling FYP with other SMU modules, with their assignment deadlines and midterms. Hence, we added this as a risk that was very likely and of high impact.
No | Types of Risk | Reasons | Likelihood | Impact | Mitigation Strategy | Status |
---|---|---|---|---|---|---|
1 | Project Management Risk | Project scope not optimal for Kinect app | Medium | High | | Mitigated |
| | Own project risk: we do not know what’s best for us | High | High | See all our pre-survey results here: Pre-Survey Results | Mitigated |
2 | Technology & Learning Risk | Lack of documentation for Kinect SDK | High | High | Follow latest Kinect samples uploaded by online community | Mitigated |
| | No suitable gestures library that we can use | High | High | Write our own gesture library | Eliminated |
| | Self-written motion gestures may be inaccurate / inconsistent / too difficult to perform | High | High | | Mitigated |
| | Low audio recognition accuracy | High | Medium | Do AEC; reduce reliance on voice commands | Mitigated |
Technical Complexity
The technical complexity of each task/function/feature is shown below.
Task/function/features | Description | % of Code from API | % of Self-written Code |
---|---|---|---|
Gestures | | 0.1% | 99.9% |
Augmented Reality | | 30% (Kinect Startup Tuts) | 70% |
Audio Beam Angle | Detect source audio angle & ignore background noise | ? | ? |
ElementFlow | 3D WPF graphics | 50% | 50% |
Product Catalog Management | | 0% | 100% |
Kinect Page Management | | 0% | 100% |
Facial Recognition (removed) | | 50% | 50% |
Quality of Product
Quality of code | Description |
---|---|
Algorithm-based motion gesture detection | |
3D Graphics in WPF for menus & product browsing | |
Dynamic page order/creation | Hierarchy and appearing order of Pages, e.g. menus can be configured |
Dynamic tabs & fields for products | |
Tagging of products to multiple pages | User can determine which products appear on which page |
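The algorithm-based gesture detection above was written in C# against the Kinect SDK; since the wiki includes no code, here is a minimal language-neutral sketch (in Python) of how a swipe detector of this kind can work. The class name, thresholds, and sample frames are all illustrative, not the project's tuned values:

```python
from collections import deque

class SwipeDetector:
    """Minimal right-swipe detector over a sliding window of hand positions."""

    def __init__(self, min_distance=0.4, max_seconds=0.5):
        self.min_distance = min_distance  # metres the hand must travel...
        self.max_seconds = max_seconds    # ...within this time window
        self.history = deque()            # (timestamp, hand_x) samples

    def update(self, timestamp: float, hand_x: float) -> bool:
        self.history.append((timestamp, hand_x))
        # Drop samples that fell out of the time window.
        while self.history and timestamp - self.history[0][0] > self.max_seconds:
            self.history.popleft()
        # Fire when the hand moved far enough left-to-right inside the window.
        if hand_x - self.history[0][1] >= self.min_distance:
            self.history.clear()  # avoid double-firing on the same motion
            return True
        return False

d = SwipeDetector()
frames = [(0.00, 0.0), (0.10, 0.15), (0.20, 0.30), (0.30, 0.45)]
print([d.update(t, x) for t, x in frames])  # fires only on the last frame
```

Requiring both a distance and a time bound is what separates a deliberate swipe from slow drifting of the hand, which is one reason self-written gestures need the tuning the risk table mentions.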
Intermediate Deliverables
Stage | Specification | Modules |
---|---|---|
Project Management | Meeting Minutes | Team meetings: Minutes 1-12 |
Supervisor: Minutes 1-12 | ||
Requirements | Final Presentation | Final Presentation Slides |
Analysis | Use case | Use Case Diagram & Documentation |
Service Diagram | Service Diagram | |
Design | Architecture System Diagram | System Architecture |
ER Diagram | ER Diagram |
Deployment
1. Our web services running on the application back-end server
2. SQL Server 2008 database on the application back-end server, with access granted to the web services
3. Secondary backup web services/DB running on Azure as a precautionary measure
4. Local machine to run the Kinect app, with the Kinect SDK installed, the Kinect plugged in, and output to a 32” TV
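The primary/backup arrangement in points 1-3 can be sketched as a simple failover: the Kinect app tries the iLAB-hosted service first and falls back to the Azure copy. The endpoint URLs and the fetch function below are hypothetical stand-ins, not the project's real addresses:

```python
# Hypothetical endpoints standing in for the iLAB and Azure deployments.
PRIMARY = "https://ilab.example.edu/catalog"
BACKUP = "https://bazinga.azurewebsites.example/catalog"

def fetch_catalog(fetch, endpoints=(PRIMARY, BACKUP)):
    """Try each endpoint in order; `fetch` is expected to raise OSError on failure."""
    last_error = None
    for url in endpoints:
        try:
            return fetch(url)
        except OSError as err:
            last_error = err  # remember why this endpoint failed, try the next
    raise RuntimeError("all catalog endpoints unreachable") from last_error

# Simulated outage: the primary server is down, the Azure backup answers.
def fake_fetch(url):
    if url == PRIMARY:
        raise OSError("connection refused")
    return {"source": url, "products": []}

print(fetch_catalog(fake_fetch)["source"])
```

Passing the fetch function in makes the failover logic testable without network access; in the real app it would wrap the actual web-service call.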
User Testing 2
Our User Test 2 started on 21/3/12 and ended on 27/3/12, with a total of 35 testers participating. To ensure that our test results were not affected by bias, our testers included students from different faculties and a mixture of both Kinect and non-Kinect users, so that we could anticipate the general public's receptiveness to our application once it is launched. The following documents were prepared for User Testing 2.

Feel free to view the schedule and User Testing execution process in the first link below. The second link is the feedback form that all testers filled in after the test.
Prepared Documents | Link |
---|---|
User Test 2 Schedule, Execution Process | UT 2 Schedule and Execution process |
User Test 2 Feedback Questions | SMU Qualtrics Feedback Questions |
Download UT 2 Feedback | |
User Test 2 Plan | Download UT 2 Plan |
User Test 2 Video Tutorial Transformation Process | Our Video Tutorial Improvements and Timeline |
User Test 2 Instructions for Testers | Download instructions |
UT2 Goals
Collect useful usability and gesture feedback from our testers to better understand their needs and improve our existing application.
UT2 Objectives
- Unguided Test for all testers
- Verify if the user finds it intuitive and easy to pick up hand gestures
- Verify the usability and easy navigation of the application
- Gather user opinion on Kinect Application feasibility in a retail setting
Testers Demographics
Type | Number | Percentage |
---|---|---|
SIS Students | 25 | 71% |
Non-SIS Students | 9 | 26% |
Professors and Instructors | 1 | 3% |
Comments: Non-SIS students represent the user population from other faculties in SMU. Their comments tend to be much less technical, reflecting most of the actual target public users, and they required much more guidance than SIS students, who are more experienced with application development and new technologies. Non-SIS students gave our feedback an interesting extra dimension, commenting more on the business feasibility of the Kinect application, whereas the SIS students were more inclined towards the technology and told us more about which functions are useful.

The instructor, on the other hand, is also very technical and gave us the best advice on helping a user pick up gestures naturally and on how suitable the application would be for the public, based on her vast experience with many applications.

In addition, we had a good mix of males and females, giving us a balanced set of reviews that is not biased towards one gender.
Type | Number | Percentage |
---|---|---|
Male | 19 | 54% |
Female | 16 | 46% |
Black Box Testing
To ensure that UT2 met our objective of letting testers use the application without our help, we made it an unguided User Test. We wanted to verify whether users could pick up the gestures naturally and give us opinions on how to further improve the application. We therefore placed instructions beside the tester (on both the left and right), and the tester followed the steps on their own; this let us identify which features users needed more help with, without anyone providing immediate assistance.
User Testing 2 Outcome
Apart from the quantitative results, our team received very positive feedback and suggestions on how to improve our application. Most testers felt that the Kinect application is cool and interactive, though there is still room to improve the Augmented Reality function and the gestures used in the application. They see potential in the application; only a minority of 3 testers said they would not want to use it again, while all the other testers would.

Our team has prioritised the following list of issues to address before UAT2.
Things to Improve | How we are doing it | When? |
---|---|---|
More instructions on the hand movement | Filming tutorial videos, isolate tutorial from the feature itself | Iteration 5 |
Voice recognition is not very accurate | Ignore sound from other directions using Acoustic Echo Cancellation (AEC) | Iteration 5 |
| Review gestures | Iteration 6 (pre-planned) |
Expand AR feature | Prioritize AR over Take Survey and allocate | Iterations 5 & 6 |
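The AEC row above aims to stop the application's own audio from being recognised as commands; combined with the audio beam angle idea from the Technical Complexity table, a command-acceptance check might look like the sketch below. The function name and thresholds are illustrative, not the project's tuned values:

```python
def accept_command(confidence: float, source_angle_deg: float,
                   min_confidence=0.7, max_angle_deg=20.0) -> bool:
    """Accept a recognised voice command only when the recogniser is confident
    and the audio beam points roughly at the player in front of the sensor."""
    return confidence >= min_confidence and abs(source_angle_deg) <= max_angle_deg

# Player speaking in front of the Kinect vs. chatter from the side:
print(accept_command(0.9, 5.0))   # in-cone and confident: accepted
print(accept_command(0.9, 45.0))  # background voice off to the side: rejected
print(accept_command(0.4, 0.0))   # low-confidence recognition: rejected
```

Gating on both confidence and direction is a cheap way to cut false positives in a noisy cafe setting like the Glassroom deployment.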
User Testing 2 Documents
For more information on our User Test 2, you can download our documents below:
Documents | Link |
---|---|
UT 2 Results/Analysis | UT 2 Results |
User Test 2 Schedule and Final Results | UT 2 Observation Results |
Beta Testing
Reflections
Team Reflections
These are the skills that we have learnt so far, from each iteration:
Iteration 1 and 2 | |
Soft Skills | Technical Skills |
|
|
Iteration 3 and 4 | |
Soft Skills | Technical Skills |
|
|
Individual Reflection
These are our individual reflections: