Latest revision as of 10:35, 7 October 2012
Project Progress Summary
Summary Overview
Since 23 December 2011, we have gone through 4 iterations, completing the assigned functionality on time in each. During these 3 months, our team accomplished the following:
- Picked up Kinect development from scratch
- Designed and planned the blueprint for our Kinect Application
- Delivered a working Kinect application with 70% of the functionality completed
- Conducted User Acceptance Test 1 with a total of 107 testers
Project Highlights
While the team put much effort into planning the execution of the project, there were some unexpected events that forced the team to take corrective measures, as listed below:
1 | Took 2 weeks to learn the Kinect SDK |
At the beginning of application development, our team discovered that Kinect development is rather difficult: compared with mobile or ordinary web development, there are few relevant resources available online. In addition, the initial Kinect SDK only provides the coordinates of the human body's joints, making it very difficult for us to build from scratch the body gestures that normal Kinect games have.
Thus, our group decided to try out the two available gesture libraries we found through the Kinect community. These libraries could recognize and detect motion gestures accurately, but they had limitations, such as not easily supporting new gestures or detecting new users, which made it very difficult for us to fully utilize their functions. Because of this, our core programmer, Kiwie, decided to write our own gesture library from scratch. While building the gestures, he also relied on a third-party library that enables different layouts for viewing products; the layouts were well designed and easy to implement. | |
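The algorithm-based swipe detection described here can be sketched roughly as follows. This is an illustrative Python sketch, not the team's actual C#/WPF gesture library; the class name, window size, and distance threshold are all assumptions. The idea is simply to track a hand joint's x-coordinate over a short window of frames and report a swipe when the hand travels far enough in one consistent direction.

```python
from collections import deque

class SwipeDetector:
    """Illustrative left/right swipe detector over hand x-coordinates (assumed design)."""

    def __init__(self, window=10, min_distance=0.4):
        # window: number of recent frames kept; min_distance: metres the hand
        # must travel across the window to count as a swipe (assumed values).
        self.history = deque(maxlen=window)
        self.min_distance = min_distance

    def update(self, hand_x):
        """Feed one frame's hand x-coordinate; return 'left', 'right', or None."""
        self.history.append(hand_x)
        if len(self.history) < self.history.maxlen:
            return None
        travel = self.history[-1] - self.history[0]
        # Require consistent motion: every frame-to-frame step in the same direction.
        frames = list(self.history)
        steps = [b - a for a, b in zip(frames, frames[1:])]
        if travel >= self.min_distance and all(s >= 0 for s in steps):
            self.history.clear()
            return "right"
        if travel <= -self.min_distance and all(s <= 0 for s in steps):
            self.history.clear()
            return "left"
        return None

# Example: hand moves steadily right, 0.05 m per frame, 0.45 m in total.
detector = SwipeDetector()
result = None
for i in range(10):
    result = detector.update(i * 0.05) or result
print(result)  # right
```

A real detector would also need to handle noisy skeletal data and per-user calibration, which is part of why writing the library from scratch took effort.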
2 | Unforeseen delays to our project during Iterations 2 and 3 |
Two of our team members' laptops crashed during Iterations 2 and 3, delaying the affected tasks; the team's response was to extend the iteration to absorb the delay.
| |
3 | Change of Project Idea after Project Acceptance |
During the Project Acceptance, both Prof Shim and Prof Gan provided us with very concrete and useful feedback on how we could improve our project further. All of us spent some time thinking about their feedback and had several discussions before deciding to change our project idea and scenario. We even held two supervisor meetings with Prof Shim to pitch our new ideas to her; she helped us a lot in steering the project in the right direction. We agreed on the need to change our project scope and functionalities partly because we wanted to target the masses, rather than build our project around one particular software product, even one used by many companies around the world. In addition, we wanted to fully utilize the Kinect's main functions (skeletal tracking, voice recognition, facial recognition) and find a scenario where using the Kinect is simpler than using a keyboard and mouse.
The ideas can be found in our Meeting Minutes 7 and 8 and Supervisor Meeting Minutes 4 and 5. Our team gave top priority to improving the scope, then moved on to changing the Project Plan and other technical documentation. Our project manager made the necessary changes to the project plan and schedule to accommodate the newly proposed functionality, and we reprioritized several planned functionalities to ensure that the main functions were completed.
| |
4 | User Acceptance Testing 1 |
During the UAT1,
The changes made in our project schedule include:
|
Project Management
Project Schedule
Project Status
Task/function/features, etc | Status | Complexity Level (0-10) |
Comment |
Login with Facial Recognition using Face.com | Function to be kept in view - fully developed but removed for the time being | 9 | Kiwie |
Kinect with SAP integration: Pick Items | Broadened scope instead of limiting to SAP - removed function for the time being | 6 | Kiwie |
Kinect: Apply WPF Skin/Theme | Complete with constant updates in each iteration, UAT1 done | 5 | Carmen & Joy |
Kinect: Listbox scrolling with Kinect | Fully deployed and tested 100%, UAT1 done | 7 | Kiwie |
Kinect Gestures: Recognize Left and Right Swipes | Fully deployed and tested 100%, UAT1 done | 9 | Kiwie |
Kinect: Implement ElementFlow | Fully deployed and tested 100%, UAT1 done | 8 | Kiwie |
Kinect Gestures: Single Hand Scroll for Element Flow | Fully deployed and tested 100%, UAT1 done | 7 | Kiwie |
Application Paging (get Root Page, page and MenuOptions) | Fully deployed and tested 100%, UAT1 done | Kinect: 3, Backend: 7 | Kiwie and Alex |
Kinect Gestures: Recognize Up Swipes | Fully deployed and tested 100%, UAT1 done | 5 | Kiwie |
Kinect: KinectListMenu (Template and Display Selected MenuOption Preview) | Fully deployed and tested 100%, UAT1 done | 5 | Kiwie |
Kinect: HalfCircleMenu (Template and Databind Menu Options to HalfCirclePanel) | Fully deployed and tested 100%, UAT1 done | 6 | Kiwie |
Kinect Gestures: Vertical Scrolling for HalfCirclePanel | Fully deployed and tested 100%, UAT1 done | 8 | Kiwie |
Kinect: ElementFlow Product Browsing | Fully deployed and tested 100%, UAT1 done | 8 | Kiwie |
Kinect Gestures: 3D Two-Handed Scroll for Element Flow | Fully deployed and tested 100%, UAT1 done | 9 | Kiwie |
Kinect: ElementFlowMenu (Template and Databind MenuOptions to ElementFlow) | Fully deployed and tested 100%, UAT1 done | 7 | Kiwie |
View and Display Elements (tabs, prices, colors and features) | Fully deployed and tested 100%, UAT1 done | Kinect: 3, Backend: 7 | Kiwie and Alex |
View and Display Retailer-Generated Content | Fully deployed and tested 100%, UAT1 done | Kinect: 3, Backend: 6 | Kiwie and Alex |
Web app: Manage catalog | 70% completed | 9 | Alex |
Kinect: Instructional Video | 90% completed | 8 | Bevan |
Project Schedule (Plan vs Actual)
Comparing our project plan at acceptance with the actual work done, our team feels that almost all tasks/functions have been completed as planned. We made some minor adjustments to the project schedule to accommodate an additional functionality that justifies the benefits of a Kinect app over a touchscreen app.
Planned | Actual | ||||||
Iteration | Task | Start | End | Task | Start | End | comment |
1 | Create WPF App Project for Kinect Front-end | 23/12/11 | 24/12/11 | Create WPF App Project for Kinect Front-end | 23/12/11 | 24/12/11 | |
Login with Facial Recognition | 24/12/11 | 27/12/11 | Login with Facial Recognition | 24/12/11 | 27/12/11 | ||
Main Menu Navigation (Listbox) | 28/12/11 | 01/01/12 | Main Menu Navigation (Listbox) | 28/12/11 | 01/01/12 | ||
Pick Items (with SAP) | 01/01/12 | 05/01/12 | Pick Items (with SAP) | 01/01/12 | 04/01/12 | ||
Create ASP.NET Web App for back-end | 05/01/12 | 06/01/12 | Create ASP.NET Web App for back-end | 05/01/12 | 06/01/12 | ||
2 | Review Project Scope and Storyboards | 10/01/12 | 19/01/12 | Review Project Scope and Storyboards | 10/01/12 | 21/01/12 | Went through more revisions than expected |
Kinect Gestures: Recognize Left/Right Swipes | 12/01/12 | 13/01/12 | Kinect Gestures: Recognize Left/Right Swipes | 12/01/12 | 13/01/12 | ||
Kinect: ListBox Scrolling by Kinect | 13/01/12 | 14/01/12 | Kinect: ListBox Scrolling by Kinect | 13/01/12 | 14/01/12 | ||
Kinect: Implement ElementFlow | 14/01/12 | 16/01/12 | Kinect: Implement ElementFlow | 14/01/12 | 16/01/12 | ||
Kinect Gestures: Single Hand Scroll for ElementFlow | 16/01/12 | 17/01/12 | Kinect Gestures: Single Hand Scroll for ElementFlow | 16/01/12 | 17/01/12 | ||
3 | Application Paging | 19/01/12 | 24/01/12 | Application Paging | 21/01/12 | 28/01/12 | Member's laptop malfunctioned |
Kinect: View KinectListMenu | 19/01/12 | 20/01/12 | Kinect: View KinectListMenu | 19/01/12 | 20/01/12 | ||
Kinect Gestures: Recognize Up Swipes | 20/01/12 | 21/01/12 | Kinect Gestures: Recognize Up Swipes | 20/01/12 | 21/01/12 | ||
Kinect: Create HalfCircleMenu template | 20/01/12 | 21/01/12 | Kinect: Create HalfCircleMenu template | 20/01/12 | 21/01/12 | ||
Kinect Gestures: Vertical Scrolling for HalfCirclePanel | 20/01/12 | 21/01/12 | Kinect Gestures: Vertical Scrolling for HalfCirclePanel | 20/01/12 | 21/01/12 | ||
Kinect: Create ElementFlow Product Browsing template | 20/01/12 | 25/01/12 | Kinect: Create ElementFlow Product Browsing template | 21/01/12 | 25/01/12 | ||
4 | Kinect Gestures: 3D Two-Handed Scroll for Element Flow | 27/01/12 | 28/01/12 | Kinect Gestures: 3D Two-Handed Scroll for Element Flow | 25/01/12 | 26/01/12 | Kinect development going faster than planned |
Kinect: Create ElementFlowMenu layout | 28/01/12 | 30/01/12 | Kinect: Create ElementFlowMenu layout | 28/01/12 | 30/01/12 | ||
View and Display Elements | 27/01/12 | 03/02/12 | View and Display Elements | 27/01/12 | 03/02/12 | ||
View Retailer-Generated Content | 29/01/12 | 04/02/12 | View Retailer-Generated Content | 29/01/12 | 04/02/12 | ||
Web app: Manage Catalog | 01/02/12 | 04/02/12 | Web app: Manage Catalog | 01/02/12 | 06/02/12 | Still need to touch up on UI | |
Kinect: Special Function | 06/02/12 | 07/02/12 | Added a special function to justify use of Kinect |
Project Metrics
The following is a summary analysis of our project metrics for the first 4 iterations. For more details of each metric, please click here.
Project Management Metric
Schedule Metric
Apart from Iterations 2 and 3, the team managed to finish the planned tasks within the period estimated for each iteration. Iteration 2 took 2 days longer as the team went through more storyboard revisions than planned; this allowed us to make sure our scope was agreeable to all our stakeholders. Iteration 3 took 4 days longer than estimated, and the team's course of action was to extend the iteration. However, only the back-end web app was affected; the Kinect front-end stayed on schedule.
So far, the schedule metric has served our team well by giving us a better grasp of the actual duration needed to develop a function, so that we can improve our estimates for similar functions in future iterations. This helped our Project Manager when revising the project schedule at the end of each iteration.
For example, during the first iteration, the team's estimates for the assigned tasks were rather inaccurate (the Project Manager had initially drawn on his experience with Windows Phone 7 development to estimate the schedule). However, the schedule metric results collected during the first few iterations helped the team revise the estimated durations for Iterations 4 to 7 to better reflect the effort the remaining iterations would require.
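As a hypothetical illustration of the schedule metric, the slip per task can be computed directly from planned versus actual end dates; the function name below is invented, but the dates come from the Plan-vs-Actual table in the previous section.

```python
from datetime import date

def slip_days(planned_end, actual_end):
    """Positive result = the task finished later than planned."""
    return (actual_end - planned_end).days

# Figures taken from the Plan-vs-Actual table above.
tasks = [
    ("Review Project Scope and Storyboards", date(2012, 1, 19), date(2012, 1, 21)),
    ("Application Paging",                   date(2012, 1, 24), date(2012, 1, 28)),
    ("Web app: Manage Catalog",              date(2012, 2, 4),  date(2012, 2, 6)),
]

for name, planned, actual in tasks:
    d = slip_days(planned, actual)
    print(f"{name}: {d:+d} day(s)")
```

Aggregating these slips per iteration is what lets the metric feed back into the estimates for later iterations.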
Code Metrics
Maintainability Index Metric
The Maintainability Index is the only metric that has colour-coded ratings. It is used to quickly identify trouble spots in the code. Together with the cyclomatic complexity and lines-of-code metrics, it can reveal code smells: symptoms in the source code of a program that may indicate a deeper problem.
Code Metrics Analysis
For example, logJointStatus() is a candidate for refactoring: it has a relatively low maintainability index (even though it is above 20) and high complexity relative to other methods, and it takes up almost half of the total lines of code in its parent class.
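For reference, Visual Studio (which the team used to collect these metrics) computes its Maintainability Index from Halstead volume, cyclomatic complexity, and lines of code, rescaled to a 0–100 range and colour-banded: red below 10, yellow 10–19, green 20 and above. The Python reimplementation below is purely illustrative of that formula; the sample inputs are invented.

```python
import math

def maintainability_index(halstead_volume, cyclomatic_complexity, lines_of_code):
    """Visual Studio's rescaled Maintainability Index (0-100, higher is better)."""
    raw = (171
           - 5.2 * math.log(halstead_volume)
           - 0.23 * cyclomatic_complexity
           - 16.2 * math.log(lines_of_code))
    return max(0, raw * 100 / 171)

def rating(mi):
    # Visual Studio's colour bands: red < 10, yellow 10-19, green >= 20.
    if mi < 10:
        return "red"
    if mi < 20:
        return "yellow"
    return "green"

# A long, complex method scores low; a short, simple one scores high.
print(rating(maintainability_index(3000, 25, 400)))  # yellow (MI ~15.5)
print(rating(maintainability_index(50, 2, 10)))      # green (MI ~66)
```

This is why a method like logJointStatus() can sit in the "green" band yet still be worth refactoring: the index is relative, and a large, complex method drags on the whole class.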
Project Risks
Summary
Throughout the project, we have learnt that the impact and likelihood of certain risks presented during the project acceptance presentation did not turn out the way our team thought they would. For example, we used to be quite worried about integrating the Bump API into the application, but the unstable Bump connection (due to Bump's Android API currently being in beta) turned out to be a bigger cause for concern. Next, the frequent disruptions to our server's uptime seen during Iteration 1 did not recur in subsequent iterations, so our reliance on the iLAB infrastructure was downgraded as a risk. Similarly, the Bump service downtime observed in Dec 2010 has not recurred since, so the likelihood of the Bump service being down for maintenance was also downgraded.
Lastly, when the semester began to kick into full swing, our team started to feel the heavy workload of juggling FYP with other SMU modules, such as assignment deadlines and midterms. Hence, we added this as a risk that is very likely and of high impact.
No | Types of Risk | Reasons | Likelihood | Impact | Mitigation Strategy | Status |
1 | Project Management Risk |
Project scope not optimal for Kinect app |
Medium |
High |
|
Mitigated |
Own project risk: we do not know what’s best for us |
High |
High |
See all our pre-survey results here: Pre-Survey Results |
Mitigated | ||
2 | Technology & Learning Risk |
Lack of documentation for Kinect SDK |
High |
High |
Follow latest Kinect samples uploaded by online community |
Mitigated |
No suitable gestures library that we can use |
High |
High |
Write our own gesture library |
Eliminated | ||
Self-written motion gestures may be inaccurate/ inconsistent / too difficult to perform |
High |
High |
|
Mitigated | ||
Low audio recognition accuracy |
High |
Medium |
Implement Acoustic Echo Cancellation (AEC); reduce reliance on voice commands
Mitigated |
Technical Complexity
The technical complexity of each task/function/feature is shown below.
Task/function/features, etc | Description | % of Codes from API | % of Self written code |
Gestures
|
|
0.1% | 99.9% |
Augmented Reality
|
|
30% (Kinect Startup Tuts) | 70% |
Audio Beam Angle
|
Detect source audio angle & ignore background noise
|
? | ? |
ElementFlow
|
3D WPF graphics
|
50% | 50% |
Product Catalog Management
|
|
0% | 100% |
Kinect Page Management
|
|
0% | 100% |
Facial Recognition (removed)
|
|
50% | 50% |
Quality of Product
Quality of code | Description |
Algorithm-based motion gesture detection |
|
3D Graphics in WPF for menus & product browsing |
|
Dynamic page order/creation | Hierarchy and appearing order of Pages, e.g. menus can be configured
|
Dynamic tabs & fields for products |
|
Tagging of products to multiple pages | User can determine which products appear on which page
|
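The dynamic page ordering and product-tagging rows above amount to a configurable tree of pages plus a many-to-many product-to-page mapping. A minimal Python sketch, with invented names and sample data (the real data lives in the SQL Server back-end behind the ASP.NET web app):

```python
# Pages form a configurable hierarchy: each page records its parent,
# so the appearing order and nesting of menus can be changed in data,
# not in code. (Names and structure here are assumptions.)
pages = {
    "Root":   {"parent": None},
    "Phones": {"parent": "Root"},
    "Deals":  {"parent": "Root"},
}

# Product -> set of pages it is tagged to (many-to-many: one product
# may appear on several pages at once).
tags = {
    "Phone X": {"Phones", "Deals"},
    "Phone Y": {"Phones"},
}

def products_on(page):
    """Products tagged to a given page, in sorted order."""
    return sorted(p for p, on in tags.items() if page in on)

print(products_on("Deals"))   # ['Phone X']
print(products_on("Phones"))  # ['Phone X', 'Phone Y']
```

Keeping both the hierarchy and the tags in data is what lets the retailer reorganise the catalog without redeploying the Kinect front-end.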
Intermediate Deliverables
Stage | Specification | Modules |
Project Management | Meeting Minutes | Team meetings: Minutes 1-12 |
Supervisor: Minutes 1-7 | ||
Requirements | Mid Term Presentation | Mid Term Presentation Slides |
Analysis | Use case | Use Case Diagram & Documentation |
Service Diagram | Service Diagram | |
Design | Architecture System Diagram | System Architecture |
ER Diagram | ER Diagram | |
UAT 1 Documents | UAT 1 Test Plan | UAT 1 Test Plan |
UAT 1 Test Introduction, Survey Questions and initial Test Case | UAT 1 Test Introduction, Survey Questions and initial Test Case | |
UAT 1 Results and Analysis | UAT 1 Results |
Deployment
1. Web services running on our application back-end server
2. SQL Server 2008 database on the same back-end server, with access granted to the web services
3. Secondary backup web services/DB running on Azure as a precautionary measure
4. Local machine to run the Kinect app, with the Kinect SDK installed, the Kinect plugged in, and output to a 32” TV
User Test 1
Our team's UAT 1 ran from 6/2/12 to 9/2/12. A total of 107 testers participated. To ensure that our test results were not affected by bias, our testers included students from different faculties and a handful of SIS professors and instructors. They also included a mixture of Kinect and non-Kinect users, so that we can anticipate the general public's receptiveness to our application once it is launched.
UAT 1 Goals
Collect useful usability and gesture feedback from our testers to better understand their needs and improve our existing application
UAT 1 Objectives
- Guided Test for all Users
- Verify if the user finds it intuitive and easy to pick up hand gestures
- Verify the usability and easy navigation of the application
- Gather user opinion on Kinect Application feasibility in a retail setting
Click here to see our UAT Schedule List.
Click here to see our UAT Feedback Questions, consisting of 14 MCQs and 4 open ended questions.
Testers Demographics
Type | Number | Percentage |
IS Students | 73 | 68% |
Non-IS Students | 29 | 27% |
Professors and Instructors | 5 | 5% |
Comments: Non-SIS students represent the user population from other faculties in SMU; their comments tend to be much less technical, reflecting most of the actual target public users. They also required much more guidance than SIS students, who are experienced in application development. They gave the feedback an interesting dimension, commenting more on the business feasibility of the Kinect application, whereas SIS students were more inclined towards the technology and told us more about which functions are useful.
The professors and instructors, on the other hand, are also very technical; drawing on their experience with many applications, they gave us the best advice on how a user can pick up gestures naturally and on how suitable the application would be for the public.
In addition, we had a good mix of males and females, giving us a spread of reviews not biased toward one gender.
Test Case
To ensure that UAT1 met our objectives of gathering feedback mainly on the gestures and the usability of the application, we made it a guided UAT. We wanted to verify whether users could pick up the gestures naturally and get their opinions on how to improve. Thus, at this stage, we did not need the users to run through a test case checking the usage of each function.
UAT1 Outcome
Apart from the quantitative results, our team received very positive feedback and numerous suggestions on how we can improve our application. Most testers felt that the Kinect application is cool and interactive, though there is still room for improvement in ease of use and ease of learning. They see potential in this application, but a minority still felt it is not very feasible in a retail setting, as only one user can use the application at a time.
Our team has prioritized the following list of issues to address before UAT2.
Things to Improve | How we are doing it | When? |
More instructions on the hand movement | Filming tutorial videos, isolate tutorial from the feature itself | Iteration 5 |
Voice recognition is not very accurate | Ignore sound from other directions using Acoustic Echo Cancellation (AEC) | Iteration 5 |
|
Review gestures | Iteration 6 (pre-planned) |
Expand AR feature | Prioritize AR over Take Survey and allocate | Iterations 5 & 6 |
UAT Documents
For more information on our UAT 1, you can download our documents below:
Documents | Link |
UAT Plan | download |
UAT Introduction, Survey Questions and initial Test Case | download |
UAT Results/Analysis | download |
Reflections
Team Reflections
These are the skills that we have learnt so far in each iteration:
Iteration 1 and 2 | |
Soft Skills | Technical Skills |
|
|
Iteration 3 and 4 | |
Soft Skills | Technical Skills |
|
|
Individual Reflection
These are our individual reflections: