IS480 Team wiki: 2015T1 Vulcan Midterm
Revision as of 04:19, 7 October 2015
Project Progress Summary
Place your Midterm slides link and deployed site link here
For the proposal, please see Requirements under Project Deliverables. This will help us understand your scope. Note the wiki policy here.
This page should NOT be too long. It should link to other pages in the IS480 team wiki. Do not repeat the proposal or other wiki information here. However, keep a snapshot of the midterm state, and highlight changes since project acceptance.
Describe the project progress briefly here. Has the project continued as planned? If not, is the team confident it can complete? This is a crossroads for the team: proceed with confidence or file an incomplete.
Highlights of Project:
What unexpected events occurred?
- LiveLabs relocated its servers on 5 Oct 2015, two days before the Midterm Presentation
- LiveLabs servers were overloaded (data growing at 1 GB/minute), which appeared to have been caused by us
- The API level of the phone borrowed from the school was too low for our development
Project Management
Provide more details about the status, schedule and the scope of the project. Describe the complexity of the project.
Project Status:
Highlight changes to modules, the completion status (implemented, user testing done, client approved, deployed, etc.), the confidence level (0-1, where 0 means no confidence of getting it done and 1 means 100% confidence), and comments (who has been assigned to do it, new scope, removed scope, etc.). Please use a table format to summarize, with links to function details.
Task/function/features, etc. | Status | Confidence Level (0-1) | Comment
---|---|---|---
Customer CRUD | Fully deployed and tested 100% | 1 | Fiona
Trend Analytic | 25% | 0.9 | Ben is researching analytic algorithms
Project Schedule (Plan Vs Actual):
Compare the project plan at acceptance with the actual work done at this point. Briefly summarize here: everything went as planned, everything has changed and the team is working on a new project with new sponsors, or the supervisor is missing. A good source for this section is the project weekly report.
Provide a comparison of the plan and actual schedule. Has the project scope expanded or reduced? You can use the table below or your own gantt charts.
Planned Schedule
Actual Schedule
Detailed Planned VS Actual
Iteration | Module | Planned | Actual | Comments
---|---|---|---|---
1 | Customer CRUD | 1 Sept 2010 | 25 Aug 2010 | Fiona took the Sales CRUD as well.
1 | Trend Analytic | 1 Sept 2010 | 15 Sept 2010 | Ben was too busy, pushing iteration 1 back.
2 | User tutorial | 1 Oct 2010 | | Removal proposed by Ben
2 | Psycho analysis | | 1 Oct 2010 | New module proposed by sponsor
Project Metrics:
Summary of analysis for the metrics collected. You may refer to another page for the details about the metrics and how it is collected.
Schedule Metric Formula: (Estimated Days / Actual Days) x 100%
Iteration | Planned Duration (Days) | Actual Duration (Days) | Schedule Metric Score | Action | Status
---|---|---|---|---|---
2 | 18 | 32 | 56.25% | Team is behind schedule due to the complexity of the tasks planned (Android app and smart watch). Follow-up action: rescheduled future iterations and deducted days from buffer days. | Completed
4 | 18 | 24 | 75% | Team is behind schedule due to LiveLabs server permission issues. Follow-up action: rescheduled future iterations and deducted days from buffer days. | Completed
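The schedule metric above is simple enough to compute with a short script. This is an illustrative sketch with the iteration data hard-coded from the table, not pulled from any real tracker:

```python
# Schedule metric: (estimated days / actual days) x 100%
# Scores below 100% mean the iteration took longer than planned.

def schedule_metric(planned_days, actual_days):
    """Return the schedule metric score as a percentage."""
    return planned_days / actual_days * 100

# Iteration data taken from the table above
iterations = {2: (18, 32), 4: (18, 24)}

for it, (planned, actual) in iterations.items():
    print(f"Iteration {it}: {schedule_metric(planned, actual):.2f}%")
```

This reproduces the scores in the table: 18/32 gives 56.25% for iteration 2 and 18/24 gives 75.00% for iteration 4.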
Project Risks:
Update the proposal assumptions and risks. Describe what you learn from the risk update and mitigation steps taken.
Risk | Probability | Impact | Mitigation
---|---|---|---
Sponsor wants to use Joomla instead of Drupal | High | High | Team is evaluating Joomla to write an impact analysis report
Sponsor deployment machine approval and support | High | Medium (now low) | Use UPL machine
Be sure to prioritize the risks.
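One common way to prioritize is to rank risks by exposure, probability times impact. The sketch below assumes a simple High/Medium/Low scale mapped to 3/2/1; this scoring scheme is illustrative and not part of the IS480 template:

```python
# Prioritize risks by exposure = probability x impact,
# mapping High/Medium/Low onto an assumed 3/2/1 scale.

LEVELS = {"High": 3, "Medium": 2, "Low": 1}

def exposure(probability, impact):
    return LEVELS[probability] * LEVELS[impact]

risks = [
    ("Sponsor wants Joomla instead of Drupal", "High", "High"),
    ("Sponsor deployment machine approval and support", "High", "Low"),
]

# Sort with the highest exposure first
ranked = sorted(risks, key=lambda r: exposure(r[1], r[2]), reverse=True)
for name, prob, imp in ranked:
    print(f"{name}: exposure {exposure(prob, imp)}")
```

The top of the sorted list is what the team should mitigate first.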
Technical Complexity:
Describe and list the technical complexities of your project, highest complexity first. For example: deploying on iPhone using Objective-C, customizing Drupal with your own database, quick search for the shortest flight path, database structure, etc.
Quality of product
Provide more details about the quality of your work. For example: you designed a flexible, configurable system using XML .config files, used the Strategy design pattern to allow plugging in different strategies, or implemented a regular expression parser to power a flexible formula editor.
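As an illustration of the Strategy pattern mentioned above, here is a minimal sketch. The class and strategy names are hypothetical and not taken from the project's codebase:

```python
# Strategy pattern: the analyzer delegates to whichever
# trend-detection strategy is plugged in at construction time.
# All names here are illustrative, not from an actual codebase.

class MovingAverageStrategy:
    def analyze(self, data):
        # simple 3-point moving average of the series
        return [sum(data[i:i + 3]) / 3 for i in range(len(data) - 2)]

class MinMaxStrategy:
    def analyze(self, data):
        return [min(data), max(data)]

class TrendAnalyzer:
    def __init__(self, strategy):
        self.strategy = strategy  # any object with an analyze() method

    def run(self, data):
        return self.strategy.analyze(data)

analyzer = TrendAnalyzer(MovingAverageStrategy())
print(analyzer.run([1, 2, 3, 4, 5]))  # [2.0, 3.0, 4.0]
```

Swapping in `MinMaxStrategy()` changes the analysis without touching `TrendAnalyzer`, which is the flexibility the pattern buys.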
Intermediate Deliverables:
There should be some evidence of work in progress.
Stage | Specification | Modules
---|---|---
Project Management | Minutes (Sponsor, Supervisor, Team) | Minutes
 | Metrics | Schedule, Bug, Change Management Metrics
Requirements Gathering | Design Documents | Scenario, Storyboard, Navigation Diagram, Prototype
 | Market Research | Market Research
Analysis | Use case | Participant & Researcher Use Cases
 | Business Process Diagram | Business Process Diagram
Testing | User Testing | User Test
 | Test Plans | Test Cases
Not all parts of the deliverables are necessary, but the evidence should convincingly show progress. Try to include design deliverables that show the quality of your project.
Deployment:
In an iterative approach, a ready-to-use system should be available (deployed) for the client, with instructions to access the system described here (user name). If necessary, provide a deployment diagram link.
Testing:
Number of User Tests: 3
Tester Profile:
Our testers consist of users with research backgrounds, specifically research assistants currently pursuing their PhD in Psychology. With their experience, we were able to gain valuable feedback with regard to the creation of studies.
For more information about the user tests and the detailed results, please visit the link below:
User Test
Test Cases:
For each iteration, we have functional test cases to test individual functions. Towards the end of the iteration, we will do regression testing and go through the entire flow of the project to ensure all parts are working.
For the detailed test cases, please visit the link below:
Test Cases
For our bug metric scores, iteration 5 had an exceptionally high score. This was the aftermath of User Tests 2 and 3, which proved useful in surfacing functional and UI bugs. Even though the bug score was well above the threshold of 10, we managed to resolve all the bugs within the scheduled debugging time.
For the detailed bug reports, please visit the link below:
Bug Metrics
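A bug metric of this kind is typically a severity-weighted count checked against a threshold. The sketch below uses an assumed weighting scheme (low = 1, high = 5, critical = 10 points) and hypothetical bug data; neither is taken from the team's documented metric:

```python
# Severity-weighted bug score checked against a threshold.
# The point values and bug list are assumptions for illustration,
# not the team's documented weights or data.

WEIGHTS = {"low": 1, "high": 5, "critical": 10}
THRESHOLD = 10  # the threshold mentioned in the text above

def bug_score(bugs):
    """bugs: list of severity strings recorded during an iteration."""
    return sum(WEIGHTS[severity] for severity in bugs)

iteration5_bugs = ["low", "low", "high", "high", "critical"]  # hypothetical
score = bug_score(iteration5_bugs)
print(score, "above threshold" if score > THRESHOLD else "within threshold")
```

A score over the threshold triggers the follow-up action, here scheduling extra debugging time.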
Reflection
In this section, briefly describe what the team has learned. Sometimes the client writes a report giving feedback on the system; this sponsor report can be included or linked from here.
Team Reflection:
Any training and lessons learned? What are the takeaways so far? It would be very convincing if the knowledge is shared in the wiki knowledge base and linked here.
Benjamin Gan Reflection:
You may include individual reflections if that makes sense at this point. The team is uncooperative and did not follow my instructions.