Difference between revisions of "IS480 Team wiki: 2012T2 box.us MidTerms"
{| border="0"
| [[Image:Schedule-Ratio-Over-Time.jpg | 500px | Schedule Ratio over Time]] || Bug Metrics Graph
|-
|
* We are generally meeting our schedule as time goes by; most delays are caused by differences between the estimated and the actual time required for each task.
* The spike in Iteration 7 is due to changes in the requirements of the Task components.
|}
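A schedule ratio like the one charted above is conventionally computed as estimated effort divided by actual effort per iteration, so values below 1.0 indicate slippage. A minimal sketch of that calculation, assuming this definition — the function name and the per-iteration figures are illustrative, not the team's actual data:

```python
def schedule_ratio(estimated_days, actual_days):
    """Schedule ratio for one iteration: estimated effort / actual effort.

    1.0 means on schedule; below 1.0 means tasks took longer than
    estimated (slippage); above 1.0 means ahead of schedule.
    """
    if actual_days <= 0:
        raise ValueError("actual_days must be positive")
    return estimated_days / actual_days

# Hypothetical figures for illustration only (estimated, actual days).
iterations = {
    "Iteration 6": (10, 10),  # on schedule
    "Iteration 7": (8, 12),   # Task component requirements changed
}
for name, (estimated, actual) in iterations.items():
    print(f"{name}: {schedule_ratio(estimated, actual):.2f}")
```

Plotting this ratio per iteration gives a graph of the shape shown above, where a dip flags the iteration whose estimates were off.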
Revision as of 15:02, 6 February 2013
Project Progress Summary
- Mid Terms Slides: Coming Soon
- Mid Terms Deployed Site: Coming Soon
For the proposal, please see Requirements at the Project Deliverables. This will help us understand your scope. Note the wiki policy here.
This page should NOT be too long. It should link to other pages in the IS480 team wiki. Do not repeat the proposal or other wiki information here. However, keep a snapshot of the midterm state. Highlight changes since project acceptance.
Describe the project progress briefly here. Has the project continued as planned? If not, is the team confident it can complete? This is a crossroads for the team to make a decision: proceed with confidence or file an incomplete.
Project Highlights:
What unexpected events occurred?
- "Task" functionality took longer than expected to scope and develop
- Small details that led to major changes
- Dropping of mobile functionalities
- No clearly defined business process initially
- Differences in understanding of coding conventions
Project Management
Provide more details about the status, schedule and the scope of the project. Describe the complexity of the project.
Project Status:
Functions | Prototyped | Implemented | User Tested | Client Approved | Confidence Level | Comments |
Registration | Completed | Completed | User Testing 1 | Pending | 1 | Finishing up user interface for registration of Empact and NPO. Going on to do heuristics evaluation. |
Task | Completed | Completed | User Testing 1 | Pending | 1 | The user interface for pending tasks remains to be completed. Includes Task Review items. |
Questions | Completed | Completed | User Testing 1 | Pending | 1 | Final touch-ups to be done, then subject to heuristics testing. |
Search | Completed | In Progress | User Testing 2 | Pending | 1 | Task search and NPO search is not yet completed. |
Volunteer Records | Completed | In Progress | User Testing 2 | Pending | 1 | Task search and NPO search is not yet completed. |
Statistics | Not started | Not started | User Testing 3 | Pending | 1 | Includes the dashboard for volunteers, Empact and NPOs. |
Notification | Not started | Not started | User Testing 3 | Pending | 1 | Includes notification |
Project Schedule (Plan Vs Actual):
Compare the project plan during acceptance with the actual work done at this point, and briefly summarize here. Did everything go as planned, has everything changed, or is the team working on a new project with new sponsors while the supervisor is missing? A good source for this section is the project weekly report.
Provide a comparison of the plan and actual schedule. Has the project scope expanded or reduced? You can use the table below or your own gantt charts.
Iteration | Module | Planned | Actual | Comments
1 | Customer CRUD | 1 Sept 2010 | 25 Aug 2010 | Fiona took the Sales CRUD as well.
1 | Trend Analytic | 1 Sept 2010 | 15 Sept 2010 | Ben was too busy and pushed iteration 1 back.
2 | User tutorial | 1 Oct 2010 | | Removal proposed by Ben.
2 | Psycho analysis | 1 Oct 2010 | | New module proposed by sponsor.
Project Metrics:
Project Risks:
Update the proposal assumptions and risks. Describe what you learned from the risk update and the mitigation steps taken.
Risk | Probability | Impact | Mitigation |
Sponsor wants to use Joomla instead of Drupal | High | High | Team is evaluating Joomla to write an impact analysis report |
Sponsor deployment machine approval and support | High | Medium (now low) | Use UPL machine |
Be sure to prioritize the risks.
Technical Complexity:
Describe and list the technical complexities of your project, highest complexity first. For example: deploying on iPhone using Objective-C, customizing Drupal with your own database, quick search for the shortest flight path, database structure, etc.
Quality of product
Provide more details about the quality of your work. For example: you designed a flexible, configurable system using XML .config files, used the Strategy design pattern to allow plugging in different strategies, or implemented a regular expression parser to support a flexible formula editor.
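As an illustration of the Strategy design pattern mentioned above, a minimal Python sketch — all names here (MatchingStrategy, SkillMatch, TaskAssigner) are hypothetical examples in the spirit of a volunteer/task system, not the team's actual code:

```python
from abc import ABC, abstractmethod

class MatchingStrategy(ABC):
    """Interchangeable algorithm interface (names are illustrative)."""
    @abstractmethod
    def match(self, volunteers, task):
        ...

class SkillMatch(MatchingStrategy):
    """Pick volunteers whose skills cover the task's required skill."""
    def match(self, volunteers, task):
        return [v for v in volunteers if task["skill"] in v["skills"]]

class FirstComeFirstServed(MatchingStrategy):
    """Fill the task's slots in sign-up order, ignoring skills."""
    def match(self, volunteers, task):
        return volunteers[: task["slots"]]

class TaskAssigner:
    """Context object: delegates to whichever strategy is plugged in."""
    def __init__(self, strategy: MatchingStrategy):
        self.strategy = strategy

    def assign(self, volunteers, task):
        return self.strategy.match(volunteers, task)
```

Swapping the strategy object changes the matching behaviour without touching TaskAssigner, which is the point of the pattern: new algorithms plug in without modifying the calling code.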
Intermediate Deliverables:
There should be some evidence of work in progress.
Stage | Specification | Modules
Project Management | Minutes | Sponsor weeks -10, -5, 3, 7; Supervisor weeks -2, 3, 5, 7
Project Management | Metrics | Bug metrics
Requirements | Story cards | CRUD Customer, Trend Analytic
Analysis | Use case | Overall
Analysis | System Sequence Diagram | Client, server
Analysis | Business Process Diagram | Here
Analysis | Screen Shots | CRUD Customer, Trend Analysis
Design | ER Diagram | 1, 2, 3
Design | Class Diagram | 1, 2, 3
Testing | User test plan | Instructions
Not all parts of the deliverables are necessary, but the evidence should be convincing of the progress. Try to include design deliverables that show the quality of your project.
Deployment:
In an iterative approach, a ready-to-use system should be available (deployed) for the client, with instructions for accessing the system described here (user name). If necessary, provide a deployment diagram link.
Testing:
Describe the testing done on your system. For example, the number of user testing, tester profile, test cases, survey results, issue tracker, bug reports, etc.
Reflection
In this section, describe what the team has learned. Be brief. Sometimes the client writes a report giving feedback on the system; this sponsor report can be included or linked from here.
Team Reflection:
Any training and lessons learned? What are the takeaways so far? It would be very convincing if the knowledge is shared at the wiki knowledge base and linked here.
Benjamin Gan Reflection:
You may include individual reflections if that makes sense at this point. The team is uncooperative and did not follow my instructions.