IS480 Team wiki: 2017T2 Tetris Midterm
Project Progress Summary
Team Tetris is on Iteration 11, and our next milestone is Midterms on 21 February 2017. We have deployed our project online and completed one round of User Testing. You may download the proposal we submitted for Acceptance and our Midterm presentation slide deck for reference.
Project Highlights:
What unexpected events occurred?
1) We spent three weeks of the December break learning Drools for the rule engine and Quartz for timing/scheduling jobs, but ended up using neither.
Instead of the Drools library, we represented rules as String expressions, which gave us more flexibility when customising rules. In short, String expressions fit the task better than Drools (a sketch of this approach follows this list).
As for Quartz, creating a Job (task) for every single alert was inefficient. It was simpler and more efficient to implement a servlet context listener that runs in the background and checks active alerts for whether their escalation conditions have been met. The listener runs as a single job instead of many Quartz jobs (see the second sketch after this list).
2) The water sensor was not reliable, so the sponsor asked us to focus on customisation of the rule engine instead.
3) A team member was unavailable for two iterations, which affected the quality of the functions delivered.
4) UT 1 was delayed by a large number of bugs but has been completed; UT 2 has been pushed back to after the midterm, while the UT 3 date is unchanged.
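A minimal sketch of the String-expression approach described in point 1, assuming rules are boolean expressions evaluated against named facts via Java's built-in ScriptEngine. The class name, rule text, and fact names here are illustrative assumptions, not our actual code:

```java
// Hypothetical sketch: rules held as plain Strings and evaluated at runtime.
// Assumes a JavaScript ScriptEngine is available (bundled in Java 8-14).
import java.util.Map;
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;
import javax.script.SimpleBindings;

public class StringRuleEvaluator {
    private final ScriptEngine engine =
            new ScriptEngineManager().getEngineByName("JavaScript");

    /**
     * Evaluates a rule held as a String, e.g.
     * "missedDoses >= 2 && hoursSinceLastPresence > 12".
     * Because the rule is just a String, thresholds can be customised
     * per client without recompiling a Drools knowledge base.
     */
    public boolean matches(String ruleExpression, Map<String, Object> facts)
            throws ScriptException {
        return Boolean.TRUE.equals(
                engine.eval(ruleExpression, new SimpleBindings(facts)));
    }
}
```

For instance, `evaluator.matches("missedDoses >= 2", Map.of("missedDoses", 3))` would return true; editing the String changes the rule with no redeployment.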
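And a sketch of the single background job that replaced the per-alert Quartz jobs. The AlertService calls and the one-minute polling interval are illustrative assumptions; only the ServletContextListener wiring is standard Servlet 3.0 API:

```java
// Hypothetical sketch: one background poller instead of one Quartz job per alert.
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.annotation.WebListener;

@WebListener
public class AlertEscalationListener implements ServletContextListener {
    private ScheduledExecutorService scheduler;

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        // A single recurring task scans all active alerts.
        scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(this::checkActiveAlerts, 0, 1, TimeUnit.MINUTES);
    }

    private void checkActiveAlerts() {
        // Illustrative pseudologic: escalate alerts whose conditions are met.
        // for (Alert alert : alertService.findActive()) {
        //     if (alert.escalationConditionMet()) alertService.escalate(alert);
        // }
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        if (scheduler != null) scheduler.shutdownNow();
    }
}
```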
Project Management
Provide more details about the status, schedule and the scope of the project. Describe the complexity of the project.
Project Status:
We highlight the status of each function and list the to-dos (tasks we still need to do to bring the function to 100% implementation).
You may view the function details here.
Function | Status | Confidence (0-1) | Comments
Core Functions | | |
Account | 90% implemented. Deployed, UT done, client approved. | 0.8 | Assigned to Wei Liang (lead). Done by Samantha & Wei Liang.
Caregiver Group | 70% implemented. Deployed, UT done, client approved. | 0.7 | Assigned to Wei Liang (lead). Done by Samantha & Wei Liang. To-do: validation & detailed testing with rules. Note: changes in the use case affected the module scope in Iteration 8. We had 3 meetings with the client to get her further input, in addition to consultation with her; our supervisor was also present.
SMS | 90% implemented. Deployed, with US$40 in the Twilio account. | 1.0 | Assigned to Avinash. To-do: mobile number confirmation. (A Twilio sketch follows the table.)
Medication Data Retrieval & Rule Engine | 100% implemented. Deployed; rules in the rule engine approved by client. | 0.75 | Assigned to Irshad (lead), Avinash, Elliotz. To-do: scheduling recurring schedules using a time-only input; more detailed testing & debugging.
Presence Data Retrieval & Rule Engine | 70% implemented. Deployed; rules in the rule engine approved by client. | 0.65 | Assigned to Irshad (lead), Avinash, Elliotz. To-do: more detailed testing & debugging.
Movement Data Retrieval & Rule Engine | 100% implemented. Deployed; rules in the rule engine approved by client. | 0.6 | Assigned to Irshad (lead), Avinash, Elliotz. To-do: more detailed testing & debugging.
Average | 87% | 0.75 |
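Since the SMS function above sends alerts through Twilio, here is a minimal sketch of the kind of call involved, assuming the twilio-java helper library. The credentials, phone numbers, and message body are placeholders, not the team's actual configuration:

```java
// Minimal Twilio SMS sketch (assumes the twilio-java helper library).
// Credentials and numbers are placeholders, not the project's real values.
import com.twilio.Twilio;
import com.twilio.rest.api.v2010.account.Message;
import com.twilio.type.PhoneNumber;

public class SmsSender {
    public static void main(String[] args) {
        Twilio.init(System.getenv("TWILIO_ACCOUNT_SID"),
                    System.getenv("TWILIO_AUTH_TOKEN"));

        // Each sent message draws down the account balance (US$40 at midterm).
        Message message = Message.creator(
                new PhoneNumber("+6591234567"),   // caregiver's number (placeholder)
                new PhoneNumber("+12025550123"),  // Twilio number (placeholder)
                "Alert: medication dose missed.")
                .create();

        System.out.println("Sent SMS, SID: " + message.getSid());
    }
}
```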
Project Schedule (Plan Vs Actual):
Compare the project plan during acceptance with the actual work done at this point. Briefly describe a summary here: everything went as planned, everything has changed and the team is working on a new project with new sponsors, or the supervisor is missing. A good source for this section is the project weekly report.
Provide a comparison of the planned and actual schedule. Has the project scope expanded or reduced? You can use the table below or your own Gantt charts.
Iteration | Module | Planned | Actual | Comments
1 | Customer CRUD | 1 Sept 2010 | 25 Aug 2010 | Fiona took the Sales CRUD as well.
1 | Trend Analytic | 1 Sept 2010 | 15 Sept 2010 | Ben was too busy and pushed iteration 1 back.
2 | User tutorial | 1 Oct 2010 | | Removal proposed by Ben.
2 | Psycho analysis | 1 Oct 2010 | | New module proposed by the sponsor.
Project Metrics:
You may view the metrics here.
Project Risks:
Update the proposal assumptions and risks. Describe what you learned from the risk update and the mitigation steps taken.
Risk | Probability | Impact | Mitigation
Sponsor wants to use Joomla instead of Drupal | High | High | Team is evaluating Joomla to write an impact analysis report.
Sponsor deployment machine approval and support | High | Medium (now low) | Use the UPL machine.
Be sure to prioritize the risks.
Technical Complexity:
Describe and list the technical complexity of your project in order of highest complexity first. For example, deploying on iPhone using Objective-C, customizing Drupal with own database, quick search for shortest flight path, database structure, etc.
Quality of product
Provide more details about the quality of your work. For example: you designed a flexible, configurable system using XML config files; used the Strategy design pattern to allow plugging in different strategies; or implemented a regular-expression parser to support a flexible formula editor.
Intermediate Deliverables:
There should be some evidence of work in progress.
Stage | Specification | Modules
Project Management | Minutes | Sponsor weeks -10, -5, 3, 7; Supervisor weeks -2, 3, 5, 7
Project Management | Metrics | Bug metrics
Requirements | Story cards | CRUD Customer, Trend Analytic
Analysis | Use case | Overall
Analysis | System Sequence Diagram | Client, server
Analysis | Business Process Diagram | Here
Analysis | Screen Shots | CRUD Customer, Trend Analysis
Design | ER Diagram | 1, 2, 3
Design | Class Diagram | 1, 2, 3
Testing | User test plan | Instructions
Not all parts of the deliverables are necessary, but the evidence should be convincing of the progress. Try to include design deliverables that show the quality of your project.
Deployment:
In an iterative approach, a ready-to-use system should be available (deployed) for the client, with instructions for accessing the system described here (user name). If necessary, provide a link to a deployment diagram.
Testing:
Describe the testing done on your system: for example, the number of user testing rounds, tester profiles, test cases, survey results, the issue tracker, bug reports, etc.
Reflection
In this section, describe what the team has learned. Be brief. Sometimes the client writes a report giving feedback on the system; this sponsor report can be included or linked from here.
Team Reflection:
Any training and lessons learned? What are the takeaways so far? It would be very convincing if the knowledge were shared in the wiki knowledge base and linked here.
Benjamin Gan Reflection:
You may include individual reflections if that makes sense at this point. The team is uncooperative and did not follow my instructions.