IS480 Team wiki: 2017T2 Tetris Midterm
Revision as of 14:37, 20 February 2017



Project Progress Summary

Team Tetris is on Iteration 11, and our next milestone is Midterms on 21 February 2017. We have deployed our project online and completed 1 round of User Testing. You may download our proposal submitted for Acceptance and our Midterm presentation slide deck for your reference.

Highlight changes since project acceptance.

Describe the project progress briefly here. Has the project continued as planned? If not, is the team confident it can complete? This is a crossroads for the team to make a decision: proceed with confidence or file an incomplete.

Project Highlights:

What unexpected events occurred?

1) Took three weeks to learn Drools for the rule engine and Quartz for timing/scheduling jobs.

2) The water sensor wasn't reliable; the sponsor wanted us to focus on customisation of the rule engine instead.

3) A team member wasn't available for two iterations.

4) Delayed UT 1 due to a large number of bugs; pushed UT 2 back, while the UT 3 date stays.
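The Drools learning curve mentioned in point 1 concerns the rule engine at the heart of the project. As a rough illustration only (the team's actual rules are written in Drools DRL and matched by its engine; the names below are hypothetical), the condition/action idea behind a rule engine can be sketched in plain Java:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Predicate;

// Minimal condition/action rule engine sketch. Illustrative only: Drools
// provides pattern matching, agendas, and conflict resolution on top of this idea.
public class MiniRuleEngine<T> {
    private static final class Rule<U> {
        final Predicate<U> condition;
        final Consumer<U> action;
        Rule(Predicate<U> condition, Consumer<U> action) {
            this.condition = condition;
            this.action = action;
        }
    }

    private final List<Rule<T>> rules = new ArrayList<>();

    // Register a rule as a (condition, action) pair.
    public void addRule(Predicate<T> condition, Consumer<T> action) {
        rules.add(new Rule<>(condition, action));
    }

    // Fire the action of every rule whose condition matches the fact;
    // returns how many rules fired.
    public int fireAll(T fact) {
        int fired = 0;
        for (Rule<T> rule : rules) {
            if (rule.condition.test(fact)) {
                rule.action.accept(fact);
                fired++;
            }
        }
        return fired;
    }
}
```

A caregiver alert rule, for instance, would pair a condition on sensor data with an action that sends an SMS.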

Project Management

Provide more details about the status, schedule and the scope of the project. Describe the complexity of the project.

Project Status:

Highlight changes to modules, the completion status (implemented, user testing done, client approved, deployed, etc.), the confidence level (0-1, where 0 means no confidence of getting it done and 1 means 100% confidence) and comments (who has been assigned to do it, new scope, removed scope, etc.). Please use a table format to summarize, with links to function details.

{| border="1"
|- style="background:blue; color:white"
|| Function
|align="center"| Status
|align="center"| Confidence Level (0-1)
|align="center"| Todo
|-
|| <b>Core Functions</b>
||
||
||
|-
|| Account
|| 100%
|| 0.8
|| Validation & Profile Photo
|-
|| Caregiver Group
|| 70%
|| 0.7
|| Validation & detailed testing with rules. Changes in use case affected module scope in Iteration 8.
|-
|| SMS
|| 90%
|| 1.0
|| Mobile Number Confirmation
|-
|| Medication Data Retrieval & Rule Engine
|| 75%
|| 0.75
|| Scheduling recurring schedule using input of time only; more detailed testing & debugging
|-
|| Presence Data Retrieval & Rule Engine
|| 65%
|| 0.65
|| More detailed testing & debugging
|-
|| Movement Data Retrieval & Rule Engine
|| 60%
|| 0.6
|| More detailed testing & debugging
|-
|| <b>Average</b>
|| <b>76.7%</b>
|| <b>0.75</b>
||
|}
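The Medication module's remaining todo, scheduling a recurring schedule from a time-of-day input only, boils down to computing the next fire time for a daily dose. A minimal sketch under assumed names (the team's actual scheduling runs on Quartz; `nextFireTime` is a hypothetical helper, not their code):

```java
import java.time.LocalDateTime;
import java.time.LocalTime;

// Sketch: given only a time of day (e.g. 08:30 for a daily medication dose),
// compute the next datetime at which the reminder should fire.
public class RecurringSchedule {
    public static LocalDateTime nextFireTime(LocalTime doseTime, LocalDateTime now) {
        LocalDateTime todayFire = LocalDateTime.of(now.toLocalDate(), doseTime);
        // If today's dose time has already passed (or is right now),
        // schedule the reminder for the same time tomorrow.
        return now.isBefore(todayFire) ? todayFire : todayFire.plusDays(1);
    }
}
```

In a Quartz-based implementation this calculation is expressed instead as a daily cron or simple trigger built from the same time-of-day input.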

Project Schedule (Plan Vs Actual):

Compare the project plan during acceptance with the actual work done at this point. Briefly summarize here: everything went as planned, everything has changed and the team is working on a new project with new sponsors, or the supervisor is missing. A good source for this section is the project weekly report.

Provide a comparison of the planned and actual schedule. Has the project scope expanded or reduced? You can use the table below or your own Gantt charts.

{| border="1"
|- style="background:blue; color:white"
|| Iteration
|| Module
|| Planned
|| Actual
|| Comments
|-
|| 1
|| Customer CRUD
|| 1 Sept 2010
|| 25 Aug 2010
|| Fiona took the Sales CRUD as well.
|-
||
|| Trend Analytic
|| 1 Sept 2010
|| 15 Sept 2010
|| Ben is too busy and pushed iteration 1 back.
|-
|| 2
|| User tutorial
|| 1 Oct 2010
||
|| Removed, proposed by Ben
|-
||
|| Psycho analysis
|| 1 Oct 2010
||
|| New module proposed by sponsor
|}

Project Metrics:

You may view the metrics [https://wiki.smu.edu.sg/is480/IS480_Team_wiki%3A_2017T2_Tetris_Project_Metrics here].

Project Risks:

Update the proposal assumptions and risks. Describe what you learn from the risk update and mitigation steps taken.

{| border="1"
|- style="background:blue; color:white"
|| Risk
|| Probability
|| Impact
|| Mitigation
|-
|| Sponsor wants to use Joomla instead of Drupal
|| High
|| High
|| Team evaluating Joomla to write an impact analysis report
|-
|| Sponsor deployment machine approval and support
|| High
|| Medium (now it is low)
|| Use UPL machine
|}

Be sure to prioritize the risks.

Technical Complexity:

Describe and list the technical complexity of your project in order of highest complexity first. For example, deploying on iPhone using Objective-C, customizing Drupal with own database, quick search for shortest flight path, database structure, etc.

Quality of product

Provide more details about the quality of your work. For example: you designed a flexible, configurable system using XML .config files, used the Strategy design pattern to allow plugging in different strategies, or implemented a regular expression parser to power a flexible formula editor.
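As a concrete illustration of the Strategy design pattern the template text above suggests (names here are hypothetical, not from the team's codebase), a notification channel such as SMS could be made pluggable like this:

```java
// Strategy pattern sketch: the caller is written against SendStrategy,
// so the delivery channel can be swapped without changing Notifier.
public class Notifier {
    public interface SendStrategy {
        String send(String recipient, String message);
    }

    // Two interchangeable strategies; real ones would call an SMS gateway
    // or a mail server instead of formatting a string.
    public static final SendStrategy SMS =
        (to, msg) -> "SMS to " + to + ": " + msg;
    public static final SendStrategy EMAIL =
        (to, msg) -> "Email to " + to + ": " + msg;

    private final SendStrategy strategy;

    public Notifier(SendStrategy strategy) {
        this.strategy = strategy;
    }

    public String notifyUser(String recipient, String message) {
        return strategy.send(recipient, message);
    }
}
```

Usage: `new Notifier(Notifier.SMS).notifyUser("91234567", "Take medication")` delivers via the SMS strategy; constructing with `Notifier.EMAIL` switches the channel with no other code changes.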

Intermediate Deliverables:

There should be some evidence of work in progress.

{| border="1"
|- style="background:blue; color:white"
|| Stage
|| Specification
|| Modules
|-
|| Project Management
|| Minutes
|| Sponsor weeks -10, -5, 3, 7; Supervisor weeks -2, 3, 5, 7
|-
||
|| Metrics
|| Bug metrics
|-
|| Requirements
|| Story cards
|| CRUD Customer, Trend Analytic
|-
|| Analysis
|| Use case
|| overall
|-
||
|| System Sequence Diagram
|| client, server
|-
||
|| Business Process Diagram
|| Here
|-
||
|| Screen Shots
|| CRUD Customer, Trend Analysis
|-
|| Design
|| ER Diagram
|| 1, 2, 3
|-
||
|| Class Diagram
|| 1, 2, 3
|-
|| Testing
|| User test plan
|| instructions
|}

Not all parts of the deliverables are necessary but the evidence should be convincing of the progress. Try to include design deliverables that shows the quality of your project.

Deployment:

In an iterative approach, a ready-to-use system should be available (deployed) for the client, with instructions to access the system described here (user name). If necessary, provide a deployment diagram link.

Testing:

Describe the testing done on your system. For example: the number of user tests, tester profiles, test cases, survey results, issue tracker, bug reports, etc.

Reflection

In this section, describe what the team has learned. Be brief. Sometimes the client writes a report giving feedback on the system; this sponsor report can be included or linked from here.

Team Reflection:

Any training and lessons learned? What are the takeaways so far? It would be very convincing if the knowledge were shared on the wiki knowledge base and linked here.

Benjamin Gan Reflection:

You may include individual reflections if that makes sense at this point. The team is uncooperative and did not follow my instructions.