Revision as of 00:17, 25 February 2015
Project Progress Summary
Midterm Presentation Slides: File:Midterm BeeSkilled.pptx
Website Link: SMU tBank Automated Clearing House
| Milestones Achieved | Explanation |
|---|---|
| 7 Iterations | Completed 7 iterations with an average SM score of 103 |
| User Testing | 1 heuristic evaluation and 1 user testing session |
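The "SM score" above appears to be a per-iteration schedule metric. A minimal sketch of how such a score and its average could be tallied, assuming the common planned-vs-actual duration ratio (the team's exact formula may differ):

```python
# Hedged sketch of a schedule metric: SM = planned / actual * 100,
# so a score above 100 means the iteration finished ahead of plan.
# This is an assumed formula, not necessarily the team's own.

def schedule_metric(planned_days: float, actual_days: float) -> float:
    """Schedule score for one iteration (100 = exactly on plan)."""
    return planned_days / actual_days * 100


def average_sm(iterations: list[tuple[float, float]]) -> float:
    """Average schedule score over (planned, actual) day pairs."""
    scores = [schedule_metric(p, a) for p, a in iterations]
    return sum(scores) / len(scores)
```

For example, `average_sm([(12, 10), (10, 12.5)])` averages one ahead-of-plan and one behind-plan iteration to exactly 100.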
Project Highlights:
Project Management
Provide more details about the status, schedule and the scope of the project. Describe the complexity of the project.
Project Status:
Completed Functions
Outstanding Functions
Project Schedule (Plan Vs Actual):
Planned Timeline
Actual Timeline
Compare the project plan during acceptance with the actual work done at this point, and briefly summarize here: did everything go as planned, has everything changed, is the team working on a new project with new sponsors, or is the supervisor missing? A good source for this section is the project weekly report.
Provide a comparison of the planned and actual schedule. Has the project scope expanded or been reduced? You can use the table below or your own Gantt charts.
| Iteration | Issues Faced | Solutions | Mitigation |
|---|---|---|---|
| 2 | 'Functionality: Create Gateway' took longer than expected to become stable and ready for use in Payment Orchestration | After researching and consulting our Project Sponsor, we decided to spend extra time reconstructing the gateways and on additional testing, to ensure that both the Automated Clearing House (ACH) and Payments Service Hub (PSH) gateways are tested and ready to be integrated with upcoming functionalities. | 'Functionality: Payment Orchestration' will be pushed back to a subsequent iteration, as both gateways are its supporting foundation and should be thoroughly tested first |
| 4 | Underestimated the time required for 'Functionality: Payment Orchestration' | Continue developing 'Functionality: Payment Orchestration' and push back other functionalities to subsequent iterations. Concurrently, Payment Orchestration was broken up into 2 phases of development, separated into 2 iterations | Reschedule the next iteration with the added functionalities and milestones |
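The mitigation above rests on the two gateways sharing one contract that orchestration is later built on. That split can be sketched as a common interface; every name here (`PaymentGateway`, `ACHGateway`, `PSHGateway`, `orchestrate`) is an illustrative assumption, not the team's actual code:

```python
from abc import ABC, abstractmethod


class PaymentGateway(ABC):
    """Common contract so orchestration code is gateway-agnostic."""

    @abstractmethod
    def submit(self, payment: dict) -> str:
        """Submit a payment and return a confirmation reference."""


class ACHGateway(PaymentGateway):
    def submit(self, payment: dict) -> str:
        # A real implementation would call the ACH endpoint.
        return f"ACH:{payment['id']}"


class PSHGateway(PaymentGateway):
    def submit(self, payment: dict) -> str:
        # A real implementation would call the PSH endpoint.
        return f"PSH:{payment['id']}"


def orchestrate(payment: dict, gateways: dict[str, PaymentGateway]) -> str:
    # Route by payment type. Because both gateways satisfy the same
    # contract, each can be tested in isolation before orchestration
    # is layered on top, which is what the mitigation plan relies on.
    return gateways[payment["type"]].submit(payment)
```

Testing each gateway against the shared interface first is what lets Payment Orchestration be deferred safely to a later iteration.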
Project Metrics:
Project Risks:
Technical Complexity:
Quality of product
Intermediate Deliverables:
There should be some evidence of work in progress.
| Stage | Specification | Modules |
|---|---|---|
| Project Management | Minutes | [Meeting Minutes](//wiki.smu.edu.sg/is480/IS480_Team_wiki%3A_2015T2_BeeSkilledDocumentation) |
| Project Management | Metrics | [Bug metrics](//wiki.smu.edu.sg/is480/IS480_Team_wiki%3A_2015T2_BeeSkilledDocumentation) |
| Project Management | Metrics | [Schedule metrics](//wiki.smu.edu.sg/is480/IS480_Team_wiki%3A_2015T2_BeeSkilledDocumentation) |
| Requirements | Design Specifications | Project Scope |
| Analysis | Use case | Use Case Diagram |
| Analysis | Architecture Diagram | Architecture Diagram |
| Testing | User test plan | [User Test Results](//wiki.smu.edu.sg/is480/IS480_Team_wiki%3A_2015T2_BeeSkilled_User_Test) |
Not all parts of the deliverables are necessary, but the evidence should be convincing of the progress. Try to include design deliverables that show the quality of your project.
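The bug metrics deliverable above is typically a severity-weighted tally per iteration. A minimal sketch, assuming illustrative weights of low = 1, high = 5, critical = 10 (the team's actual weighting may differ):

```python
# Hedged sketch of a severity-weighted bug score. The weights below
# are assumptions for illustration, not the team's actual scheme.
WEIGHTS = {"low": 1, "high": 5, "critical": 10}


def bug_score(severities: list[str]) -> int:
    """Sum of severity weights for the bugs found in one iteration."""
    return sum(WEIGHTS[s] for s in severities)
```

An iteration logging one bug of each severity would score 1 + 5 + 10 = 16, and the per-iteration scores can then be charted over time to show whether quality is trending up or down.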
Deployment:
In an iterative approach, a ready-to-use system should be available (deployed) for the client, with instructions to access the system described here (user name). If necessary, provide a deployment diagram link.
Testing:
Describe the testing done on your system. For example, the number of user testing sessions, tester profiles, test cases, survey results, issue tracker, bug reports, etc.
Reflection
In this section, describe what the team has learned. Be brief. Sometimes the client writes a report giving feedback on the system; this sponsor report can be included or linked from here.
Team Reflection:
Any training and lessons learned? What are the takeaways so far? It would be very convincing if the knowledge were shared at the wiki knowledge base and linked here.
Benjamin Gan Reflection:
You may include individual reflections if that makes sense at this point. The team is uncooperative and did not follow my instructions.