IS480 Team wiki: 2017T2 Mavericks Midterms Wiki
- 1 Project Progress Summary
- 2 Project Management
- 3 Quality of product
- 4 Reflection
Project Progress Summary
Midterm slides and deployed site link will be posted here by 18 February 2018.
This project is split into 15 sprints; we are currently in iteration 11 (12 February to 25 February 2018). We have completed 96% of our core functions and 50% of our secondary functions. Over these 11 iterations, we have conducted 3 User Testing sessions with actual students from Singapore Management University and student interns from Ngee Ann Polytechnic. We have also completed a major milestone: our Proof of Deployment lab session at Ngee Ann Polytechnic, with 15 users. One unexpected event was underestimating the time needed to complete certain functionalities, because some APIs were unavailable and the team was unfamiliar with the technologies used. To resolve this, the team came together to revise the project schedule and ensure all deadlines are met. The team is therefore confident of completing this project.
- Requirement changes
- Added new functionality for Customer Request module: Standing Instruction
- Revised process flow for Educational module
- Included new features in Admin Analytics Dashboard
- Descriptive Analysis
- Cluster Analysis
- Completed 3 User Testing sessions
- Conducted Proof of Deployment at Ngee Ann Polytechnic
- 15 students from Financial Informatics course
- Achieved and exceeded our midterm X-factor
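The feature list above names Cluster Analysis as a new Admin Analytics Dashboard capability without describing how it works. As a rough illustration only — the user metrics, data values, and choice of k below are hypothetical and not taken from the project — a minimal k-means in pure Python could group users by engagement:

```python
# Illustrative sketch, not the project's actual implementation.
# Groups users by two made-up engagement metrics (quiz score,
# modules completed) using a minimal k-means, standard library only.
import math
import random

def kmeans(points, k, iterations=50, seed=42):
    """Cluster 2-D points into k groups; returns (centroids, labels)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iterations):
        # Assignment step: each point joins its nearest centroid.
        labels = [min(range(k), key=lambda c: math.dist(p, centroids[c]))
                  for p in points]
        # Update step: move each centroid to the mean of its members.
        for c in range(k):
            members = [p for p, lbl in zip(points, labels) if lbl == c]
            if members:
                centroids[c] = (sum(x for x, _ in members) / len(members),
                                sum(y for _, y in members) / len(members))
    return centroids, labels

# Hypothetical (quiz score, modules completed) pairs for six users.
users = [(95, 10), (90, 9), (88, 10), (30, 2), (35, 3), (28, 1)]
centroids, labels = kmeans(users, k=2)
print(labels)
```

In practice a dashboard like this would more likely call a library implementation (for example, scikit-learn's KMeans) than hand-rolled code; the sketch only shows the idea behind the feature.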
Project Schedule (Plan vs Actual):
Quality of product
- Analysis: System Architecture Diagram
- Testing: User Test 1, User Test 2, User Test 3
- Internal Testing is performed for every new function developed
- Team Mavericks has scheduled 5 User Testing sessions in total
- We had completed 2 User Testing sessions as of Iteration 9
For more information on User Testing and to view the results, please refer to our User Testing page.
Reflection
This project has made us realise that positive synergy is the holy grail of teamwork. Despite our differences in backgrounds, capabilities and perspectives, we built common goals: to excel and to get the most out of this learning process. Through open communication and respect for one another, we were able to reach consensus when solving problems. Since acceptance, our team has improved our overall performance in both project management and web application development. In addition, we achieved and exceeded our midterm X-factor. This is highly encouraging for our team; moving forward, Team Mavericks is even more motivated to push through and ensure the project's success.
Yi Xiang's Reflection:
We should be open to change in order to improve. Even if we have an idea we believe is the best, we should be open to feedback and make the necessary changes. Ultimately, the target users are the ones who decide whether an idea is good. It is therefore important to conduct Proof of Deployment and user testing to gather users' feedback for improvement.
What I have learnt is that in a software project, non-functional requirements are extremely important. In fact, they can be even more difficult to fulfil than functional ones, and they are often overlooked or underestimated in project plans.