IS480 Team wiki: 2016T1 Folium Midterm Wiki
Revision as of 00:22, 26 September 2016 by Bo.hui.2014
Project Progress Summary
Midterm Slides | Deployed Site
Deployment Progress
- Current Iteration: Iteration 10
- Iteration period: 25 September 2016 to 9 October 2016
- Major milestone: Midterm Presentation
Project Highlights
Project Management
Project Status
Planned vs Actual Scope
Planned | Actual |
---|---|
Major changes made
# | Changes | Iteration | Module | Description | Reason | Status |
---|---|---|---|---|---|---|
1 | Allow students to submit files concurrently (+) | 7 | Submission Management | In the previous version, students could not submit a second solution while an existing solution was still in the submission queue awaiting grading. In the current version, students can submit multiple files to the queue. For example, if lab2a.rb is still in the queue, a student can submit lab2b.rb and will see both submissions in the queue. | Supervisor suggested the addition of this function. | Finished and notified Sponsors. |
2 | Allow admin to manage grading machine IP addresses and set the database username & password in the admin interface (+) | 8 | Administration Module | Allows the admin to add grading machines and set the database username and password from the admin interface. | Supervisor requested the ability to manage the database account and grading machines in the admin interface. | Finished and notified Sponsors. |
3 | Enable the grading machine to auto-generate and forward an email to the admin in case of unforeseen circumstances (+) | 9 | Submission Management | When the grader exhibits undesired system behavior, the admin is notified with an auto-generated email. | Sponsor wants to be notified when graders have any undesired behavior. | Development in progress. |
4 | Optimize the scoreboard ranking algorithm (+) | 9 | Ranking Module | Implement a new scoreboard ranking algorithm that lets the admin rank based on total score, which is computed from the time score, quality score and score index. | Sponsor suggested ranking students' submissions based on different quality-score and time-score index values. | Development in progress. |
5 | Retrieve historical data (-) | 9 | Administration Module | Allows the admin to retrieve historical submission data, including the username, question and submission file of a specific submission. | This feature is similar to the earlier feature "Create central repository for all submissions". After discussing with the sponsors, we dropped the feature. | Settled with Sponsors. |
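The new ranking feature (change #4 above) computes a total score from the quality score, time score and admin-set index values. The wiki does not show the actual algorithm, so the sketch below is only an illustration: the weight names (`quality_index`, `time_index`) and the weighted-sum formula are assumptions.

```python
# Minimal sketch of a configurable scoreboard ranking.
# The index weights and the weighted-sum formula are assumptions
# for illustration; the wiki does not show the actual algorithm.

def total_score(submission, quality_index=0.7, time_index=0.3):
    """Combine quality and time scores using admin-set index values."""
    return (submission["quality_score"] * quality_index
            + submission["time_score"] * time_index)

def rank(submissions, quality_index=0.7, time_index=0.3):
    """Return submissions sorted by total score, highest first."""
    return sorted(
        submissions,
        key=lambda s: total_score(s, quality_index, time_index),
        reverse=True,
    )

board = [
    {"user": "alice", "quality_score": 90, "time_score": 60},
    {"user": "bob",   "quality_score": 80, "time_score": 95},
]
print([s["user"] for s in rank(board)])
```

Changing the index values changes the ordering, which is the behavior the sponsor asked for: with a higher quality index the quality score dominates, with a higher time index faster submissions rise.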
Planned vs Actual Project Schedule
Project Metrics
Schedule Metrics
View our Schedule Metrics Here!
Schedule Metrics Highlights
Iteration | Planned Tasks | Actual Tasks | Schedule Metric Score | Action | Status |
---|---|---|---|---|---|
3 | 4 | 3 | 75% | Estimates were generally accurate and on track. Slightly delayed due to not fully understanding the feature related to the command line prompt. After further study, it turned out not to be a bug that needed fixing, and sponsors were updated with the result. Follow-up action: the feature was pushed to the next iteration but removed afterwards, so no buffer days were used. | Completed |
5 | 3 | 2 | 67% | Estimates were optimistic about the test case feature. Delayed due to underestimating its complexity. The team spent time studying the grading architecture and figuring out the principle behind the feature. Follow-up action: the feature was extended into the next iteration to finish; buffer days were used. | Completed |
6 | 4 | 3 | 75% | The same test case feature was still unfinished and dragged on until iteration 7. Follow-up action: another developer was assigned to pair-program and speed up development of this feature. The team managed to finish it in iteration 7 with no buffer days spent. | Completed |
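The schedule metric score in the table above is the ratio of actual tasks completed to planned tasks for the iteration, reported as a percentage. A quick check of the reported figures:

```python
# Schedule metric score = actual tasks completed / planned tasks,
# as a rounded percentage. Values below are the iterations
# reported in the table above.

def schedule_metric(planned, actual):
    """Return the schedule metric score as a rounded percentage."""
    return round(actual / planned * 100)

for iteration, planned, actual in [(3, 4, 3), (5, 3, 2), (6, 4, 3)]:
    print(f"Iteration {iteration}: {schedule_metric(planned, actual)}%")
```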
Bug Metrics
Bug Count
Bug Score
Project Risks
Risk Type | Risk Event | Likelihood | Impact | Mitigation Strategy | Status |
---|---|---|---|---|---|
Sponsor Management | Sponsors may change functional requirements, which may affect our overall schedule. | Medium | High | Adjust our timeline and requirements accordingly; perform regression testing. | Ongoing |
Project Management | Students may report critical bugs, requiring the team to stop ongoing work and debug so that students can use the system smoothly. | Medium | High | Decide which part to focus on and debug first; ensure students' feedback is responded to within 24 hours. | Ongoing |
Technical | Team has no experience with the new technologies that will be implemented in our project. | High | Medium | Dedicate time and effort to research. | Ongoing |
Technical | System crashes, or files are not backed up to the latest version. | Low | High | Always keep code updated to the latest version both locally and in the repository. | Ongoing |
Project Management | Team needs to reshuffle project features based on sponsors' requests after the Acceptance Milestone. | High | High | Reschedule the project features and reallocate developers to each task. | Overcome |
Technical Complexity
Quality Of Product
Immediate Deliverables
Stage | Specification | Modules |
---|---|---|
Project Management | Meeting Minutes | Internal, Supervisor & Sponsor Meeting Minutes |
Project Schedule | Project Schedule | |
Metrics | Project Metrics | |
Risk Management | Risk Management | |
Change Management | Change Management | |
Requirements | Project Scope | Project Scope |
Analysis | Use Case | Use Case |
System Architecture | System Architecture | |
Technology Interactions | Technology Interactions | |
Operated Technologies | Operated Technologies | |
Design | Prototypes | Design & Prototypes |
Testing | User Test Plan & Results | User Test Plan & Results |
Deployment
Testing
# | Test | Level | Total Users | Objective |
---|---|---|---|---|
1 | User Test | Sponsor Level | 2 | |
2 | User Test | Sponsor Level | 2 | |
3 | User Test | Student Level | 14 | |
4 | Live Test | Student Level | 187 | |
5 | User Test | Student Level | 170 | |
For testing results, access here:
Reflection
Team Reflection
- Quality assurance is an essential aspect of the software development process.
- Readily available technical support and continuous system maintenance after 'go-live' are essential to enhancing the customer experience.
- Critical stakeholders' feedback is an effective way to improve system performance and usability.