IS480 Team wiki:2017T1 Ducky King Finals

From IS480
Revision as of 03:46, 21 November 2017 by Chiahui.mah.2015 (talk | contribs)

DuckyKing Finals Header.png

Project Progress Summary

DuckyKing Project Progress Finals.png

DuckyKing Total Meetings.png


Project Highlights

DuckyKing Project Highlights Finals.png


DuckyKing Finals X-factor.png

Overall Project Scope


DuckyKing Finals Scope.png

Project Management

Project Schedule

*Changes to our schedule have been denoted in red

DuckyKing Finals Schedule.png

Project Metrics

DuckyKing Finals ProjectOverview.JPG

Development of all functionalities has been completed, so all sprint points for the final sprint, Sprint 14, have been cleared. What remains is the official handover and documentation of our solution.

Release Burndown Chart


The release burndown chart shows how work was burned down over the entire project. However, because of continuous feature requests, additional sprint points were repeatedly added to the product backlog and the sprints.

Had the features been defined earlier and more clearly, the burndown chart would instead look like this:

DuckyKing ReleaseBurndownNew.JPG

Sprint Velocity Chart


Scrum Sprint Backlogs Chart

As mentioned in the earlier section, all functions have been completed. Sprint 14 still exists because of the uncertainty of the milestone dates (Finals) and to set aside time for documentation and handover.

DuckyKing Sprint14.JPG

Bus Factor

A project’s bus factor (or truck factor) is the number of team members who, if run over by a bus, would put the project in jeopardy. The smallest bus factor is 1, and larger numbers are preferable. To increase our project’s bus factor, our team has done its best to maintain collective code ownership and ensure frequent communication among team members. Our final bus factor is 4.


Production Ready Metrics

The Production Ready Metrics are updated after every sprint and also guide the team in sprint planning.

Production Ready Metrics.png

Bug Metrics

Test Lifecycle

Our team came up with the following testing lifecycle for our internal testing process. More details can be found on our internal testing wiki page here.

DK Test Lifecycle.png

Internal Testing

DuckyKing Bug Table.png

Ducky king Updated bugs found.png DuckyKing Bug Score.png Ducky king Updated total bugs.png

The Bugs Found chart shows how many bugs were found in each sprint, the Bug Score chart shows the overall bug score for each sprint, and the Total Test Cases chart shows the number of test cases created for testing. Testing is separated into two kinds: automated and manual. Automated testing is done through testing libraries such as Mocha, Chai and Truffle.

The team aims to resolve bugs before the sprint ends and carry as few bugs as possible into the next sprint. If there are unresolved bugs, the team assesses their severity and resolves the more severe ones first, following the mitigation plan in place. If the bug score is 10 or below, the bugs are fixed during buffer time. If the score is more than 10 and less than 20, the team uses the planned debugging time in the sprint to resolve them. If the bug score is higher than 20, the scrum master allocates additional manpower to resolve the bugs immediately and reschedules the project if necessary.
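The bug-score thresholds above can be sketched as a small triage function. This is an illustrative sketch only: the function name and the returned action labels are hypothetical, not the team's actual tooling, and the original rules do not specify a score of exactly 20, which is treated here as the severe case.

```javascript
// Sketch of the bug-triage rules from the mitigation plan.
// Thresholds (10 and 20) come from the plan; names are illustrative.
// A score of exactly 20 is unspecified in the plan; we treat it as severe.
function triageAction(bugScore) {
  if (bugScore <= 10) {
    return "fix during buffer time";
  } else if (bugScore < 20) {
    return "use planned debugging time in the sprint";
  } else {
    return "allocate additional manpower immediately";
  }
}

console.log(triageAction(8));
console.log(triageAction(15));
console.log(triageAction(25));
```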

Automated Testing

Our team firmly believes that automated testing is essential in development, especially for new technologies such as ours. As such, we set up two automated test suites: one for the Solidity contracts and one for the middleware. For the middleware, we use Mocha, a JavaScript test framework, together with the Chai assertion library. For the Solidity contracts, we use Truffle's unit testing framework. The automated test cases allow us to quickly pinpoint where errors occur during development. Below is an example of how Mocha and Chai testing is done on the middleware.

Mochachaitesting1.PNG Mochachaitesting2.PNG
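A runnable sketch in the same spirit as the screenshots above: to stay self-contained it uses Node's built-in node:test runner and assert module rather than Mocha/Chai, and the middleware function under test (validateTransfer) is a hypothetical stand-in, since the project's actual middleware code is not shown here.

```javascript
// Unit-test sketch in the Mocha "test + assertion" style, using only
// Node built-ins (node:test, node:assert) so it runs with plain `node`.
// validateTransfer is an illustrative stand-in for a middleware function.
const test = require('node:test');
const assert = require('node:assert');

// Stand-in middleware validation: checks a transfer payload's shape.
function validateTransfer(payload) {
  if (!payload || typeof payload.from !== 'string' || typeof payload.to !== 'string') {
    return { ok: false, error: 'missing sender or recipient' };
  }
  if (!Number.isInteger(payload.amount) || payload.amount <= 0) {
    return { ok: false, error: 'amount must be a positive integer' };
  }
  return { ok: true };
}

test('accepts a well-formed transfer', () => {
  assert.deepStrictEqual(
    validateTransfer({ from: 'nodeA', to: 'nodeB', amount: 10 }),
    { ok: true }
  );
});

test('rejects a non-positive amount', () => {
  assert.strictEqual(
    validateTransfer({ from: 'nodeA', to: 'nodeB', amount: 0 }).ok,
    false
  );
});
```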

Project Risks

DuckyKing Proj Risks.png

Technical Complexity

1. Deployment Architecture
The following shows the FlowLabs deployment architecture. We set up multiple servers to simulate how different Flow Nodes interact with each other.
Ducky King Deployment Architecture.PNG

2. Job Queue Mechanism
Ducky King Job Queue.PNG

3. Smart Contracts
Ducky King Smart contracts.PNG

Ducky King Technical Workaround.PNG

4. Solidity
Ducky King Solidity.PNG

5. Encryption & Privacy
Ducky King Encryption.PNG

6. Flow Admin
Ducky King System health.PNG

Ducky King Notification.PNG

Ducky King Peer managment.PNG

7. Logging
Ducky King Kibana.PNG

Ducky King Logging.PNG

8. UAT
To ensure the quality of our developed products, we completed three rounds of User Acceptance Tests, as shown below:
The UAT details for FlowLabs Middleware can be found here
The UAT details for FlowAdmin Dashboard can be found here

Project Deliverables

Project Management: Project Schedule, Meeting Minutes, Metrics, Risk Management, Change Management
Requirements: Project Overview, Project Scope
Analysis: Personas and Scenarios, Diagrams
Project Implementation: Technology
Testing (Documents and Results): Internal Testing, FlowLabs Middleware User Testing, FlowAdmin User Testing, UAT 1 (Middleware & FlowAdmin), UAT 2 (Middleware & FlowAdmin), UAT 3 (Middleware & FlowAdmin)
Handover Documentation: FlowLabs Handover Documentation


Sponsor's Feedback

Value to sponsor DuckyKing.png

Team Reflections

TeamDuckyKing Photo.png

DuckyKing Finals Reflections.png