IS480 Team wiki: 2017T1 Team Atoms Final
Contents
Project Progress Summary
User: demo | Password: demopassword123
Project Management
Project Status
Project Schedule (Plan vs. Actual)
There were a few changes made to the project schedule and scope as the team decided to push themselves and add new refinements to the project. In addition, the sponsor requested additional technical documentation due to their intention to open-source the project and submit a software research publication around the application. The changes made did not affect the timely completion of the project. We also brought the project handover forward to give the sponsor more time to take over the project and to provide sufficient time for support.
Planned Project Schedule
Actual Project Schedule
Project Metrics
Change Management
Below is the change log for Iteration 11 to 14 (after Mid-Terms):
Iteration | Date | Type | Change Request | Rationale | Feasibility | Outcome | Priority | Status of Request | Issued By |
---|---|---|---|---|---|---|---|---|---|
11 | 10/10/2017 | Scope | Add interactive GIF tutorial (for first-time users) | Supervisor recommended including an interactive tutorial guide for first-time users to help them learn how to use the system more easily | Fits into schedule without any expected delay as a consequence | Accepted | Low | Closed | Supervisor |
12 | 16/10/2017 | Scope | Change dendrogram implementation for clustering (distance cut-off) | Sponsor requested a better way to view the dendrogram and to allow users to utilise it more effectively through a distance cut-off | Fits into schedule without any expected delay as a consequence | Accepted | High | Closed | Sponsor |
12 | 16/10/2017 | Scope | Add interactive association visualisation | Team decided to make the visualisation for the association link graph interactive, as it adds more value to the visualisation and the project | Fits into schedule without any expected delay as a consequence | Accepted | Low | Closed | Team |
12 | 18/10/2017 | Scope | Deployment documentation instructions | Team decided to create an additional set of deployment documentation to facilitate project handover | Fits into schedule without any expected delay as a consequence | Accepted | Low | Closed | Team |
12 | 20/10/2017 | Scope | Technical documentation of system | Sponsor requested technical documentation as they would like to open-source the project and submit it for a software publication | Fits into schedule without any expected delay as a consequence | Accepted | High | Closed | Sponsor |
13 | 30/10/2017 | Schedule | Change implementation for association rules | The Lab4 release had to be delayed by one week due to changes in the association computation/algorithm | Fits into schedule without any consequence | Accepted | High | Closed | Sponsor |
Project Risks
Existing & Potential Risk
Currently there are no outstanding risks; all identified risks and challenges have been addressed. However, in the period from Mid-Terms until Finals, we faced and resolved concerns arising from the Client Management Risk described below:
Risks & Challenges Faced
S/N | Risk Type | Risk Event | Likelihood | Impact | Category | Mitigation Strategy |
---|---|---|---|---|---|---|
1 | Client Management Risk | Team Atoms faces modifications that must be made within a short period of time before every lab release, as the team has to provide a highly customised lab session experience consistent with the existing lab exercises used in class. In addition, the team also has to provide a new, highly customised system user guide with every lab release. | High | High | A | The team will actively engage the sponsor as early as possible before each lab release to allow sufficient time for the required changes. In addition, the team will also dedicate time specifically to preparing for each lab release. |
Technical Complexity
System Architecture
Recap
A summary of the complexities previously achieved can be found here: https://wiki.smu.edu.sg/is480/IS480_Team_wiki%3A_2017T1_Team_Atoms_MidTerm#Technical_Complexity
S/N | Technical Complexity | Description |
---|---|---|
1 | Canvas Graph Traversing Algorithm | Team Atoms designed a graph traversing algorithm to handle all the possible combinations the user could draw in the canvas |
2 | Concurrency issue with Django + Matplotlib | Team Atoms handled concurrency issues causing visualization charts to overlap each other (which becomes unreadable) |
3 | Ensemble Algorithm | Team Atoms wrote our own Ensemble algorithm where we implemented our own Voting Classifier. |
The following are new technical challenges faced by the team from Mid-Terms till Finals:
Frontend
1. Interactive Association Graph
In order to provide better visualization when rule-based mining is applied, we used d3.js to render the association rules as an interactive graph instead of a static image. This enables users to move the nodes around to get a clearer view of the different rules generated. In addition, we were also able to achieve a high level of customization for the visualization by configuring the line colour, node colour and even node size based on each rule's confidence, lift and number of items.
As the team was new to the D3 visualization library, there was a steep learning curve: the library was challenging to understand and grasp effectively in a short period of time. In addition, changes also had to be made on the backend, where we had to modify the structure of the dataset to make it compatible with D3.
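The backend restructuring mentioned above can be sketched as follows. This is an illustrative example, not the actual KDD Labs code: the rule fields (antecedent, consequent, confidence, lift) are assumptions based on standard association-rule output, reshaped into the node/link dictionary that d3.js force-directed graphs conventionally consume.

```python
# Hypothetical sketch: convert association rules into the {"nodes": [...],
# "links": [...]} structure expected by a d3.js force-directed graph.
def rules_to_d3_graph(rules):
    nodes, links, index = [], [], {}

    def node_id(label, size):
        # Register each distinct itemset once and reuse its index.
        if label not in index:
            index[label] = len(nodes)
            nodes.append({"id": label, "size": size})
        return index[label]

    for rule in rules:
        src = node_id(rule["antecedent"], size=len(rule["antecedent"].split(",")))
        dst = node_id(rule["consequent"], size=1)
        links.append({
            "source": src,
            "target": dst,
            "confidence": rule["confidence"],  # can drive line colour on the front end
            "lift": rule["lift"],              # can drive node colour/size
        })
    return {"nodes": nodes, "links": links}

graph = rules_to_d3_graph([
    {"antecedent": "milk,bread", "consequent": "butter", "confidence": 0.8, "lift": 1.4},
    {"antecedent": "milk", "consequent": "bread", "confidence": 0.6, "lift": 1.1},
])
```

The JSON-serialisable output can then be passed straight to the d3 layout on the front end, which is what makes the backend change worthwhile.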
Backend
2. XGBoost Classification Implementation
What is XGBoost? XGBoost is short for eXtreme Gradient Boosting. It is a library designed and optimized for boosted tree algorithms. Its main goal is to push the extreme of machines' computation limits to provide a scalable, portable and accurate library for large-scale tree boosting. The term "Gradient Boosting" was proposed in the paper Greedy Function Approximation: A Gradient Boosting Machine by Friedman.
Implementation
We implemented the XGBoost Classification algorithm using the Python API (http://xgboost.readthedocs.io/en/latest/python/python_api.html) and plugged it into our functionalities. Since the library also provides a Scikit-Learn Wrapper, it should work natively with our features:
- Generate Accuracy Report
- Generate Graph (Using Graphviz)
- Generate Confusion Matrix
- Ensemble
- Predict
Complications
However, complications arose when we used a dataset with columns that contained spaces -- the Graphviz library did not handle spaces properly when performing text splitting, causing the plot to crash.
Outcome: rules were not split correctly, with strings broken up.
Fix Attempt #1: Initially, we wanted to override the library with a custom split function, but that could cause problems whenever a new version of the library is released. So instead we replaced the spaces in the feature names with underscores. Everything seemed to work fine until we connected XGBoost to an Ensemble, which failed because the other classifiers were trained with different feature names!
Fix Attempt #2:
Instead of changing the feature names permanently, we modified them only when they entered the graphing library and restored the originals (in the XGBoost model attribute) after the graph export was complete.
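A minimal sketch of this temporary-rename approach is shown below. The names and the stand-in booster object are illustrative, not the exact KDD Labs code: the idea is simply that underscore-safe names exist only for the duration of the Graphviz export, so the rest of the pipeline (e.g. the Ensemble) still sees the originals.

```python
# Illustrative sketch: swap feature names to underscore-safe versions only
# while exporting the graph, then restore the originals on the model.
from contextlib import contextmanager

@contextmanager
def graphviz_safe_names(model):
    original = list(model.feature_names)
    try:
        # Graphviz splits rule text on whitespace, so spaces break the plot.
        model.feature_names = [name.replace(" ", "_") for name in original]
        yield model
    finally:
        # Restore the original names so the other classifiers keep matching.
        model.feature_names = original

class FakeBooster:
    """Stand-in for the real XGBoost booster attribute (illustration only)."""
    feature_names = ["petal length", "petal width"]

booster = FakeBooster()
with graphviz_safe_names(booster) as safe:
    exported_names = list(safe.feature_names)  # plot_tree(...) would run here
```

Using a context manager guarantees the restore step runs even if the export raises, which is what makes the rename safe to combine with the Ensemble.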
Results:
Quality of Product
Recap
A summary of the measures previously put in place to ensure that we deliver a quality product:
https://wiki.smu.edu.sg/is480/IS480_Team_wiki%3A_2017T1_Team_Atoms_MidTerm#Quality_of_Product
S/N | Quality Measures | Description |
---|---|---|
1 | Deployment script | Manual deployment can lead to multiple human errors. Hence, we created a deployment shell script that partially automates the deployment of our web application. |
2 | Benchmarking for Visualization | Team Atoms measured how long visualization takes for datasets of different dimensions: we generated datasets with varying numbers of columns and rows and ran each charting function to measure how much time it took. |
3 | Secure API - System security | For each API request to modify files, there is an implementation to verify if the file belongs to the user before the operation. |
4 | Google Analytics tracking implementation | Team Atoms has also implemented Google Analytics tracking to understand how students are interacting with our KDD Labs website: where they are coming from, how often they visit, which parts of the site are capturing their attention and which parts are not sparking interest. |
5 | System logger | Our system consistently monitors and logs critical user actions and the problems users encounter. This helps us automatically track errors made in the system, which we use as internal feedback when users utilize the KDD Labs system. |
Extra Implementations
The following are new measures in place by the team from Mid-Terms till Finals to ensure that we deliver a quality product:
1. Application configuration file
We made use of ConfigParser, which allows us to read parameters from a configuration file into the Python application. The configuration file contains the parameters and initial settings required by the application. Using a configuration file to store these parameters has several benefits:
- The system administrator does not need to modify any source code
- Sensitive information is not exposed in the source code
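A minimal sketch of this pattern using Python's standard-library `configparser` is shown below. The section and key names (`database`, `secret_key`, etc.) are illustrative, not the actual KDD Labs configuration schema.

```python
# Minimal sketch of reading application settings with configparser.
# Section/key names are illustrative assumptions, not the real schema.
import configparser

SAMPLE = """
[database]
host = localhost
port = 5432

[app]
debug = false
secret_key = change-me
"""

config = configparser.ConfigParser()
config.read_string(SAMPLE)  # in production: config.read("settings.ini")

# Typed accessors avoid ad-hoc string conversion in application code.
db_host = config.get("database", "host")
db_port = config.getint("database", "port")
debug = config.getboolean("app", "debug")
```

Because the `.ini` file lives outside the repository, an administrator can change the database host or secret key without touching or redeploying the source code.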
2. Detailed Technical documentation (Handover)
In order to facilitate a proper handover and the successful continuity of the project, we had to ensure that the system is well documented. The documentation contains detailed information on the usage of KDDLabs, along with examples and illustrations. In addition, we have also created a comprehensive installation and deployment guide for future developers.
Documentation can be found here: https://jinyuan.github.io/KDDLab/
- For users to view and learn how each KDDLabs function works.
- For system admins to see how to install and deploy the KDDLabs project.
3. System security - For Data sharing function
Since the data sharing function allows users to send the shareable link outside the system, it is crucial to secure the data exposed in the link. We therefore use a hash-based function to protect the data embedded in the link, and we also perform a user accessibility check when the user clicks the link. Each shareable link expires 24 hours after it is generated.
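One way to obtain both properties described above (a tamper-proof payload plus a 24-hour expiry) is a signed, timestamped token. The sketch below is a hedged illustration using only the Python standard library; the real system's scheme, secret handling and token format are not documented here.

```python
# Illustrative sketch (not the production scheme): HMAC-sign the payload so
# the link cannot be tampered with, and embed an issue timestamp so the
# link expires 24 hours after generation.
import hashlib
import hmac
import time

SECRET = b"server-side-secret"  # illustrative; keep real secrets out of source code
TTL_SECONDS = 24 * 60 * 60      # 24-hour lifetime

def make_token(file_id, now=None):
    issued = int(now if now is not None else time.time())
    payload = f"{file_id}:{issued}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{sig}"

def verify_token(token, now=None):
    file_id, issued, sig = token.rsplit(":", 2)
    payload = f"{file_id}:{issued}"
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered link
    if (now if now is not None else time.time()) - int(issued) > TTL_SECONDS:
        return None  # expired: older than 24 hours
    return file_id

token = make_token("dataset-42", now=1_000_000)
```

The server-side accessibility check (does this file belong to a shareable set, is the requester allowed to view it) would run after the token verifies, so an unexpired link alone is never sufficient.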
Intermediate Deliverables
Topic of Interest | Link |
---|---|
Project Management | Project Schedule |
Minutes | |
Metrics | |
Risk Management | |
Change Management | |
Project Overview | Project Overview |
Team's Motivation | |
Project Scope | |
Project Documentation | Diagrams |
Technologies Implemented | |
Low & Mid Fidelity Prototypes | |
Testing | Testing Documentation |
Deployment
To view application, visit
Test server: http://kddlabs.com
Username: demo
Password: demopassword123
Note: This public server is currently being utilized for Live Usage by the IS424 Data Mining and Business Analytics students for their labs and project completion.
Production server: https://kddlabs.cn/
Testing
Internal Testing
We engage in comprehensive manual testing in every iteration. The developers conduct individual testing before committing their code to our shared GitHub repository. We believe in testing the application manually at this level because tests can be adjusted specifically to cater to changes in the application, on both the front end and the back end. Furthermore, manual testing brings in the human factor, allowing us to better discover problems that might surface during real usage due to natural human behaviour.
Once the developers have fixed the bugs, the fixed code is merged and integrated with the other functionalities. The integrated code is then deployed on the test server, and the lead quality assurance runs a final check against the set of test cases created. This helps ensure that the deployed application works with no major incidents.
The team's lead quality assurance then performs regression testing on the test server where previous functionalities developed are tested again. This helps to ensure that existing functionalities in the application are not affected by the integration. Once bugs have been identified, the lead quality assurance will then update the bug-tracking Excel sheet and notify the relevant developers of the issues and the corresponding priority level.
The team’s list of test cases can be found on our private repository here.
User Acceptance Test 1,2 & 3
Team Atoms conducted 3 user tests, which allowed us to better manage sponsor expectations as well as improve the usability of our application interface.
For a more detailed version of Team Atoms' user acceptance test results, access it here:
Live Usage
Through our live usage and rollout, IS424 students were able to complete their take-home lab assignments on our system, so we were able to gather feedback about the KDD Labs system directly from our end users. In addition, we were also able to compare the user experience with the existing alternative used in class (SAS EM). The feedback we received was positive: students were able to complete their in-class lab exercises, and they also found the KDD Labs system easier to use than SAS EM.
Lab1
Release Date: 01 Sep 2017, Friday
Duration: 2-3 hours per user
Number of Users: 45
Lab1 User Guide: Created instructions can be found here
Lab1 Feedback Results: Live user feedback can be found here
Lab2
Release Date: 15 Sep 2017, Friday
Duration: 2-3 hours per user
Number of Users: 45
Lab2 User Guide: Created instructions can be found here
Lab2 Feedback Results: Live user feedback can be found here
Lab3
Release Date: 16 Oct 2017, Monday
Duration: 2-3 hours per user
Number of Users: 45
Lab3 User Guide: Created instructions can be found here
Lab3 Feedback Results: Live user feedback can be found here
Lab4
Release Date: 6 Nov 2017, Monday
Duration: 2-3 hours per user
Number of Users: 45
Lab4 User Guide: Created instructions can be found here
Lab4 Feedback Results: Live user feedback can be found here
Project Handover
Handover Components
Team Atoms has already provided a comprehensive handover to the sponsor: the team gave a detailed walkthrough of each component along with proper handover documentation. Below are the documents that we handed over to our client:
Component | Description | Links |
---|---|---|
Source Code | The source code is in a GitHub repository and has already been shared with the sponsor. | |
Technical Documentation | This is comprehensive documentation of our system's usage and a developer guide for future continuity. | https://jinyuan.github.io/KDDLab/ |
Design Documents | These are the design documents of our existing system (database architecture, sequence diagrams, system architecture, system interaction, technologies used). | https://www.dropbox.com/sh/uleaa2j3d69r4nv/AACNr_m-Tn_zySqkzWzrhQcka?dl=0 |
Deployment Script | These are scripts that can be run for automated deployment, together with documentation of how to do so. | https://www.dropbox.com/sh/weyy8wvaoh2ml4x/AADvY74vzmtB3PVMvX7UzBSka?dl=0 |
Sample Configuration Files | These are samples of configuration files for system modification. | https://www.dropbox.com/sh/7f2r4qjjuq5sjn8/AAALB0XAAut74eEJ8oBgh0s8a?dl=0 |
Live Lab usage Feedback & Testing Results | These are feedback collected along the way at the end of every lab assignment and detailed results of our test findings. | https://www.dropbox.com/sh/wuuzs3sx6z5k9ai/AABoch0_5n3Tb2kZJqbociBTa?dl=0 |
Presentation Slides | These are slides for our acceptance, midterms and finals. | https://www.dropbox.com/sh/ikr48enwgs0oknl/AADISgdBMWg0Q_435tjixBHna?dl=0 |
Poster & Video | These are marketing materials for the project. | https://www.dropbox.com/sh/6dg1gw01scnwyv2/AADWJVMAGofBGaN368gvE0xRa?dl=0 |
Outstanding Issues | This is a compiled list of known issues, noting the existing system's limitations and possible future improvements. | https://www.dropbox.com/s/sapl462ofli49qv/KDDLabs%20Known%20Issues.docx?dl=0 |
Reflection
Team Reflection
This journey has proven to be an enriching learning experience for Team Atoms. The project had many new learning points for the team as it was highly technical: we had to understand and grasp the concepts of the data mining process and its algorithms within a short period of time. In addition, we also learnt the importance of good stakeholder management, which allowed us to react better to unforeseen circumstances. Through active team participation and communication, we were able to mitigate issues as they arose and deliver a quality project on time.
Sponsors' Testimonial
"Team ATOMS is a capable and sincere team, that has done very well in the course of the project. KDD Labs project is very challenging and in particular requires a very diverse set of technical skills. ATOMS have made substantial efforts in acquiring new skills and integrating them to deliver a quality product in a timely manner. They have stoically faced the technical issues and challenging feature/change requests, and demonstrated an excellent work ethic in delivering on their targets. Discussions with them have been thought provoking and rewarding, and have significantly contributed towards improving the product quality" - Sponsor, Doyen Sahoo
Individual Reflections