IS480 Team wiki: 2013T2 Zora Final Wiki
Project Progress Summary
Deliverable | Link
---|---
Final Presentation Slides |
Deployed Site |
Item | Status
---|---
Completion of project | 100% (Core, Secondary, Good to Have)
Number of Iterations |
Number of Heuristic Evaluations | 1
Number of User Tests | 3
Deployment |
Changes since Mid-Term |
Milestones
Completed Milestones | Remaining Milestones
---|---
 |
Completed Functionalities
Project Scope & Progress
View detailed completed features

Core Modules | Secondary Modules | Good to Have Modules
---|---|---
 | |
Project Highlights
Issue Description | Consequences | Mitigation Plan
---|---|---
The client requested two additional features in the leave module and the claim module. | If not handled appropriately, this could cause a major schedule delay and prevent the project from being completed by the final presentation. |
Project Challenges:
Describe areas of the project that were particularly difficult and how they were dealt with, whether successfully or not. Again, a few sentences are enough. If there are no challenges, remove this section.
Project Achievements:
Methods, technologies, processes, teamwork, etc. that were particularly successful: highlight things that worked very well towards completing the project. A bulleted list of one or two sentences each will do. If there are no achievements, remove this section.
Project Management
Provide more details about the status, schedule and the scope of the project. Describe the complexity of the project.
Project Schedule (Plan vs Actual):
Compare the project plan at midterm with the actual work done at this point, and briefly summarize it here: everything went as planned, everything changed and the team is working on a new project with new sponsors, or the supervisor is missing. A good source for this section is the project weekly report.
Provide a comparison of the planned and actual schedule. Has the project scope expanded or reduced? You can use the table below or your own Gantt charts.
Iteration | Function | Planned | Actual | Comments
---|---|---|---|---
1 | Customer CRUD | 1 Sept 2010 | 25 Aug 2010 | Fiona took on the Sales CRUD as well.
1 | Trend Analytics | 1 Sept 2010 | 15 Sept 2010 | Ben was too busy and pushed iteration 1 back.
2 | User tutorial | 1 Oct 2010 | | Removed (proposed by Ben).
2 | Psycho analysis | 1 Oct 2010 | | New module proposed by sponsor.
Project Metrics:
Summary of the analysis for the metrics collected. You may refer to another page for details about the metrics and how they are collected.
Bug Metrics
Bug Score Tracking
Technical Complexity
Component | Details
---|---
Application |
Browser |
Database |
- One challenge we faced was integrating various libraries into our application and figuring out how to use them. Although documentation exists for each library, it is sometimes inadequate when we are debugging our application.
- The two libraries we have used so far are TCPDF and PHPass (see the sketch after this list).
- TCPDF is used to generate PDF documents such as claim slips and pay slips.
- PHPass is used for password salting and hashing.
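As an illustration, here is a minimal sketch of generating a claim slip with TCPDF; the employee details, claim values, and file name are hypothetical placeholders, not taken from our actual claim module:

```php
<?php
// Minimal TCPDF sketch for a claim slip PDF.
// The employee and claim values below are hypothetical placeholders.
require_once('tcpdf/tcpdf.php');

$pdf = new TCPDF();
$pdf->AddPage();

$pdf->SetFont('helvetica', 'B', 14);
$pdf->Cell(0, 10, 'Claim Slip', 0, 1, 'C');   // centred title

$pdf->SetFont('helvetica', '', 11);
$pdf->Cell(0, 8, 'Employee: Jane Tan', 0, 1);
$pdf->Cell(0, 8, 'Claim Type: Transport', 0, 1);
$pdf->Cell(0, 8, 'Amount: $42.50', 0, 1);

// 'D' prompts a browser download; 'I' would display the PDF inline instead.
$pdf->Output('claim_slip.pdf', 'D');
```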
k-nearest neighbour algorithm
We use the k-nearest neighbour (kNN) algorithm to classify and detect fraudulent claims. From past claims data, we built a training set of claims already labelled as fraudulent or legitimate. When a claim is submitted, our application calculates the distance from the new claim to every claim that is already classified, and the algorithm classifies the new claim based on its nearest k neighbours. For example, with k = 3, a new claim is classified as fraudulent when 2 of its 3 nearest neighbours are fraudulent.
We have used the following variables in calculating distance:
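Since the actual distance variables are not listed above, the following is a minimal sketch of the classification step, assuming a Euclidean distance over numeric claim features; the feature names and training data are hypothetical:

```php
<?php
// Minimal kNN sketch: classify a claim as 'fraud' or 'legit'.
// Features and training data below are hypothetical placeholders.

// Euclidean distance between two numeric feature vectors.
function distance(array $a, array $b): float {
    $sum = 0.0;
    foreach ($a as $i => $value) {
        $sum += ($value - $b[$i]) ** 2;
    }
    return sqrt($sum);
}

// Classify by majority vote among the k nearest labelled claims.
function classify(array $newClaim, array $trainingSet, int $k): string {
    $neighbours = [];
    foreach ($trainingSet as $claim) {
        $neighbours[] = [
            'label' => $claim['label'],
            'dist'  => distance($newClaim, $claim['features']),
        ];
    }
    usort($neighbours, fn($x, $y) => $x['dist'] <=> $y['dist']);

    // Tally the labels of the k nearest neighbours.
    $votes = [];
    foreach (array_slice($neighbours, 0, $k) as $n) {
        $votes[$n['label']] = ($votes[$n['label']] ?? 0) + 1;
    }
    arsort($votes);
    return array_key_first($votes);
}

// Example with two features: [claim amount, days since last claim].
$trainingSet = [
    ['features' => [500, 1], 'label' => 'fraud'],
    ['features' => [480, 2], 'label' => 'fraud'],
    ['features' => [40, 30], 'label' => 'legit'],
];
echo classify([470, 3], $trainingSet, 3);  // "fraud": 2 of 3 neighbours are fraudulent
```

In practice, the numeric features would typically be normalised first so that no single variable dominates the distance calculation.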
Quality of product
Design Considerations
- We have chosen to use Twitter Bootstrap for our user interface (UI) because users are familiar with its layout.
- Our team can avoid reinventing the wheel and focus on providing our users with a more intuitive interface instead.
- In the design of our UI, we use Nielsen's 10 usability heuristics as our guideline.
Nielsen's Heuristics | Screenshot
---|---
 |
Security Considerations
- Password salting and hashing
  - Using the PHPass library, each password is assigned a unique salt and hash (see the sketch after this list).
- Cross-Site Scripting Validation
  - Input is validated to prevent cross-site scripting (XSS).
- Dynamic Access Control
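A minimal sketch of the first two measures, assuming the PHPass PasswordHash class is available at a local path; the cost parameter, sample password, and input field name are illustrative:

```php
<?php
// Sketch of password hashing (PHPass) and XSS-safe output.
require_once('PasswordHash.php');  // PHPass library, assumed local path

// Password salting and hashing: cost 8, non-portable (bcrypt) hashes.
// Each generated hash embeds its own random salt.
$hasher = new PasswordHash(8, false);
$storedHash = $hasher->HashPassword('s3cret-passw0rd');

// On login, check the submitted password against the stored hash.
if ($hasher->CheckPassword('s3cret-passw0rd', $storedHash)) {
    // authenticated
}

// Cross-site scripting validation: escape user input before rendering it,
// so any injected <script> tag is output as harmless text.
$comment = htmlspecialchars($_POST['comment'] ?? '', ENT_QUOTES, 'UTF-8');
echo $comment;
```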
Project Deliverables:
Stage | Specification | Links
---|---|---
Project Management | Meeting Minutes | Meeting Minutes
Project Management | Metrics | Metrics
Requirement Gathering | Business Requirements |
Analysis | Use Case | Business Requirements
Analysis | System Sequence Diagram | System Architecture
Analysis | Business Process Diagram | Business Requirements
Analysis | UI Prototypes | UI Prototypes
Design | ER Diagram | System Architecture
Design | Class Diagram | System Architecture
Testing | Test Plans | Testing
Handover | Manuals | User Manuals
Handover | Deployment Diagram | System Architecture
Quality:
Explain the quality attributes (non-functional) of your project deliverables. Have you designed the architecture, used a design pattern, etc.? Does your architecture address scalability, performance, reliability, availability, fault tolerance, usability, etc.? Does your design address maintainability, flexibility, configurability, etc.? Be brief here, but you can link to diagrams or code detail pages. Do not repeat the technical complexity part; link to it if necessary.
Deployment:
In an iterative approach, a ready-to-use system should be available (deployed) for the client, with instructions to access the system described here (user name). If necessary, provide a deployment diagram link.
Testing
User Test 3
User Test 3 |
---|---
Details |
Objectives |
Scope of Test |
---|---
Claims |
Leave |
Remuneration |
Performance Management & Appraisal |
Training & Development |
System Administration |
Common Feedback (Before & After)

Module | Before | After
---|---|---
Claims | |
Leave | |
Training & Development | |
Quantitative Feedback: User Test 2 versus User Test 3

Measure | User Test 2 | User Test 3
---|---|---
Ease of Use | |
Overall Layout | |

Analysis for the same 5 participants from User Test 2
This is the analysis done from User Test 2:
Reasons for the time taken:
Reasons for changes in timing:
User Test 2 Analysis
Bug Log
Click here to view the Bug Log & Metrics file.
Reflections
Team Reflection
Individual Reflections
Fiona Woo | Lim Ken Khoon | Tan Yun Xi
---|---|---
 | |

Lee Yi Xian | Tan Mei Zhen | Darryl Leong
---|---|---
 | |
Sponsor Comment:
Sometimes the client writes a report giving feedback on the system; this sponsor report can be included or linked from here.