Logiciel Mid Term Wiki
- 1 Project Progress Summary
- 2 Project Management
- 3 Risk Management Plan & Risk Backlog
- 4 Development Overview & Technical Summary
- 5 Testing
- 6 Team Learning Outcomes
Project Progress Summary
View our Midterm Slides.
View our Demonstration Walkthrough.
View our Project Overview & Description.
- Currently at Sprint 6 (14 Feb 2013 to 21 Feb 2013). View our Sprint Progress here.
- Completed approximately 60% of our project.
- Usability Testing conducted in Sprint 5. View our UT Results here.
- Last Client Feedback Session held in Sprint 4 (25 Jan 2013). View the Meeting Minutes here.
- Upcoming Sprint 7 (21 Feb 2013 to 28 Feb 2013). Sprint 7 has not started; view our Product Backlog here.
Project Management
Overview of Project Schedule
Our project currently has 9 Sprints in total. Below is a comparison between our current schedule and the initial schedule we drafted at acceptance.
View our Product Backlog to see the Complete Breakdown for each sprint.
Below are all our Burndown Charts since inception. These Burndown Charts track how much work has been completed and the rate at which we complete our tasks during each sprint.
View our Logiciel Project Management Wikipage to see the breakdown of tasks for each sprint.
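The burndown calculation behind these charts can be sketched as follows. This is a minimal illustration, not our actual charting code; the sprint length, hour totals, and completion figures below are made-up examples.

```python
# Minimal burndown sketch: for each sprint day, compare the ideal linear
# burn against the actual remaining work. All figures are illustrative.
from datetime import date, timedelta

def burndown(total_hours, sprint_start, sprint_days, completed_by_day):
    """Return (ideal, actual) remaining-hours series for each sprint day.

    completed_by_day maps a date to the hours of work completed that day.
    """
    ideal, actual = [], []
    remaining = total_hours
    for day in range(sprint_days + 1):
        current = sprint_start + timedelta(days=day)
        # Ideal line: burns linearly from total_hours down to zero.
        ideal.append(total_hours * (1 - day / sprint_days))
        # Actual line: subtract whatever was completed on this day.
        remaining -= completed_by_day.get(current, 0)
        actual.append(remaining)
    return ideal, actual

start = date(2013, 2, 14)  # Sprint 6 start date, used as an example
done = {start + timedelta(days=1): 10, start + timedelta(days=3): 20}
ideal, actual = burndown(40, start, 7, done)
```

When the actual line sits above the ideal line, the sprint is behind schedule; below it, ahead of schedule.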
Overall Burndown Chart
Sprints Burndown Comparison
|Sprint 0||Sprint 1||Sprint 2||Sprint 3||Sprint 4||Sprint 5||Sprint 6|
Risk Management Plan & Risk Backlog
The table below showcases our outstanding Risks as of the current Sprint, Sprint 6.
View our Risk Backlog to see our mitigated Risks.
View our Risk Management Plan to find out how we identify Risks and formulate our action plan.
|Risk||Resolve-By Date||Risk Score||Action Plan||Contingency Plan||Current Status / Updates|
|Adoption resistance One risk our project faces is resistance to adopting our portal as a strategy-formulation tool, to be used concurrently with SalesForceDotCom (SFDC). As the use of SFDC is entrenched in our client's business processes, and managers and senior management are used to their usual modes of reporting and gathering data, the challenge will be to integrate our portal as seamlessly as possible||26/4/2013||8||Dedicate a portion of the UAT to finding out our client's preferences and assessing the tweaks we should implement to aid adoption||Hold training and briefing sessions, and provide demonstrations of the portal, so that the main user groups become familiar with it. This will be catered for when we hand over the project; should not all groups be available, we will conduct the briefing/handover with the IT team at TR.||NIL|
|Graphs Rendering Response time for graphs may be unacceptable to the client. Graphs usually render in under 1s, but can occasionally take up to 3s||14/3/2013||6||Implement caching for graphs: a separate table stores requests for JSON data, so that identical requests read from the same cached data||If we are unable to implement caching, we will negotiate the quality attributes with the client. If the change does not produce a significant improvement, we will add logging to locate the source of the delay||NIL|
|Sentiment Analysis The Maui library may not be able to perform effective topic extraction and polarity analysis on answers||21/2/2013||4||Perform thorough research on its functions to become more familiar with the technology||Look for other implementations of KEA, or consider using vanilla KEA to perform the task. Also consider dropping polarity analysis of topics||Research in progress|
|Search Engine Apache Solr may not support searching multiple document types on a single page||21/2/2013||4||Perform thorough research on its functions to become more familiar with the technology||Stick with a single document type for Solr||Research in progress|
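The graph-caching mitigation in the table above (storing JSON graph data keyed by request, so identical requests are served from the same data) can be sketched roughly as below. Function and parameter names are illustrative, and the real portal would back the cache with a database table rather than an in-memory dict.

```python
# Sketch of request-keyed caching for JSON graph data (illustrative only).
import json

_graph_cache = {}  # stand-in for the separate cache table

def get_graph_json(params, compute):
    """Return JSON graph data for `params`, computing it only on a cache miss.

    `params` is a dict of request parameters; `compute` is the expensive
    function that builds the graph data from those parameters.
    """
    # Sorting keys gives identical requests the same cache key even when
    # their parameters arrive in a different order.
    key = json.dumps(params, sort_keys=True)
    if key not in _graph_cache:
        _graph_cache[key] = json.dumps(compute(params))
    return _graph_cache[key]

calls = []
def slow_compute(params):
    calls.append(params)  # track how often the expensive path actually runs
    return {"series": [1, 2, 3], "region": params["region"]}

a = get_graph_json({"region": "APAC", "year": 2013}, slow_compute)
b = get_graph_json({"year": 2013, "region": 2013 and "APAC"}, slow_compute)
```

A real implementation would also need an invalidation policy, since graph data goes stale as new SFDC records arrive.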
View our Bug Log and Bug Metrics
Development Overview & Technical Summary
Read more about our Client Side, Server-Side, Development Patterns and Architecture Diagram on our Logiciel Technology page.
Find out about the technologies we use: what we are trying to achieve, why we chose them, how we use them, the actual development work done with them, how we felt using them, and what we learnt. Read all about it in story form on our Technical Blog.
Testing
Our most recent Usability Test was conducted during Sprint 5 (30 Jan 2013 to 1 Feb 2013).
Qualitative & Quantitative Test Findings
|Qualitative UT Findings||Quantitative UT Findings|
Upcoming Client UAT
We have two upcoming User Acceptance Tests (UATs). For more details on our testing schedule, view our team's Logiciel Testing Schedule.
- UAT Phase 1 (27 Feb 13 or 6 Mar 13)
- UAT Phase 2 (14 Mar 13 or 21 Mar 13)
User Acceptance Test Scope (Phase 1 Only)