Logiciel Finals Wiki

From IS480

Latest revision as of 12:29, 18 April 2013

Return to Main

Project Overview

View our Finals Presentation Slides (https://www.dropbox.com/s/98y2cp1k7yishoo/Team%20Logiciel%20IS480%20Final%20Presentation.pptx)

View our Deployed Site (http://202.161.45.127/EagleEye/)

View our EagleEye Overview & Description.

Project Status

Project Management

Overview of Project Schedule

Our project has 9 Sprints in total. Below is a comparison between our Final-Term schedule and our Mid-Term schedule after the Mid-Terms Presentation.

Logiciel Timeline Comparion Final.png
Sprint 7

What happened?
  • Schedule metric fell below the threshold

How did we respond?
  • Reviewed the schedule according to the action plan
  • Dropped individual configurable graphs in favor of configurable general filters
  • Extended Sprint 7 to the end of UT1

Main functionality developed:
  • Workflow Items
  • Graph Data Cache

Sprint 8

What happened?
  • Pushed back due to the extension of Sprint 7
  • Needed to cater scope for UT1 changes
  • UT2 was delayed due to a late start and end

How did we respond?
  • Planned an additional week after estimating the scope from UT1, using the buffer catered for in Sprint 9

Main functionality developed:
  • Custom Reports
  • UT 1 Changes
  • Client User Testing Phase 2

Sprint 9

What happened?
  • Reduced length due to changes in Sprints 7 & 8 (Sprint 9 was designed as buffer)
  • Included UT2 changes in scope

Development completed:
  • UT 2 Changes

View our Product Backlog to see the overall breakdown for each Sprint, and view the individual Sprint Burndown Tracking for all 9 Sprints below.

Burndown Spreadsheets
Sprint 0 Sprint 1 Sprint 2 Sprint 3 Sprint 4 Sprint 5 Sprint 6 Sprint 7 Sprint 8 Sprint 9

Schedule/Burndown Charts

Below are our Burndown Charts for the last 3 Sprints. These charts track how much work has been completed and the rate at which we completed our tasks during each Sprint.

View our Logiciel Project Management wiki page to see the full breakdown of tasks for each Sprint.

Overall Burndown Chart

Logiciel Overall Burndown Finals.png

Sprints Burndown Comparison

Logiciel Project Progress.png Logiciel Burndown789.jpg

Risk Management Plan & Risk Backlog

View our Risk Backlog to see our mitigated Risks.

View our Risk Management Plan to find out how we identify Risks and formulate our action plan.

The table below highlights some of our more prominent risks and how we mitigated them.

Risk Issue | Resolved Date | Risk Score | Action Taken
Client scoping | 31/1/2013 | 9 | Confirm all deliverables by the 'resolve-by' date with the client, by proposing functions and confirming business value.
Specific graph descriptions, data fields and purpose have not been determined. This may cause scope to change drastically if defined only late in the project. | 13/12/2012 | 9 | Risk can be avoided by liaising with the sponsor to determine the exact charts.
D3.js: This technology is mainly used for our visual analytics, and our team has no prior experience with this tool. Learning time may be high and the technology might not produce the results we want. | 25/10/2012 | 6 | The team will find examples of applications of D3.js and study the JavaScript library to assess feasibility. We will also hold lessons for members to understand the code.
Export Graphs: Several technologies cannot work with our application to export PNGs of our graphs. Several new technologies have been found which might work, but each takes quite a long time to learn and test. | 5/2/2013 | 6 | Resolve the export function by the resolve-by date, or else consider dropping the functionality.
Workflow: Interview forms are likely to take longer than expected due to unforeseen complexities and interaction between different roles. | 28/2/2013 | 6 | Increase the estimated points for this task. Will probably have to drop certain upcoming tasks.
Search Engine: Apache Solr may not be able to implement multiple document types on a single page. | 21/2/2013 | 4 | Perform thorough research on the functions to become more familiar with the technology.

Bug Tracking

Below are the Bug Tracking Graphs for the entire project. View our Bug Log and Bug Metrics

Logiciel Bug Breakdown.pngLogiciel Bug Points.png

  • As can be seen, there is a large spike in bugs in Sprint 8. Sprint 8 marks the completion of the workflow, role management, and custom report functions. As these functions entail an integrated use of code, we were bound to uncover bugs that were not covered in previous test cases. The complexity of these functions also created bugs of its own.
  • Although there were few functions to develop and test in Sprint 9, we discovered more bugs as we wrote more integrative and thorough test cases for existing functions.

Development Overview & Technical Summary

Technical Complexities

Read more about our Client Side, Server-Side, Development Patterns and Architecture Diagram on our Logiciel Technology page.

Technical Complexities Description What is Complex?
Solr.png Apache Solr is the popular, blazing-fast open source enterprise search platform from the Apache Lucene™ project. Its major features include powerful full-text search, hit highlighting, faceted search, near real-time indexing, dynamic clustering, database integration, and rich document (e.g., Word, PDF) handling. We leverage this existing library to power our search engine. Configuration
  • Because Solr indexes documents by default, and uses a separate folder to pick up new configuration files, we spent a good amount of time configuring it to import data from a database via a SQL query, defining the right data types for full-text queries, and configuring both Solr and Tomcat settings for it to work on a Tomcat server.
  • The query for the Deal object can be seen here.

Multi-core Solr implementation

  • In order to deliver multiple result types, we had to configure multiple 'cores' on Solr, each of which has its own URL endpoint for queries, a different object type, and a different SQL query structure, due to the way the different tables are crafted in the relational database.
  • The query for a Location object, which is different from that of a Deal object, can be seen here.

What did we do?

  • Other than downloading and pasting Solr into the Tomcat webapps directory, we had to:
    • Change the Tomcat init settings to set up Solr
    • Configure Solr to use a JDBC connection as a data source rather than the default document types
    • Create 6 separate nested SQL queries as a means to prepare data for indexing for the different cores
    • Research different query parsers and parameters, and choose a parser that can handle natural language
    • Break down user input and construct a Solr query with the appropriate parameters
      • Advanced search builds a complex query based on the type of field requirements (range searches, boosting; now deprecated).
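As a concrete illustration of the last two steps, here is a minimal sketch of turning user input into a core-specific Solr query URL. The class, endpoint layout, and field list are our own illustrative assumptions, not the actual EagleEye code; defType, qf, and wt are standard Solr request parameters.

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class SolrQueryBuilder {

    // Build the request path for one Solr core, using the edismax parser
    // so natural-language input is tolerated. Field names are illustrative.
    public static String build(String userInput, String core) {
        try {
            return "/solr/" + core + "/select"
                    + "?q=" + URLEncoder.encode(userInput, "UTF-8")
                    + "&defType=edismax"      // natural-language-friendly parser
                    + "&qf=name+description"  // fields to search (assumption)
                    + "&wt=json";             // response format for the client
        } catch (UnsupportedEncodingException e) {
            throw new IllegalStateException(e); // UTF-8 is always available
        }
    }

    public static void main(String[] args) {
        // Each core (deal, location, ...) gets its own endpoint.
        System.out.println(build("energy deals in asia", "deal"));
    }
}
```

In the multi-core setup described above, the same builder would simply be called with a different core name per result type.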
Logiciel Specialization List.png This Specialisation Algorithm is required for all Drill-down pages. It ranks items such as Products, Deal Owners, Factors & Sales Teams so that the Sales Manager knows how well an item is performing with respect to that Drill-down page. Example: a list of Deal Owners who are "specialised" at selling a particular Thomson Reuters product. The ranking is based on more than one determinant; it is, in fact, an aggregation of 3 factors: Winning Rate (%), Total Number of Deals Won (#), and Total Value ($). For more details, see here.
  • The same algorithm is used for more than one entity type, so more thought needed to be put into the code design.

Code Design

  • In order to perform the calculations, we made use of a HashMap to identify unique entries and count them, similar to a map process in a MapReduce algorithm. Thereafter, we compute the results by reducing the HashMap by key in a separate method.
  • Then, in order to support multiple entity types, we abstracted each entity to the Object class in the method.
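The map-then-reduce idea above can be sketched as follows. The record shape and the equal-weight scoring are illustrative assumptions (the real algorithm weights the three factors its own way); the key point is the HashMap "map" pass and the per-key "reduce" pass, with entities abstracted to Object so any entity type works.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SpecialisationRanker {

    // One raw record: the entity (deal owner, product, ...), whether the
    // deal was won, and its value. Keyed on Object so any entity type works.
    static class Deal {
        final Object entity; final boolean won; final double value;
        Deal(Object entity, boolean won, double value) {
            this.entity = entity; this.won = won; this.value = value;
        }
    }

    static class Stats { int total; int won; double wonValue; }

    // "Map" phase: bucket raw deals by entity into a HashMap.
    static Map<Object, Stats> map(List<Deal> deals) {
        Map<Object, Stats> buckets = new HashMap<>();
        for (Deal d : deals) {
            Stats s = buckets.computeIfAbsent(d.entity, k -> new Stats());
            s.total++;
            if (d.won) { s.won++; s.wonValue += d.value; }
        }
        return buckets;
    }

    // "Reduce" phase: collapse each bucket into a single score aggregating
    // winning rate (%), deals won (#), and total value won ($).
    static Map<Object, Double> reduce(Map<Object, Stats> buckets) {
        Map<Object, Double> scores = new HashMap<>();
        for (Map.Entry<Object, Stats> e : buckets.entrySet()) {
            Stats s = e.getValue();
            double winRate = s.total == 0 ? 0 : (double) s.won / s.total;
            // Illustrative equal-weight aggregation of the three factors.
            scores.put(e.getKey(), winRate + s.won + s.wonValue / 1000.0);
        }
        return scores;
    }

    static Map<Object, Double> rank(List<Deal> deals) {
        return reduce(map(deals));
    }
}
```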
Logiciel Topological Map.png Renders a topological world map that indicates the Strengths and Weaknesses of a particular entity. Sales Managers are able to click and zoom in on the regions they are interested in to find out more details.

Preparing data

  • The Earth shape data we managed to find was in TopoJSON format, which was ready for use by D3.js. However, the shape data was not assigned to countries.
  • What we did was parse the TopoJSON into a Java object, add in the country name and the calculated score for each country, then pack it back into JSON format for the client.
  • In addition, we needed the countries to be grouped into Thomson Reuters' defined business regions as per the database, so we added that data into the JSON file as well.
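A minimal sketch of this enrichment step, assuming the TopoJSON has already been parsed into Java maps with a JSON library (any one works); the property names and method shape are illustrative, not our actual classes.

```java
import java.util.HashMap;
import java.util.Map;

public class TopoJsonEnricher {

    // Attach the country name, computed score, and Thomson Reuters business
    // region to one country geometry before it is serialised back to JSON.
    static Map<String, Object> enrich(Map<String, Object> geometry,
                                      String countryName,
                                      double score,
                                      String businessRegion) {
        Map<String, Object> props = new HashMap<>();
        props.put("name", countryName);
        props.put("score", score);           // drives the map colouring
        props.put("region", businessRegion); // region grouping from the database
        geometry.put("properties", props);
        return geometry;
    }
}
```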

D3.js

  • On the client, we used an SVG projection method defined by D3.js to project the 3D globe shape data onto a 2D canvas.
  • In order to customize the zoom to a region, we had to modify the zooming mechanism and dynamically calculate the bounds of the region in JavaScript.

Other Challenges

Challenges Description Why is it Challenging?
Maui.png

Maui automatically identifies main topics in text documents. Depending on the task, topics are tags, keywords, key-phrases, vocabulary terms, descriptors, index terms or titles of Wikipedia articles. It can also be used for terminology extraction and semi-automatic topic indexing.


Project Progress - Maui Key Generation.png

Finding the right tool

  • The original intention for keyword extraction was to associate keywords extracted from customer’s feedback (this could be text from open-ended questions in the interview form or any customer comments given and noted/tied to the deal). The keywords would help to summarize the main takeaways from the feedback.

Finding the right training data

  • In order to perform Term Assignment/Keyword Extraction on a document text, we needed to customize domain-specific learning data for Thomson Reuters. Though some training data came along with the MAUI Indexer library, it was insufficient. The main topic for the text we would be using falls under "feedback and comments"; however, there were no data sets available for this topic, although there were for medical terms, agriculture, and the like.
  • After delving into research and sourcing credible references, we decided to employ the SemEval-2010 Keyphrase Extraction Track data (Maui format – about 140+ .key & .text pairs), as it was identified as suitable for keyphrase extraction (http://en.wikipedia.org/wiki/SemEval). SemEval data sets are derived from real humans performing topic extraction and similar exercises on provided data. While we were able to generate a keyphrase from a sample text, the process took too long for a single keyword extraction task (4 to 5 minutes) as the data was too big. Concurrently, we also prepared a set of training data to experiment with the machine learning capability that MAUI supposedly encompasses.

Refining the results

  • In addition, we needed to ensure that stemming and stopword filtering were performed on the text. Stemming identifies the root of an extracted keyword (e.g. running -> run), while stopword filtering eliminates common words (such as: the, is, are) that appear often but have no real significance to the main focus of the text. Though we were able to generate keyphrases from a document text, we were unable to successfully implement the stemming and stopword filtering.
  • Having spent an extended amount of time on this functionality with little progress after hitting a wall, the team decided to drop this functionality in favor of post UT 2 enhancements and other fixes which were more crucial to the core functions of the application.
  • Though we proposed this initially as a value-add to the application, and there would have been real benefit to the client, it was more a long term consideration for our client which was not a pressing need, but an area that could be explored. Thus, we shifted our focus to addressing post UT 2 enhancements, bugs and fixes.
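To make the two refinement steps concrete, here is a toy sketch of stopword filtering and suffix stemming. This is not the MAUI pipeline, and a real system would use a proper stemmer such as Porter's; the word lists and suffix rules are illustrative.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class KeywordRefiner {

    // Tiny illustrative stopword list; real lists are much longer.
    static final Set<String> STOPWORDS =
            new HashSet<>(Arrays.asList("the", "is", "are", "a", "an", "of"));

    // Stopword filtering: drop common words with no topical significance.
    static List<String> removeStopwords(List<String> tokens) {
        List<String> kept = new ArrayList<>();
        for (String t : tokens)
            if (!STOPWORDS.contains(t.toLowerCase())) kept.add(t);
        return kept;
    }

    // Crude suffix stemming (e.g. "running" -> "run"); a real stemmer
    // handles far more cases and exceptions.
    static String stem(String word) {
        if (word.endsWith("ning")) return word.substring(0, word.length() - 4);
        if (word.endsWith("ing"))  return word.substring(0, word.length() - 3);
        if (word.endsWith("s"))    return word.substring(0, word.length() - 1);
        return word;
    }
}
```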
Logiciel Workflow.png
The workflow tool is used to facilitate the capturing of a deal by keying in responses from both the sales representative and the customer, who is represented by the interviewer. It allows a flow where one party completes their side of the form, which then generates a task for the other party to complete the process. Additionally, users are able to save drafts and view completed forms.

Scope definition

  • Since the workflow is a new initiative proposed by the client, our team had to go through several rounds of discussion and changes before formulating the final workflow.
  • Throughout the process of discussions and testing, various amendments had to be made to the permissions each role holds, how the form should look, how the work should flow, and so on.

Implementation

  • The team spent a considerable amount of time planning with the swim-lane model and deriving the database model to make sure that the workflow is in the correct order.

Enabling Flexibility

  • Because this project supports a new business process, the client would prefer not to be pigeon-holed into a single fixed process that requires many members. To enable the client to modify the process and potentially remove the 'busy' Sales Representative from the direct process, we brainstormed ways for the interviewer to pick up that component, letting interviewers alone conduct interviews on both the sales and customer ends.
Logiciel Custom Report.png To provide flexibility and improve the relevance of analytical information presented to the Sales Manager, we implemented Custom Reports. The user may choose the subject of focus: a DealOwner, a Product, a Region, or even a Factor. The user then chooses the filters as provided on the dashboard, and what kind of content to view, and the report is generated.

Usability

  • Effort was put into preventing the user from making errors or selecting irrelevant parameters on the report creation page. The page may be daunting to unfamiliar users, and we did not want to deter them from using the function. (e.g. subject types without fact sheets have the fact sheet option disabled)
  • To ease the user's learning curve, we only revealed input parameters as and when they were needed, and hid parameters to focus the user's attention on missing data when appropriate (e.g. changing the subject type requires the user to select a subject before filling out the filters)

Consolidating Data

  • The nature of the function (which makes use of all previously provided data types, less search) tests the reusability of the code that was previously developed to suit static pages. It prompted us to redesign a few classes to be more dynamic.

Dynamic Presentation of Data

  • Different element types take up different amounts of space on the page; for example, a graph may be twice the height of a specialist table. As such, we wrote our own CustomLayout class that uses knowledge of Bootstrap's grid to arrange elements in the most aesthetically appropriate manner possible.
  • When both Strength and Weakness for a subject type are chosen, the page also automatically combines them into a single side-by-side table for easier comparison.
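The row-packing idea behind such a layout class can be sketched like this, assuming each element declares a width in Bootstrap's 12-column grid. The class and widths are an illustration of the technique, not our actual CustomLayout implementation.

```java
import java.util.ArrayList;
import java.util.List;

public class CustomLayout {

    // Pack elements (given as Bootstrap column spans, 1..12) left-to-right
    // into rows of at most 12 columns; a span that would overflow the
    // current row starts a new one.
    static List<List<Integer>> pack(List<Integer> spans) {
        List<List<Integer>> rows = new ArrayList<>();
        List<Integer> row = new ArrayList<>();
        int used = 0;
        for (int span : spans) {
            if (used + span > 12) { // row full: start a new one
                rows.add(row);
                row = new ArrayList<>();
                used = 0;
            }
            row.add(span);
            used += span;
        }
        if (!row.isEmpty()) rows.add(row);
        return rows;
    }
}
```

A taller element (e.g. a graph) would simply declare a wider span so it gets a row of its own.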

Quality of Product

Project Deliverables

Stage | Specification | Modules
Project Management | Minutes | Sprint Minutes, Weekly Supervisor Meetings, Client Meetings
Project Management | Metrics | Risk Backlog, Risk Management Plan, Bug Metric
Requirements | Schedule | Project Timeline
Analysis | Use Case | Use Case, Use Case Scenario
Analysis | Architecture Diagram | Architecture Diagram
Analysis | Stakeholders | Stakeholder Information, Internal Stakeholder
Design | Coding Considerations | Logiciel Technical Blog, Technologies
Testing | User Tests | UT 1 Plan Details, UT 1 Survey, UT 1 Results & Findings, UT 1 Tools Use & Justification, UT 2 Plan Details, UT 2 Survey, Test Schedule, UT 2 Other Documents, Test Cases, Usability Test, User Test Findings, Client User Testings
Handover | Manuals | User tutorial, Developer manual, Setup manual
Handover | Code | Client, server
Handover | Deployment | (On-Site)

Quality

Design Patterns

  • Much thought was put into design patterns, method naming, and class design to make the code and service infrastructure reusable. Notably, within one sprint we developed Custom Reports that made use of almost all data types, thanks to the design of the classes. To see more details and examples, follow the link below to our patterns section.
  • Link to patterns

Fault Tolerance

  • To handle any server faults, users are directed to a user-friendly error page that abstracts the problem away from the user and allows them to re-login. An option is available on the page to view more details about the error when necessary.
  • When creating a new deal, if the deal already exists (detected via an AJAX request), a cue is provided so that the user knows to use another ID.
  • More is covered under usability.

Performance Considerations

  • To improve the loading times of graphs, we used a post-processing caching mechanism to cache all identical requests for graph data. The cache lasts for a day, or until the user 'force' updates the graphs. This improved loading times tenfold, from an average of 659ms to 66ms.
  • Care is taken when performing calculations in the back-end to use as few iterations as possible to accomplish a task.
  • Graph data is calculated on separate threads on the server while loading, to improve loading times.
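A minimal sketch of such a time-to-live cache: identical requests are served from memory until the entry is a day old or the user forces a refresh. Class, key, and method names are illustrative, not the actual EagleEye code; the timestamp is passed in explicitly so the expiry logic is easy to see (and test).

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class GraphDataCache {

    private static final long TTL_MS = 24L * 60 * 60 * 1000; // one day

    private static class Entry {
        final String data; final long createdAt;
        Entry(String data, long createdAt) { this.data = data; this.createdAt = createdAt; }
    }

    private final Map<String, Entry> cache = new ConcurrentHashMap<>();

    // Returns cached graph data, or null when absent, expired, or the user
    // force-refreshed; a null tells the caller to recompute and put().
    public String get(String requestKey, boolean forceRefresh, long now) {
        Entry e = cache.get(requestKey);
        if (e == null || forceRefresh || now - e.createdAt > TTL_MS) {
            cache.remove(requestKey);
            return null;
        }
        return e.data;
    }

    public void put(String requestKey, String data, long now) {
        cache.put(requestKey, new Entry(data, now));
    }
}
```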

Security

  • To protect users' privacy, passwords are hashed using the SHA-1 hash function, so our database never stores any user's plaintext password.
  • User access to pages is limited and controlled by role. When users try to access a page that they do not have access to, they are redirected to their home page.
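The hashing described above can be done with the JDK's built-in MessageDigest; this sketch illustrates the approach (hash on write, compare hashes on login), not our exact production code.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class PasswordHasher {

    // SHA-1 digest of the password, rendered as lowercase hex; only this
    // string is stored in the database, never the password itself.
    public static String sha1Hex(String password) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-1");
            byte[] digest = md.digest(password.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) hex.append(String.format("%02x", b));
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-1 ships with every JDK
        }
    }
}
```

On login, the submitted password is hashed the same way and compared against the stored digest.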

Usability & Others

  • The web application is designed to prevent the user from doing the wrong thing. We take care to validate controls, limit the choices the user can make, and provide feedback so that the user knows what is going on.
    • Drop-down lists are populated dynamically; they only contain options that are available in the database
    • Cues (loading symbol, text) on the dashboard indicate whether the graphs are still loading, or whether there is no available data.

Deployment

The following items will eventually be deployed and hosted on the client's server:

  • WAMP Server – Used to establish the connection with the database
  • MySQL Workbench – Main database
    • SQL Scripts – Imported into the database with a handful of administrator users
  • Apache Tomcat Server – Application web server to handle various applications
    • EagleEye Application – Main application that will be deployed and ready for use
  • Java PHP Bridge – Enables the use of PHP script for exporting functions
  • Apache Solr Search Engine – Leverages multi-core queries to access indexed data in the database
    • Solr Configuration Files – Configured settings to allow Solr to run smoothly with Tomcat
  • PHP for Windows

User Testing

Client UT Phase 1

Client User Testing 1 was conducted in Sprint 7, 5 Mar 2013 to 15 Mar 2013.

No. Objectives of Test
1 Check if the application functionality supports end users' daily operations
2 Check if the application functionality meets the user requirement specifications
3 Assess the overall look & feel of the application
4 Find bugs in the application
5 Find out if the interactions with the visualizations are intuitive and that users can perform the desired tasks
6 Obtain feedback about the overall look and feel of the application
7 Discover any unprecedented errors or bugs


No. Scope of Test
1 All visualizations
2 Visualizations’ detail page
3 Visualizations’ filter
4 Login/Logout
5 General and Advanced Search

Findings

Logiciel UT1 Findings1.png
Logiciel UT1 Findings3.png
Logiciel UT1 Findings2.png

Changes & Developments
Description Before After
Implement Help Tooltip for Clarity
LogicielUT101B.jpg
LogicielUT101A.png
Increase Font Size for Readability
LogicielUT102B.jpg.png
LogicielUT102A.jpg.png
Show Teams as Labels
LogicielUT103B.png
LogicielUT103A.png
Show Currency Unit on Axis
LogicielUT104B.png
LogicielUT104A.png
Show Competitors as Labels and Enhanced Competitor Name Display
LogicielUT105B.png
LogicielUT105A.png
Show Products as Labels
LogicielUT106B.png
LogicielUT106A.png
Add Period Filters and Timestamp to Dashboard
LogicielUT107B.png
LogicielUT107A.png

Client UT Phase 2

Client User Testing 2 was conducted in Sprint 8, 2 Apr 2013 to 10 Apr 2013.

No. Objectives of Test
1 Validate the changes made after UAT 1
2 Confirm the application continues to support end users' daily operations
3 Ensure a bug-free application with all the test cases passed
4 Client acknowledges and confirms that the application satisfies the application specification


No. Scope of Test
1 UT changes made after UAT 1
2 Win/Loss workflow
3 Manage Win/Loss Form
4 Manage Win/Loss Form's questions
5 Revamped search
6 Manage account
7 Custom Report


Findings

Logiciel UT2 Findings1.png

Logiciel UT2 Findings2.png Logiciel UT2 Findings3.png

Changes & Developments

Description Before After
Colour gradient for increased distinctiveness
LogicielUT201B.png
LogicielUT201A.png
Competitor is no longer compulsory
LogicielUT202B.png
LogicielUT202A.png

Learning Outcomes

Team

  • How to deploy, configure and use a web-service based search engine (Apache Solr), amongst other technological tools.
  • Communication is key: to stakeholder management, internal goal alignment and conflict management.
  • Prudence in planning: it is one thing to set a schedule with healthy pressure, but another to manage stakeholders' expectations of the schedule.

Individual

Amelia-Photo.jpg Amelia Low

Persevere

  • Persevere in finding solutions to problems.

Persistence

  • Revisit alternatives and keep trying.
Elvin-Photo.jpg Elvin Lim

Meticulous

  • Small Details snowball into Huge Successes.

Styling

  • Looks good, feels good but not necessarily essential.
Frank-Photo.jpg Frank Lim

Experimental

  • Always experiment. You will never know what you will find!
Ryan-Photo.jpg Ryan Ng

Anticipate

  • The point of failure, probable occurrence and follow-up!

Evaluate

  • Proposals and suggestions. Is it technically feasible? Do we have the expertise to implement it? What is the likelihood of adoption?
QH-Photo.jpg Qian Hui

Specific

  • Failing to plan is planning to fail.

Quality

  • Rubbish in, rubbish out. Generate quality feedback which provides insights for the team to evaluate the application.
Vernon-Photo.jpg Vernon Lek

Coach

  • Each person has their own strengths, so always use your strengths to coach your peers. It not only clears your teammates' obstacles, but also carries the team forward.

Appreciate

  • In a team, you help each other out, and your peers guide you along in times of difficulty. You don't work alone on the project, so you learn to appreciate one another in the group.