IS480 Team wiki: 2011T1 Indies' Chronicle
Revision as of 06:11, 25 November 2011
ValueMonitor Live
Check out our latest deployed application here: ValueMonitor
Midterm Progress Update
View our progress page here
Final Wiki
View the project's final details here
About Indies' Chronicle
Project Overview
Project Description
ValueMonitor is a one stop assessment tool for consultants and clients. It is used to perform assessment, generate a report and make recommendations. ValueMonitor can also be used offline as a Silverlight out of browser application.
About Lodestone
Lodestone Management Consultants Pte. Ltd. is a Swiss-based global management consultancy firm specializing in IT consulting, particularly SAP and other enterprise services. One of the key steps in delivering its premium services and solutions is conducting performance assessments of the client’s as-is processes and systems, in order to identify pain points and to understand the client’s current strategic position in comparison with the industry benchmark.
Deliverables
- A fully-functional Silverlight application deployed on a hosting site (either school or client-specified site).
- The out-of-browser functionality of the Silverlight application will also be installed and tested on at least 3 consultants' laptops.
- An integration of 3 versions - light web version, full web version, and desktop application
Motivation
To achieve this, we will develop on the Silverlight framework, which allows web applications to be downloaded onto the consultant's laptop (the out-of-browser feature). We will provide synchronization functions that allow consultants to perform their assessments even when they are offline and sync their data to the centralized database once they have internet access. We will also give the administrator access to edit the questions and push the latest questions to the consultants' desktop applications, hence providing version-control capabilities. The light web version (the preview) for Lodestone’s clients will be part of the web version, with access restricted through permission controls.
LodeSuite will also provide report-generation capabilities for the consultants. Consultants will spend less time producing the report or PowerPoint deck for Lodestone's clients, and can spend more time providing quality service and recommendations. This tool also enables Lodestone to attract more potential clients over the internet, by providing them with a preview of the services that Lodestone offers.
Stakeholders
End Users | Company
---|---
Philip Kwa (Director) | Lodestone Management Consultants Pte. Ltd.
Ana Zhou Rui (Senior Consultant) | Lodestone Management Consultants Pte. Ltd.
Consultants and Clients | Lodestone Management Consultants Pte. Ltd.
Scope
Aside from consultants, clients, and administrators, ValueMonitor will also be used by Lodestone’s potential clients to perform a light version of the assessment and get a preview report. Interested clients will be contacted by Lodestone upon registration. The details of the functionalities are discussed further in later sections. ValueMonitor can also be accessed by administrators to manage the assessment questions and use other analytics functionalities.
There are 3 main users identified:
- Administrators: Any Lodestone employee(s) trusted to manage the assessment.
- Clients: Any Lodestone clients that are not yet registered in the system.
- Consultants: Any Lodestone consultants.
Consultants
Consultants are the main users of the application. After logging in to the system, they can perform the complete assessment and generate or download the assessment report in Microsoft PowerPoint, PDF, and other graphical or image formats. This report is then used by the consultants to develop possible solutions and recommendations for the clients.
In addition, all consultants can download the whole assessment application to conduct the assessment process offline. This lets consultants carry out the assessment regardless of internet availability at the client’s site, making our solution holistic while seamlessly integrating the web and desktop versions.
Clients
Any potential client can access the ValueMonitor system and perform a free self-service performance assessment. However, these potential clients will only get a ‘preview’ report with limited explanations. Interested clients can then register (“sign up”), which notifies the consultants so they can follow up with further premium services and solutions.
Administrators
Upon logging in to the system as an administrator, a user of ValueMonitor is able to manage the assessment questions provided by the system. The administrator is the only person authorized to manage (add/edit/delete) the assessment questions in the assessment tool.
In addition, administrators are also able to view the system statistics (presented in various graphical formats) in order to better understand the past and ongoing assessments performed by the clients and consultants.
X Factor
Our application is built on Silverlight and C#. Silverlight allows us to build a Rich Internet Application with out-of-browser capabilities, which can be downloaded and used on a laptop. The application can also function the same way without internet access, and the results can be synchronized when internet access becomes available.
Consistent User Experience
Same user experience across online and offline versions.
Ease of Synchronization and version control
Changes to questions are updated automatically when online
Auto-saved offline changes are synchronized automatically when the user goes online
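The sync-on-reconnect behaviour described above can be sketched as a small queue-and-flush pattern. This is an illustrative sketch only, written in Python rather than the project's actual Silverlight/C# stack, and every class and method name in it is hypothetical:

```python
# Hypothetical sketch of offline auto-save plus sync-on-reconnect.
# Not the actual ValueMonitor implementation.
class OfflineQueue:
    def __init__(self):
        self.pending = []   # auto-saved changes made while offline
        self.synced = []    # changes already pushed to the central database

    def save(self, change, online):
        """Auto-save a change; push immediately if online, queue otherwise."""
        if online:
            self._flush([change])
        else:
            self.pending.append(change)

    def on_reconnect(self):
        """Called when the consultant regains internet access."""
        if self.pending:
            self._flush(self.pending)
            self.pending = []

    def _flush(self, changes):
        # Stand-in for the real upload to the centralized database.
        self.synced.extend(changes)
```

A consultant working offline would accumulate changes in `pending`; the queue drains automatically the next time the application detects a connection.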
Better data visualization
Dashboard function for administrator and consultants. Charts to view statistics of the companies.
Better usability
Provide users with simple and easy-to-use UI.
For example, a single page for creating and editing domains.
Project Documentation
View more details about our project documentation here
Project Management
Project Plan
Project Status
Completed Functionality
The table lists all the functionality that has been completed. No functionality remains outstanding.
Gantt Chart
Timeline Schedule
Quality Assurance
Schedule Metric
Usage
The schedule metric is used to track our project schedule. It ensures that planned tasks meet their deadlines and that we allocated enough buffer time for them. It is also used to track whether we have met all of the client's requirements.
Throughout the project, our team was on schedule 75% of the time and behind schedule only 25% of the time.
Calculation Formula
Metrics Ratio (MR) = Actual total days per task / Estimated total days per task
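As an illustration, the ratio above can be computed per task and compared against 1 to flag delays. This is a hedged sketch in Python; the task names and day counts below are made up for the example, not taken from our actual schedule:

```python
# Sketch of the schedule metric: MR = actual days / estimated days per task.
# MR > 1 means the task overran its estimate (behind schedule).
# Task data below is hypothetical, for illustration only.
tasks = [
    {"name": "Build assessment module", "estimated": 10, "actual": 9},
    {"name": "Deploy to hosting site",  "estimated": 4,  "actual": 6},
]

def metrics_ratio(task):
    """Return the Metrics Ratio (MR) for a single task."""
    return task["actual"] / task["estimated"]

for t in tasks:
    mr = metrics_ratio(t)
    status = "behind schedule" if mr > 1 else "on schedule"
    print(f"{t['name']}: MR = {mr:.2f} ({status})")
```

A per-iteration MR can be obtained the same way by summing actual and estimated days over the iteration's tasks before dividing.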
Guidelines and Action Taken
- Iterations 4 and 6 fell behind schedule: in iteration 4 we had deployment problems, while in iteration 6 we found a lot of bugs and clearing them took longer than expected, hence causing the delays.
Bug Metric
The bug metric is used to track bugs found in our system. It ensures that project quality stays at its highest and drives how bug-fixing work is allocated among team members. Aside from that, it is also used to avoid code conflicts.
Bug Tracking
Internally, we use Google Docs to track our bugs. When a team member enters a bug, that person has to record its severity (1, 3, or 10) in the 'Bug Level' column. The 'Entered by' column shows who reported the bug, and the 'Resolver' column shows who fixed it.
Calculation Formula
Total points = (1 * Low-N) + (3 * High-N) + (10 * Critical-N)
Low-N = number of UI-related bugs that do not affect functionality (weight 1)
High-N = number of bugs that affect output (weight 3)
Critical-N = number of bugs that severely harm the system (weight 10)
Guidelines and Action Taken
Zone | Condition | Action Taken
---|---|---
Green Zone | Points <= 50 | Bugs recorded in the bug sheet are resolved during the next "bug squashing" period, one of the buffer periods set aside by the team
Bug Alert | Points > 50 | The team member who discovers the bug notifies the PM to arrange an immediate "bug squashing" period
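The scoring and zone thresholds above can be sketched as follows. This is a minimal Python illustration, assuming only the severity weights (1, 3, 10) and the 50-point threshold stated above; the bug counts in the usage line are hypothetical:

```python
# Sketch of the bug metric: weighted sum of bug counts by severity,
# mapped to the team's Green Zone / Bug Alert thresholds.
WEIGHTS = {"low": 1, "high": 3, "critical": 10}
THRESHOLD = 50

def total_points(low_n, high_n, critical_n):
    """Total points = (1 * Low-N) + (3 * High-N) + (10 * Critical-N)."""
    return (WEIGHTS["low"] * low_n
            + WEIGHTS["high"] * high_n
            + WEIGHTS["critical"] * critical_n)

def zone(points):
    """Green Zone at or below the threshold, Bug Alert above it."""
    return "Green Zone" if points <= THRESHOLD else "Bug Alert"

# Hypothetical counts: 12 low, 5 high, 2 critical bugs.
points = total_points(low_n=12, high_n=5, critical_n=2)
print(points, zone(points))  # 47 Green Zone
```

A single additional critical bug in this example (57 points) would tip the iteration into the Bug Alert zone and trigger an immediate bug-squashing period.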
Bug Metric
- From iteration 5 onwards, as we developed more functions, we found more and more bugs through continuous testing of our product. Hence, the number of bugs recorded increased with each iteration.
UAT at a Glance
Testing is done internally and externally. Internal testing focuses more on bug testing, while external testing focuses on heuristic evaluation and user experience.
For internal testing, all team members have to test the system thoroughly and then document any findings in a Google Docs sheet. Team members do not have to test at the same time, but rather at different times. The goal is continuous testing by different people to minimize the number of bugs in the deployed system. We created a test plan as a guide for internal testing, but each individual is welcome to try out different, unexpected scenarios.
Unlike internal testing, which mainly focuses on bugs, we also conduct external testing (usually known as a User Acceptance Test) that focuses more on user experience. Our aim is an intuitive user interface where users do not even need to spend time thinking about what to click. Therefore, we run two parts of testing: unguided and guided.
As the name implies, during unguided testing we do not provide any guidelines or detailed steps for the testers. We give them some tasks to perform and, based on the objective of each task, testers need to work out how to complete the tasks themselves. From our observations and the recorded time for each task, we can infer how intuitive the system is. After completing the unguided testing, testers proceed to guided testing, where they are given tasks with detailed steps. The objective of guided testing is for testers to focus on the flow and to ensure the system matches the target users' requirements.
UAT Timeline
UAT Result
UAT 1 Result :
UAT 2 Result:
Meeting Minutes
View team and supervisor meeting minutes here
Technologies
Technologies Used
LOMS
View learning outcomes here
Reflection
Team Reflection
TBU
Individual Reflection
Satya | Antoni | Ronny | Athina | Jessie | Yessita