IS480 Team wiki: 2012T2 box.us MidTerms

Mid Terms Slides:

  • Mid Terms Slides: Coming Soon
  • Mid Terms Deployed Site: Coming Soon

For the proposal, please see the Requirements section under the Project Deliverables. This will help us understand your scope. Note the wiki policy here.


Describe the project progress briefly here. Has the project continued as planned? If not, is the team confident of completing it? This is a crossroads for the team to make a decision: proceed with confidence, or file an incomplete.

Project Highlights:

What unexpected events occurred?

  • "Task" functionality took longer than expected to scope and develop
  • Small details that led to major changes
  • Drop of mobile functionalities
  • No clear defined business process initially
  • Difference in understanding of coding conventions


Provide more details about the status, schedule, and scope of the project. Describe the complexity of the project.

Project Status:

Function          | Prototyping | Implemented | User Tested    | Client Approved | Confidence Level | Comment
Registration      | Completed   | Completed   | User Testing 1 | Pending         | 1                | Finishing up the user interface for registration of Empact and NPOs. Going on to do heuristics evaluation.
Task              | Completed   | Completed   | User Testing 1 | Pending         | 1                | Left to complete the user interface for pending items. Includes Task Review items.
Questions         | Completed   | Completed   | User Testing 1 | Pending         | 1                | Final touch-ups to be done, then to be subjected to heuristics testing.
Search            | Completed   | In Progress | User Testing 2 | Pending         | 1                | Task search and NPO search are not yet completed.
Volunteer Records | Completed   | In Progress | User Testing 2 | Pending         | 1                | Task search and NPO search are not yet completed.
Statistics        | Not started | Not started | User Testing 3 | Pending         | 1                | Includes the dashboard for volunteers, Empact and NPOs.
Notification      | Not started | Not started | User Testing 3 | Pending         | 1                | Includes notification


Project Schedule (Plan Vs Actual):

Compare the project plan during acceptance with the actual work done at this point. Briefly describe a summary here. Did everything go as planned, has everything changed and the team is working on a new project with new sponsors, or is the supervisor missing? A good source for this section is the project weekly report.

Provide a comparison of the planned and actual schedule. Has the project scope expanded or been reduced? You can use the table below or your own Gantt charts.

Iteration | Module          | Planned     | Actual       | Comments
1         | Customer CRUD   | 1 Sept 2010 | 25 Aug 2010  | Fiona took the Sales CRUD as well.
1         | Trend Analytic  | 1 Sept 2010 | 15 Sept 2010 | Ben is too busy and pushed iteration 1 back.
2         | User tutorial   | 1 Oct 2010  | Removed      | Proposed by Ben.
2         | Psycho analysis | 1 Oct 2010  |              | New module proposed by sponsor.


Project Metrics:

Schedule Ratio over Time:
  • We are generally meeting our schedule as time goes by. Most of the delays are caused by differences between the estimated and the actual time required for each task. (A sketch of one way to compute this ratio follows below.)
  • The spike in Iteration 7 is due to the changes in the requirements of the Task components.

Bug Metrics Graph: coming soon
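
The schedule ratio is not defined on this page, so the formulation below is an assumption: a common choice is planned effort over actual effort per iteration, where a value below 1 means tasks ran over their estimates. A minimal sketch in Python, with hypothetical figures:

    def schedule_ratio(planned_days, actual_days):
        """Planned effort over actual effort for one iteration.

        A ratio of 1.0 means the iteration ran exactly to plan; below 1.0
        means tasks took longer than estimated and the iteration ran over.
        """
        return sum(planned_days) / sum(actual_days)

    # Hypothetical figures for illustration only, not the team's real data.
    iteration_7_planned = [3, 5, 2]  # planned days per task
    iteration_7_actual = [4, 8, 3]   # actual days per task
    print(f"Iteration 7 schedule ratio: "
          f"{schedule_ratio(iteration_7_planned, iteration_7_actual):.2f}")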


Project Risks:

Priority 1 (Team)

Risk: Small issues raised during meetings are lost track of as we go along in the iterations.

Consequences:
  1. Inconsistent information between stakeholders as a result of assumptions about which issues are closed and which are not
  2. Inability to track issue statuses; small changes get lost
  3. Inability to properly forecast the project schedule, since the size of the issues raised is unknown

Likelihood: Likely | Impact Level: Major | Risk Assessment Level: High

Mitigation Strategy:
  1. Implement an issue tracking system on top of the Change Management process to track issues related to user interfaces and minor changes to functionalities
  2. Ensure that stakeholders are aware of the Issue Tracking System and know the actions to be taken
  3. Project Manager to review issues on a regular basis and ensure that issues have been closed

Priority 2 (Business)

Risk: Differences in understanding of the terminology used to describe the business process (e.g. tasks, questions, assignments).

Consequence: Confusion over terms may lead to the wrong picture being conveyed between client and team.

Likelihood: Possible | Impact Level: Minor | Risk Assessment Level: Medium

Mitigation Strategy:
  1. Implement a project dictionary in which commonly used terms are recorded
  2. Proactively log commonly used terms in the project dictionary
  3. Implement an Excel sheet that captures the descriptions of all the fields used by Empact

Priority 3 (Resources)

Risk: The server is unable to cope with the volume of users.

Consequences:
  1. Delays to project progress during user testing
  2. Disruption of Empact's daily operations

Likelihood: Possible | Impact Level: Moderate | Risk Assessment Level: Medium

Mitigation Strategy:
  1. Source additional server providers as backups
  2. Track server consumption at the end of each user testing to determine whether changes to the server are needed
  3. Cut over to a new server if overloading becomes frequent, choosing a cloud solution that allows additional memory to be added

Lessons Learnt from Managing Risks

Risk management helps us predict risks early and make the necessary changes to the team even before the actual event happens.

  1. The Issue Log has given the team a systematic way of tracking the risks that can arise during interactions within the team and during meetings (see the sketch below).
  2. The prototyping process has helped the team visualize the end product early, and it has also served as a communication tool between the different stakeholders in the project.
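
As an illustration of what an entry in such an issue log can capture, here is a minimal sketch; the fields, statuses, and example issue are assumptions for illustration, not the team's actual log format.

    from dataclasses import dataclass

    # Minimal issue-log entry; the fields and statuses below are
    # illustrative assumptions, not the team's actual log format.
    @dataclass
    class Issue:
        issue_id: int
        raised_in: str          # e.g. "Sponsor meeting, week 5"
        description: str
        owner: str
        status: str = "Open"    # Open -> In Progress -> Closed

        def close(self) -> None:
            self.status = "Closed"

    log: list[Issue] = []
    log.append(Issue(1, "Sponsor meeting, week 5",
                     "Rename 'Assignments' tab to 'Tasks'", "Project Manager"))

    # The Project Manager reviews the log regularly and closes settled issues.
    open_issues = [i for i in log if i.status != "Closed"]
    print(f"{len(open_issues)} open issue(s)")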



Technical Complexity:

Describe and list the technical complexity of your project, highest complexity first. For example: deploying on iPhone using Objective-C, customizing Drupal with your own database, quick search for the shortest flight path, database structure, etc.

Provide more details about the quality of your work. For example: you designed a flexible, configurable system using XML .config files, used the Strategy Design Pattern to allow plugging in different strategies, implemented a regular expression parser to support a flexible formula editor, etc.
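
To make the Strategy Design Pattern example concrete, here is a minimal sketch in Python; the task-matching scenario and every name in it are hypothetical illustrations, not the team's actual code.

    from abc import ABC, abstractmethod

    # Pluggable strategies for scoring volunteer-to-task matches.
    class MatchingStrategy(ABC):
        @abstractmethod
        def score(self, volunteer_skills: set, task_skills: set) -> float: ...

    class OverlapStrategy(MatchingStrategy):
        """Score by the fraction of required skills the volunteer has."""
        def score(self, volunteer_skills, task_skills):
            if not task_skills:
                return 0.0
            return len(volunteer_skills & task_skills) / len(task_skills)

    class TaskMatcher:
        def __init__(self, strategy: MatchingStrategy):
            # A different strategy can be swapped in without changing callers.
            self.strategy = strategy

        def match(self, volunteer_skills, task_skills):
            return self.strategy.score(volunteer_skills, task_skills)

    matcher = TaskMatcher(OverlapStrategy())
    print(matcher.match({"teaching", "first aid"}, {"teaching", "logistics"}))  # 0.5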

There should be some evidence of work in progress.

Stage              | Specification            | Modules
Project Management | Minutes                  | Sponsor weeks -10, -5, 3, 7; Supervisor weeks -2, 3, 5, 7
                   | Metrics                  | Bug metrics
Requirements       | Story cards              | CRUD Customer, Trend Analytic
Analysis           | Use case                 | overall
                   | System Sequence Diagram  | client, server
                   | Business Process Diagram | Here
                   | Screen Shots             | CRUD Customer, Trend Analysis
Design             | ER Diagram               | 1, 2, 3
                   | Class Diagram            | 1, 2, 3
Testing            | User test plan           | instructions

Not all parts of the deliverables are necessary, but the evidence should be convincing of the progress. Try to include design deliverables that show the quality of your project.

In an iterative approach, a ready-to-use system should be available (deployed) for the client, with instructions to access the system (user name) described here. If necessary, provide a deployment diagram link.

Our testing approach focuses on two areas: Usability Testing and Code Quality Testing.
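
As a minimal illustration of what Code Quality Testing can involve, here is a hypothetical unit test; the function under test and all names are assumptions for illustration, not the team's actual test suite.

    import unittest

    # Hypothetical function under test; stands in for any small unit of box.us logic.
    def completion_rate(completed: int, total: int) -> float:
        if total == 0:
            return 0.0
        return completed / total

    class CompletionRateTest(unittest.TestCase):
        def test_normal_case(self):
            self.assertAlmostEqual(completion_rate(3, 4), 0.75)

        def test_zero_total_does_not_divide_by_zero(self):
            self.assertEqual(completion_rate(0, 0), 0.0)

    if __name__ == "__main__":
        unittest.main()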

User Testing 1


We conducted our first user testing to try out the new features and, at the same time, to see how users would respond to the system. Until then, we had no information about how the different users would react to the system.
Interesting Findings

  1. The Questions module was not intuitive enough
  2. Task matching results could be displayed in a more interactive way

Actions Taken

  1. Touched up the Questions module
  2. Finalized a user interface template
  3. Revamped the entire structure of how the Task module is managed


Check out our User Testing 1 wiki

Heuristics Evaluation
  1. Heuristics Evaluation: 15 February 2013
Upcoming User Testings
  1. User Testing 2: 26 February 2013
  2. User Testing 3: 19 March 2013

In this section, describe what the team has learned. Be brief. Sometimes the client writes a report giving feedback on the system; this sponsor report can be included or linked from here.

Team Reflection:

Any training and lessons learnt? What are the takeaways so far? It would be very convincing if the knowledge were shared at the wiki knowledge base and linked here.

Benjamin Gan Reflection:

You may include individual reflections if that makes sense at this point. The team is uncooperative and did not follow my instructions.