2011T1 Victorious Secrets Mid-Term Wiki Page

From IS480

Project Progress Summary

Project Highlights

Individual workload on the project has increased substantially since the start of the school term, as we have to juggle both the project and our other course subjects. We were short-handed when conducting the User Acceptance Test: half of the four-person team was dedicated to designing test plans and test scripts and gathering a list of testers, while the other half continued with application development, managing project progress and documenting various project artifacts. To ensure the team delivers the project on time and with good quality, we have made changes to our project management style, constantly revised our scope and task statuses, and interacted consistently with our client and supervisor, so that we stay both efficient and task-oriented!

  • Change in project management style

Part of the Scrum methodology adopted by our team is the daily stand-up meeting, where members physically meet, brief each other on the progress of assigned tasks, and highlight any issue that may impede the project. However, daily stand-ups proved time-consuming and inconvenient, so we reduced their frequency to once every three days, excluding our weekly formal meetings. We also developed a concept we call the remote daily Scrum update, where each member compiles his work progress and issue statuses and emails them to the rest of the team. This lets us surface new issues easily, and it works cohesively with our project management portal. We used to hold two formal meetings per week; we have now learned to combine the second meeting (usually on Friday) with our client meeting every two weeks. During the client meeting, we first update the client on our project status, show him the latest prototype with its new features, and gather feedback. After that, we work with the client to gather the requirements for the modules to be developed in the next sprint. Once the client leaves, the team reviews our progress over the past sprint internally. Following that, we hold the sprint planning meeting for the next sprint and create tasks in our project management portal to track project progress.

  • Change in project requirements & technologies with better substitutes

At the client's request, we implemented Facebook-style comments and displays for our Group and Forum modules. We reworked the portal's theme to a mix of our proposed designs (white and dark themes). We also implemented dynamic Google site translation first, instead of static content internationalization; it now supports more than 20 languages. The advantage is that the entire site content can be translated on the fly, whereas static content translation is far more laborious and does not translate user-posted content. However, we are still deciding whether to retain the static translation feature, as it provides better accuracy for key information but would require a cohesive effort from the community to remain useful. We have also replaced our Mibbit IRC chat, which had already been customized and integrated, with a new "Envolve" chat feature. The latter is much more powerful and supports more features, such as creating private chat topics, inviting online users to a chat group, and public chats. The new chat also has a better view design: the chat box can be minimized at the bottom of the browser window or opened in a new window, flexibility the original IRC chat did not offer. It also supports features such as dynamic translation, where a message is auto-translated into the language detected from the browser.

  • Change in member roles & work progress

To cope with the growing workload, our team members have broadened their roles and learned to take on different responsibilities in our project activities. While each of us does part of the programming, each also has a niche area to focus on. For example, when we develop a new functional module, the chief programmer helps to digest the requirements and sub-divide them into separate functional requirements. Similarly for design and testing, the key member in each area details the steps necessary to complete a given task. One member may gather users for UAT at the tester's request, while another does Photoshop work to speed up the designer's tasks. The chief programmer may also suggest how the site's theme can be better integrated for more vibrancy. Sometimes we rotate roles and pair up; a good example is preparing UAT documents and project artifacts, which often involves two members at a time.

In terms of programming, we have developed a workflow to streamline our coding practices. For each piece of functionality to be implemented, we first draft the acceptance test cases, which detail the client's functional and non-functional requirements. The team then distributes the tasks pertaining to these requirements. Since we are building our portal on Drupal, each of us can be designated to complete certain functionality independently without delaying the team's progress. A member may be tasked to test a third-party module which is not yet fully supported but may potentially meet a project requirement. He first creates a task ID with a description in our project management portal, then proceeds with testing and developing the module against our latest deployed prototype. If a bug is found, he examines the contributed module's code in Eclipse and attempts to debug the issue himself first. If the bug is not solved within the sprint's time period, the member assigns it to whoever is available to continue working on the issue, or starts looking for alternative solutions. The process is similar for other project activities such as debugging, design, and developing optional features. Once a member thinks an issue has been solved, he updates its status on the project management portal as resolved and commits any code change to Subversion with the task ID attached. One advanced feature we have: if a member writes a Subversion commit log message containing keywords such as "Ref No: 145 by XX: Issue is RESOLVED", our project management portal will automatically update the task status, since the portal and Subversion are hosted on the same server.
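The commit-log automation described above can be sketched as a small log-message parser. This is an illustrative reconstruction, not the portal's actual code; the message format follows the "Ref No: 145 by XX: Issue is RESOLVED" example quoted in the text, and the function name is an assumption.

```python
import re

# Hypothetical sketch of the commit-log scan: look for a task reference and a
# status keyword in a Subversion log message. The pattern mirrors the example
# format quoted above; the portal's real pattern may differ.
LOG_PATTERN = re.compile(
    r"Ref No:\s*(?P<task_id>\d+)\s*by\s*(?P<author>\w+).*?"
    r"Issue is\s*(?P<status>RESOLVED|OPEN|CLOSED)",
    re.IGNORECASE | re.DOTALL,
)

def parse_commit_message(message):
    """Return (task_id, author, status) if the message references a task, else None."""
    match = LOG_PATTERN.search(message)
    if not match:
        return None
    return (int(match.group("task_id")),
            match.group("author"),
            match.group("status").upper())
```

A post-commit hook running on the shared server could feed each new log message through such a parser and update the matching task in the portal's database.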

Project Management

Project Status

Functional Module | Task Status | Confidence Level (0-1) | Comment
Account Registration & Login | Completed 100% & Tested in UAT | 1.0 | Account registration is only enabled for the member role
Portal Forum | Completed 100% & Tested in UAT | 0.9 | Minor bug fixes
Portal Event Organizer | Completed 80% & Tested in UAT, UI in progress | 0.7 | Plan to research more websites for external event aggregation
School & Course Review | Completed 100% | 0.8 | Implemented activity stream to feature latest posts
School Search | Completed 80% | 0.8 | Plan to improve the usability of the school filter by changing its layout
Social Networking Features | Completed 100% & Tested in UAT | 0.9 | Includes personalized profiles, groups, friends, document uploads, live chat & in-site messaging
Multi-language Support | Completed 100% | 1.0 | Implemented the Google Translate API, which now supports more than 20 languages
Content Personalization | Completed 20% | 0.6 | This module has the highest technical complexity in our portal & will be our key focus in the coming sprint
Publishing Workflow | Completed 50% | 0.8 | Implemented for event creation and internally tested; remaining work is to implement it for advertisement submission
Mapping | Completed 100% | 0.9 | Plan to research more into the Google Maps API
Social Media Integration | Completed 100% | 1.0 | Supports Facebook, Twitter, Digg and 40+ other social media tools for sharing site content
E-mail Notifications | Completed 100% | 0.8 | Supports account activation, in-site message notifications & site invitations
Advertisements | Completed 100% | 1.0 | Improving the design and layout of page columns dedicated to advertisements

Project Schedule (Plan VS Actual)

Sprint | Iteration | Planned Completion Date | Actual Completion Date | Comment
Sprint 5 | Develop social networking module | 26th August 2011 | 26th August 2011 | Completed as planned
 | Implement Mibbit IRC Chat | 26th August 2011 | 2nd September 2011 (replaced Mibbit IRC Chat with Envolve Chat) | Postponed to Sprint 6
 | Develop publishing workflow | 2nd September 2011 | 30th September | Postponed to Sprint 7
Sprint 6 | Continued development for forum and implemented Envolve Chat system | 2nd September 2011 | - | Code revision for UAT
 | Develop internationalization for multi-language support | 2nd September 2011 | 2nd September 2011 (implemented multi-language support using the Google Translate API and Drupal Core's static content translation) | Swapped technology and completed as planned
Sprint 7 | Develop Mapping & Advertisement | 30th September 2011 | 30th September 2011 | Currently in progress
 | Prepare for Mid-term Presentation & mid-term wiki section | 26th September 2011 | - | The content of the presentation and wiki will depend on the overall progress of the project

Project Metrics

As a Scrum team, we implemented a few metrics to help us manage the project in terms of productivity, predictability and value. Our team believes a good metric is one that effectively gauges and reflects the team's progress while providing useful insights for improving our performance and regulating the project scope, so that a good final product can be delivered on schedule.

Sprint Completion Bar (Productivity Metric) - measures the percentage of work completed by the team at the end of every sprint, compared to the sprint's planned tasks. The sprint completion bar is a stacked bar with three segments, each in its own color: Closed (darker green, tasks created and completed), Open (lighter green, tasks created and currently in progress), and New (white, tasks created but not started). A longer Closed portion is a good indicator; a longer New or Open portion is a poor one. By the end of the sprint, New and Open tasks are shifted to the following sprint, and any task deemed unfeasible is treated as waste - defined as work subsequently removed because it was not needed. Thus, a sprint may not be 100% completed. However, too much waste indicates poor task planning, and too many open tasks shifted to later sprints indicates that we are falling behind schedule or that the scope is too big. This serves as a warning sign for us to reduce the project scope with the client and reschedule our sprint tasks. [Sprint Completion Bar]
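The bar's three segments can be computed directly from a sprint's task statuses. A minimal sketch, with illustrative function and status names rather than the portal's API:

```python
# Derive the Sprint Completion Bar segments from a list of task statuses.
# "Closed" = created and completed, "Open" = in progress, "New" = not started,
# matching the three segments described above.
def completion_bar(tasks):
    """Return each segment's share of the sprint's tasks."""
    counts = {"Closed": 0, "Open": 0, "New": 0}
    for status in tasks:
        counts[status] += 1
    total = len(tasks)
    return {segment: count / total for segment, count in counts.items()}

# Example sprint: 3 of 5 tasks closed, so the Closed portion covers 60% of the bar.
shares = completion_bar(["Closed", "Closed", "Closed", "Open", "New"])
```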

BurnDown Chart

Development Velocity / Sprint Burn-down Chart (Predictability Metric) - Velocity is measured by the number of story points completed per sprint against the story points planned (we assign story points to tasks based on the expected man-hours and complexity). It helps us gauge the consistency of our productivity throughout the sprint. Most of the time, we use velocity together with the sprint burn-down chart, which tracks both velocity and scope change throughout a sprint against a set goal. The number of remaining story points serves as a basis for predicting whether we can complete the assigned tasks within the given time frame. If story points are not completed as fast as expected, or the actual line falls far from the planned line, the team responds immediately by reallocating manpower and rescheduling tasks into the following sprint to allow buffer time for amendments.

The chart is plotted from the number of unresolved story points and the number of remaining hours. The ideal line represents the burn rate that is optimal for the team, and the team aims to achieve this burn rate throughout all sprint development cycles. The horizontal axis is the number of days in a planned sprint (excluding weekends), and the number of remaining hours is calculated from the remaining days in the sprint. The story points to be completed in each sprint are calculated by subtracting the story points at the top right corner of the chart from those at the top left corner.
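The ideal and actual lines can be sketched as simple bookkeeping over the sprint's working days. The numbers here are illustrative, and the function names are ours rather than any charting tool's:

```python
# Ideal line: remaining story points fall linearly from the sprint total to
# zero over the working days. Actual line: subtract each day's resolved points.
def ideal_line(total_points, working_days):
    """Remaining story points at the optimal burn rate, day by day."""
    step = total_points / working_days
    return [total_points - step * day for day in range(working_days + 1)]

def actual_line(total_points, resolved_per_day):
    """Remaining story points after each day's resolved work."""
    remaining = [total_points]
    for resolved in resolved_per_day:
        remaining.append(remaining[-1] - resolved)
    return remaining

# A 10-working-day sprint of 30 story points:
ideal = ideal_line(30, 10)
actual = actual_line(30, [2, 4, 3, 5, 0, 6, 4, 3, 2, 1])
```

If the actual line sits well above the ideal line mid-sprint, tasks are burning down too slowly, which is exactly the trigger for reallocating manpower described above.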

Team Satisfaction Metric

Team Satisfaction Metric (Value Metric) - This metric gauges the quality of the team's work at the end of every sprint. Each member is asked to rate the team's work over the past sprint on a scale of 1 to 7, from extremely dissatisfied to extremely satisfied. The average score across all members is evaluated, and actions are taken by the project manager where needed. The goal of the metric is to encourage each member to work hard and to alert the team to the quantity and quality of work completed each sprint. Since scores are normalized across sprints and averaged over all members, the metric serves as a useful indicator of the team's overall performance, and it raises morale when everyone is generally satisfied with the teamwork.
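The computation behind the metric is a straightforward average over members' ratings. A minimal sketch with illustrative ratings:

```python
# Average a team's 1-7 sprint satisfaction ratings, as described above.
def satisfaction_score(ratings):
    """Validate the 1-7 scale and return the team average."""
    if any(r < 1 or r > 7 for r in ratings):
        raise ValueError("ratings must be on the 1-7 scale")
    return sum(ratings) / len(ratings)

# One rating per member for the past sprint:
score = satisfaction_score([6, 5, 6, 7])  # average of 6.0
```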

Project Risks

Technical Complexity

3rd party module integration / customization / bugfix

  • > 50 third party modules

e.g. cck, views, context, date, calendar, heartbeat, flags, phpbbforum, og, fbss, user_relationships, envolve, notifications, i18n, privatemsg, geocode, openlayers, etc.

Custom module & content type

  • Custom school & course content types written to integrate with views, geocode, organic groups, and openlayers.

Custom module for content personalization

  • Usage of third party sentiment analysis web service
  • Building of user interests database
  • Modifying third party module results on the fly
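The three bullet points above outline a pipeline: score post text with a sentiment service, accumulate the scores into a per-user interests table, and re-rank content on the fly. A toy sketch under stated assumptions; the sentiment function is a naive stand-in for the third-party web service, and all names are hypothetical, not the custom module's actual API:

```python
# Naive stand-in for the third-party sentiment web service.
def analyze_sentiment(text):
    """Crude word-list sentiment score: positive words minus negative words."""
    positive = {"great", "love", "helpful", "good"}
    negative = {"bad", "boring", "hate"}
    words = text.lower().split()
    return sum(w in positive for w in words) - sum(w in negative for w in words)

# Accumulate a user's sentiment toward a topic in the interests table.
def update_interests(interests, user, topic, text):
    interests.setdefault(user, {}).setdefault(topic, 0)
    interests[user][topic] += analyze_sentiment(text)

# Re-order content items by the user's accumulated topic interest
# ("modifying third-party module results on the fly").
def rank_content(interests, user, items):
    scores = interests.get(user, {})
    return sorted(items, key=lambda item: scores.get(item["topic"], 0), reverse=True)

interests = {}
update_interests(interests, "alice", "exchange", "the exchange fair was great love it")
ranked = rank_content(interests, "alice", [{"topic": "housing"}, {"topic": "exchange"}])
```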

Quality of Product

Intermediate Deliverables

Category | Description | Deliverable
Project Management | Team Meeting Minutes | Team Meeting Minutes
 | Client Meeting Minutes | Client Meeting Minutes
 | Supervisor Meeting Minutes | Supervisor Meeting Minutes
 | Sprint Progression Bar | Sprint Completion Bar
 | Sprint Burn Down Chart / Velocity Points | Sprint Burndown Charts
 | Team Satisfaction Metrics | Team Satisfaction Metric
User Requirements and Analysis | Use Case Diagram | Use Case Diagrams
 | User Stories | User Stories
 | UI Mock-ups | UI Mock-ups
Design | Application Architecture Design | Application Architecture Diagram (Updated)
 | Portal Theme Design | Theme A, Theme B
Testing | Portal User Manual | User Manual
 | UAT 1 Test Plan | UAT Test Plan
 | UAT 1 Test Script | UAT Test Script
 | UAT Bug Tracker | Bug Tracker




UAT I Test Plan

Currently, our team has several functionalities completed, including Register, Login, Forum, Social Networking and Events. In order to simulate test responses as close to those of the final end users as possible, we approached our project sponsor to help gather testers for UAT 1. Due to certain sponsor constraints, we decided to hold an offline UAT this time round. We plan to conduct onsite testing for the next UAT, UAT II.

Functionality Tested:

  1. Login
    1. Successful Login
    2. Warning Messages for Login Failure
  2. Social Networking
    1. Create Group
    2. Request & Accept Group Membership
    3. Post Content on Group Wall
    4. View and Edit User Profiles
    5. Add Friend & Accept Friend Request
    6. View Friends list
    7. Send and Receive Private Message
  3. Events
    1. View List of Event Details
    2. Sign-up for Event
    3. Create Event
    4. Event Invitation
  4. Forum
    1. Post Thread
    2. Reply Thread

Testing Period: 0000hr Wednesday 7th September to 2359hr Saturday 10th September
Time required: 1 to 2 hours
Total Number of Testers: 10


  1. Test Preparation
    1. Prepare UAT test script (test scenarios covering functionalities to be tested)
    2. Prepare user feedback form
    3. Prepare user manual for testers’ reference
    4. Prepare test instructions
    5. Prepare test accounts
  2. Internal testing
    1. Internal testing by all team members
    2. Amend and edit test cases
    3. Fix minor bugs/UI
  3. Test Execution
    1. Send test instructions, test script and user manual to testers
    2. Test execution by testers
    3. Send completed test scripts and user feedback form back to project team
  4. Results Collection
    1. Collate all test results and identify Failed test cases
  5. Review and Code Fix Planning
    1. Review failed test cases and identify bugs
    2. Review priority and assign bug fix schedule
    3. Code fixing

UAT Schedule - Planned vs Actual

UAT | Activity | Planned Completion Date | Actual Completion Date | Comment
UAT 1 | Prepare UAT Test Plan & Test Script | 7th September 2011 | 8th September 2011 | Revised test scripts after internal testing
 | Conduct off-site UAT with volunteers | 10th September 2011 | 15th September 2011 | Some volunteers did not return completed test scripts on time
 | Code & UI revision | 16th September 2011 | 16th September 2011 | Managed to fix bugs found as planned; postponed UI revision to UAT II

UAT Bug Tracker

Program bugs will be recorded and reviewed periodically. Urgent bug review meetings will be scheduled depending on the number of bugs and the urgency with which they must be resolved.

UAT I Actual Test Details

Testers' Profiles
As mentioned, we aimed to keep the test environment as close as possible to actual usage in production. As seen in the table of testers' profiles below, 60% of testers are international students, the portal's actual intended user group. Click to see the Testers' Profiles

Test Accounts Allocation

Test accounts were created for each tester individually to streamline the test execution process. Every tester was given three sets of accounts with different roles, plus preassigned group and event names, to complete all planned test cases. Click to see the Testers' Accounts

UAT I Test Cases

Click to see the Test Cases

UAT I Test Results & Reflections

Out of 21 test cases, only 1 had testers indicating failure, giving a 95% UAT pass rate. This high pass rate encourages our team about the good work done to date. However, we are aware that a high pass rate does not necessarily indicate a successful UAT. We encountered several problems in the course of conducting UAT I; these serve as lessons learnt and room for improvement in our next UAT.

Firstly, we did not plan our UAT preparation well enough. We spent more time than expected writing the test cases while also ensuring that our site was "fit" for UAT. Minor bugs were discovered in the course of constructing the test cases, further delaying our UAT preparation.

Secondly, an offline UAT presented some limitations. We found it difficult to coordinate with our testers. Because it was an offline UAT, we had to make thoroughly sure the test script was clear and coherent enough for all testers to understand and follow. We also struggled to get testers to complete the UAT within the allocated 3-day test duration; we finally received 80% of the test scripts back after approximately one week, and in the end we were unable to get results from all the testers who had volunteered.

Thirdly, we were unable to witness for ourselves how the testers went through their test scripts, so our observations were limited to the scripts they sent back.

Last but not least, given these unforeseen circumstances, our team felt we should have requested more testers instead of limiting ourselves to only 10. The sample size of our test results remains small, and we would have benefited greatly from more user feedback and suggestions.

On a positive note, our team did well in ensuring that all test instructions, including the test script, were written clearly and coherently. Most of our testers gave feedback that the instructions were clear and easily understood.

Improvements to UAT

With the lessons learnt as mentioned above, we will be including the following improvements and changes in our next UAT, UAT II.

  1. Conduct an onsite UAT 2 to allow us to observe testers
  2. Allocate more time for UAT Preparation
  3. Gather a bigger pool of UAT testers

Individual Reflections

TAN Hui Wen Hayden

The journey from the initiation of the project to the current stage has definitely been enriching. Drupal and the Scrum methodology were both foreign to me, and as time passed I started to gain a better understanding of how the Scrum process is carried out and how Drupal works. Drupal is more of a technical skill to pick up, while Scrum is more tightly related to project management. Project management was relatively easy in the first couple of sprints, as we could easily meet the two-week sprint duration. As the semester progressed, school work became one of the main causes of problems such as unfinished tasks and last-minute work. However, the more we failed, the more we learnt. On reflection, I realised that one will hardly ever learn without encountering issues.

Problems and issues are inevitable in the course of a project. One key to resolving them is proper communication. It is good practice to keep both the project sponsor and the supervisor abreast of developments in the project: change requests from the sponsor are better managed, expectations between the project team and the sponsor are better aligned, and constant communication with the project supervisor keeps the team properly guided and on track. Communication among team members is most important of all, because successful teamwork is only achieved when every member's expectations are aligned and each member understands the abilities, strengths and weaknesses of the others.

MA Cheng

It has been a great journey, as the FYP has given me the opportunity to work with my wonderful teammates. My key takeaway is that the most important things that keep a team going are perseverance and the dedication to deliver excellent work. As a team, we have continuously worked on alternative solutions to deliver the product and frequently revised the user interface and the structure of the site. We feel a sense of pride as we gradually build up the system and improve its various features. I have also learnt to think both as a developer and as a user, which gives me insight into the power and user-friendliness of our application. We are motivated to deliver the best product we can; this FYP is a chance for us to merge the business world with technology.

Heng Jia Qi

I have learnt a lot from the FYP. As a designer, I first have to discuss with the client to understand what appeals to him and what user interface he has in mind. Communicating design ideas to the client has been quite a task, because his ideas are ever-changing and easily influenced by other designs. I realised that designs and user interfaces are very subjective, and over the course of the FYP I learnt how to communicate my design ideas to the client so that his ideas and mine merge into one finalised user-interface design. Next, our client was not able to provide us with photos and images suitable for the portal, so I had to search extensively through royalty-free image galleries for suitable images and personally create some designs. I learnt that it is important to discuss with the client, right from the start, which project materials he can provide, and to review those materials early, so as to reduce unnecessary last-minute surprises.

Overall, managing the client's expectations is the most valuable lesson learnt. My team and I have learnt to write requirements out explicitly in order to limit changes in the client's expectations and requirements. We have also learnt to work towards a middle ground between the client's expectations and our own beliefs and skills. It is important to balance the client's expectations against our team's schedule and abilities.

Jinson XU Guang Zu

After working three months on the project, I have learnt that it is very important to manage the scope and plan the tasks to be completed in each sprint. To do that, we need to understand the team's strengths and weaknesses and allocate each task to the best-fitted member. To further improve our productivity, we need to constantly review our schedule and prioritize tasks based on the client's requirements. It is also important not to over-promise, and to make sure everything can be delivered on time. As project manager, I have learnt to lead the team over challenges. One of my tasks is to motivate the team when we face challenges and to initiate discussions on how we can solve our problems. It has been a great learning opportunity for me to practice project management skills while embarking on a real project.
