2011T1 Victorious Secrets Final Wiki Page
- 1 Project Progress Summary
- 2 Project Management
- 2.1 Project Status
- 2.2 Project Schedule (Plan VS Actual)
- 2.3 Project Metrics
- 2.4 Project Risks
- 2.5 Technical Complexity
- 3 Quality of Product
- 4 Individual Reflections
Project Progress Summary
The individual workload on the project has increased substantially since the start of the school term, as we have had to juggle both the project and our other course subjects. We were short-handed when conducting the User Acceptance Test: half of the four-person team was dedicated to designing test plans and test scripts and gathering a list of testers, while the other half continued with application development, managing project progress and documenting the various project artifacts. To ensure the team delivers the project with good quality and on time, we changed our project management style, constantly revised our scope and task statuses, and interacted consistently with our client and supervisor so that we remain both efficient and task-oriented.
- Change in project management style
Part of the Scrum methodology adopted by our team is the daily stand-up meeting, where members physically meet, brief each other on the progress of assigned tasks and highlight any issue that may impede project progress. However, the stand-up meeting proved time-consuming and inconvenient, so we reduced its frequency to once every three days, excluding our weekly formal meetings. We also developed a remote daily Scrum update, where each member compiles his work progress and issue statuses and emails them to the rest of the team. This lets us discover new issues easily and works cohesively with our project management portal. For meetings, we used to hold two formal meetings per week; we have since learned to combine the second meeting (usually on Friday) with our client meeting every two weeks. During the client meeting, we first update the client on project status, show him the latest prototype with the new features and gather feedback. After that, we work with the client to gather test requirements for the modules to be developed in the next sprint. Once the client leaves, the team reviews our progress over the past sprint internally. Following that, we hold the sprint planning meeting for the next sprint and create tasks in our project management portal to track project progress.
- Change in project requirements & technologies with better substitutes
At the client's request, we implemented Facebook-style comments and displays for our Group and Forum modules. We reworked the portal's theme to a mix of our proposed designs (white and dark themes). We implemented dynamic Google site translation first, instead of static content internationalization; it now supports more than 20 languages. The advantage is that the entire site content can be translated on the fly, whereas static content translation is far more laborious and does not translate user-posted content. However, we are still deciding whether to retain the static translation feature, as it provides better accuracy for key information but would require a cohesive effort from the community to improve its usefulness. We have also replaced our Mibbit IRC chat, which had already been customized and integrated, with the new "Envolve" chat. The latter is much more powerful and supports more features, such as creating private chat topics, inviting online users to a chat group, and public chats. The new chat also has a better view design: the chat box can be minimized at the bottom of the browser window or opened in a new window, flexibility the original IRC chat lacked. It also supports features such as dynamic translation, where a message is auto-translated into the language detected from the browser.
- Change in member roles & work progress
To cope with the growing workload, our team members have broadened their roles across project activities. While each of us does part of the programming, each also has a niche area to focus on. For example, when we develop a new functional module, the chief programmer helps digest the requirements and sub-divide them into separate functional requirements. Similarly for design and testing, the key member for that area details the steps needed to complete a given task. One member may gather users for UAT at the tester's request, while another does Photoshop work to speed up the designer. The chief programmer may also suggest how the site's theme can be better integrated for more vibrancy. Sometimes we rotate roles and pair up; a good example is preparing UAT documents and project artifacts, which often involves two members at any given time.
In terms of programming, we have developed a workflow to streamline our coding practices. For every piece of functionality to be implemented, we first draft acceptance test cases that detail the client's functional and non-functional requirements. The team then distributes the tasks pertaining to these requirements. Since we are building our portal on Drupal, each of us can be designated to complete certain functionality independently without delaying the team's progress. A member may be tasked to test a third-party module which is not yet fully supported but may potentially meet a project requirement. He first creates a task ID with a description in our project management portal, then proceeds to test and develop the module against our latest deployed prototype. If a bug is found, he examines the contributed module's code in Eclipse and tries to debug the issue himself first. If the bug is not solved within the sprint's time period, the member assigns it to whoever is available to continue working on the issue, or starts looking for alternative solutions. The process is similar for other project activities such as debugging, design and developing optional features. Once a member believes an issue has been solved, he updates its status in the project management portal to resolved and commits any code change to Subversion with the task ID attached. One advanced feature we have: if the member's Subversion commit log message contains keywords such as "Ref No: 145 by XX: Issue is RESOLVED", our project management portal automatically updates the task status, since the portal and Subversion are located on the same server.
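The commit-log keyword convention above can be sketched as a small parser of the kind a post-commit hook might call. This is a hypothetical illustration, not the portal's actual code: the pattern, status names and task-table shape are assumptions modelled on the example message quoted above.

```python
import re

# Assumed keyword convention, modelled on the example in the text:
#   "Ref No: 145 by XX: Issue is RESOLVED"
COMMIT_PATTERN = re.compile(
    r"Ref No:\s*(?P<task_id>\d+)\s*by\s*(?P<member>\w+):"
    r"\s*Issue is\s*(?P<status>RESOLVED|OPEN|WONTFIX)",
    re.IGNORECASE,
)

def parse_commit_message(message):
    """Return (task_id, member, status) if the log message matches, else None."""
    match = COMMIT_PATTERN.search(message)
    if match is None:
        return None
    return int(match.group("task_id")), match.group("member"), match.group("status").upper()

def apply_commit(tasks, message):
    """Update an in-memory task table (a hypothetical stand-in for the portal) from a commit log."""
    parsed = parse_commit_message(message)
    if parsed is not None:
        task_id, member, status = parsed
        tasks[task_id] = {"status": status, "updated_by": member}
    return tasks
```

A real hook would call the portal's API instead of mutating a dictionary, but the parsing step is the same idea.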
- Reduce project scope by dropping the publishing workflow module
After experimenting with the publishing workflow we had planned for the client, we came to realize that the module is rather counter-productive for the tiered approval process and would require extra effort from the client to achieve the same goal. This is partly because our design allows anonymous users to suggest events via web forms. Screening content submitted through the web form achieves the same purpose as the publishing workflow, using the Rules module in Drupal to notify the client about newly suggested events. Removing the publishing workflow module streamlines the content approval process, reduces the manpower required to approve event suggestions, and makes the program more user-friendly. We also aim to avoid installing additional high-complexity modules that would require extra learning on the client's side and add to the server load.
Project Status

|Functional Module|Task Status|Confidence Level (0-1)|Comment|
|---|---|---|---|
|Account Registration & Login|Completed 100% & tested in UAT|1.0|Account registration is only enabled for the member role|
|Portal Forum|Completed 100% & tested in UAT|0.9|Minor bug fixes|
|Portal Event Organizer|Completed 100% & tested in UAT|0.7|Plan to research more websites for external event aggregation|
|School & Course Review|Completed 100% and tested|0.8|Implemented activity stream to feature latest posts|
|School Search|Completed 100% and tested in UAT|0.8|Plan to improve the usability of the school filter by changing its layout|
|Social Networking Features|Completed 100% & tested in UAT|0.9|Includes personalized profiles, groups, friends, document uploads, live chat & in-site messaging|
|Multi-language Support|Completed 100% & tested in UAT|1.0|Implemented the Google Translate API, which now supports more than 20 languages|
|Content Personalization|Completed 40%|0.6|This module has the highest technical complexity in our portal & will be our key focus in the coming sprint|
|Publishing Workflow|Abandoned|0.3|Testing showed the workflow has low value to the customer and complicates the event publishing process instead of adding value|
|Mapping|Completed 100% & tested in UAT|1.0|Plan to research further into the Google Maps API|
|Social Media Integration|Completed 100% and tested in UAT|1.0|Supports Facebook, Twitter, Digg and 40+ other social media tools for sharing site content|
|E-mail Notifications|Completed 100% & tested in UAT|1.0|Supports account activation, in-site message notifications & site invitations|
|Advertisements|Completed 100%|1.0|Improving the design and layout of page columns dedicated to advertisements|
Project Schedule (Plan VS Actual)
|Sprint|Task|Planned Completion Date|Actual Completion Date|Comment|
|---|---|---|---|---|
|Sprint 5|Develop social networking module|26th August 2011|26th August 2011|Completed as planned|
||Implement Mibbit IRC Chat|26th August 2011|2nd September 2011|Postponed to Sprint 6; Mibbit IRC Chat later replaced with Envolve Chat|
||Develop publishing workflow|2nd September 2011|30th September 2011|Postponed to Sprint 7|
|Sprint 6|Continued development for forum and implemented Envolve Chat system|2nd September 2011||Code revision for UAT|
||Develop internationalization for multi-language support|2nd September 2011|2nd September 2011|Swapped technology and completed as planned: implemented multi-language support using the Google Translate API and the static content translation provided by Drupal Core|
|Sprint 7|Develop Mapping & Advertisement|30th September 2011|30th September 2011|Completed as planned|
||Prepare for mid-term presentation & mid-term wiki section|26th September 2011||The content of the presentation and wiki reflects the overall progress of the project|
|Sprint 8|Implement Site Newsletter and Notifications|7th October 2011|7th October 2011|Tested on the staging server; to be tested again on the production server after the client acquires it for us|
||Develop content personalisation|28th October 2011||Development of Content Personalisation paused for UAT II|
||Prepare FYP poster|7th November 2011|11th November 2011|The poster submission deadline was postponed to 11th November 2011 due to a public holiday|
||Prepare all final deliverables and documentation|18th November 2011|22nd November 2011|Wrap up the project and prepare for the final presentation|
||Migrate the system from the staging server to the client-owned production server|22nd October 2011|22nd October 2011|Deployed the portal on the production server and configured the VPS settings to optimize performance|
Project Metrics
As a Scrum team, we implemented a few metrics to help us manage the project in terms of productivity, quality, predictability and value. Our team believes a good metric is one that can effectively gauge and reflect the team's progress, provide useful insights for improving our performance, and regulate the project scope so that a good final product can be delivered on schedule.
Sprint Completion Bar (Productivity Metric) - measures the percentage of work completed by the team at the end of every sprint against the sprint's planned tasks. The sprint completion bar is a stacked bar with three elements, each in its own color: Open (the lighter green portion, tasks created and currently in progress), Closed (the darker green portion, tasks created and completed) and New (the white portion, tasks created but not started). A longer Closed portion is a good indicator; a longer New or Open portion is a poor one. At the end of a sprint, New and Open tasks are shifted to the following sprint, and any task deemed unfeasible is treated as waste - work that was subsequently removed because it was not needed. Thus a sprint may not be 100% completed. However, too much waste is a poor indicator of our task planning, and too many open tasks shifted to later sprints indicates that we are falling behind schedule or that the scope is too big. This serves as a warning sign for us to reduce the project scope with the client and reschedule our sprints. [Sprint Completion Bar]
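The arithmetic behind the completion bar is simple enough to sketch: given each task's status at sprint end, compute the share of each segment. The status names mirror the three segments described above; the data is illustrative.

```python
from collections import Counter

def completion_bar(task_statuses):
    """Return {status: fraction of tasks} for the bar's three segments.

    task_statuses is a list of per-task statuses at sprint end,
    each one of "Closed", "Open" or "New".
    """
    counts = Counter(task_statuses)
    total = len(task_statuses)
    return {status: counts.get(status, 0) / total for status in ("Closed", "Open", "New")}
```

For example, a sprint that closed 6 of 10 tasks, with 3 still open and 1 never started, yields a bar that is 60% Closed, warning that 40% of the work rolls over.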
Development Velocity / Sprint Burn-down Chart (Predictability Metric) - Velocity is measured by the number of story points completed per sprint against the planned story points (we assign story points to tasks based on the expected man-hours and complexity of the task). It helps us gauge the consistency of our productivity throughout the sprint. Most of the time, we use velocity together with the sprint burn-down chart, which tracks both velocity and scope change throughout a sprint as a measure of meeting a set goal. The number of remaining story points serves as the basis for predicting whether we can complete the assigned tasks within the given time frame. If story points are not completed as fast as expected, or the actual burn line falls far from the planned line, the team responds immediately by reallocating manpower and rescheduling tasks into the following sprint to allow buffer time for amendments.
The chart is plotted from the number of unresolved story points and the number of remaining hours. The ideal line is the burn rate optimal for the team, and the team aims to achieve this burn rate throughout all sprint development cycles. The horizontal axis is the number of days in a planned sprint (excluding weekends), and the number of remaining hours is calculated from the remaining days in each sprint. The story points completed in a sprint are calculated by subtracting the story points at the top right corner of the chart from those at the top left corner.
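As a minimal sketch of the burn-down arithmetic described above: the ideal line drops the committed story points evenly over the sprint's working days, and comparing the actual remaining points against it flags when the team is falling behind. The numbers and the tolerance parameter are illustrative assumptions, not figures from our sprints.

```python
def ideal_remaining(total_points, sprint_days):
    """Ideal remaining story points at the end of each working day (day 0 .. sprint_days)."""
    burn_per_day = total_points / sprint_days
    return [round(total_points - burn_per_day * day, 2) for day in range(sprint_days + 1)]

def behind_schedule(actual_remaining, total_points, sprint_days, tolerance=0.0):
    """True if the latest actual figure sits above the ideal line by more than tolerance.

    actual_remaining[i] is the remaining story points at the end of day i.
    """
    day = len(actual_remaining) - 1
    ideal = ideal_remaining(total_points, sprint_days)[day]
    return actual_remaining[-1] > ideal + tolerance
```

In practice this is the check that prompts us to reallocate manpower or push tasks to the next sprint.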
Team Satisfaction Metric (Value Metric) - This metric gauges the quality of the team's work at the end of every sprint. Each member rates the team's work over the past sprint on a scale of 1 to 7, from extremely dissatisfied to extremely satisfied. The average score across all members is evaluated and actions are taken by the project manager. The goal is to encourage each member to work hard and to alert the team to the quantity and quality of work completed during each sprint. Since the score is normalized across sprints and averaged over all members, it serves as a useful indicator for monitoring the team's overall performance and raises morale when everyone is generally satisfied with the teamwork.
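The satisfaction metric reduces to an average with an alert threshold. The threshold value of 4.0 (the scale's midpoint) is an assumption for illustration; the text above leaves the trigger point to the project manager's judgment.

```python
def satisfaction_score(ratings):
    """Average of member ratings on the 1-7 scale; rejects out-of-range input."""
    if any(not 1 <= r <= 7 for r in ratings):
        raise ValueError("ratings must be on the 1-7 scale")
    return sum(ratings) / len(ratings)

def needs_attention(ratings, threshold=4.0):
    """Flag the sprint for the project manager when the average falls below threshold."""
    return satisfaction_score(ratings) < threshold
```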
Technical Complexity
3rd party module integration / customization / bugfix
- > 50 third party modules
eg. cck, views, context, date, calendar, heartbeat, flags, phpbbforum, og, fbss, user_relationships, envolve, notifications, i18n, privatemsg, geocode, openlayers, etc.
Custom module & content type
- Custom school & course content types written to integrate with views, geocode, organic groups, and openlayers.
Custom module for content personalization
- Usage of third party sentiment analysis web service
- Building of user interests database
- Modifying third party module results on the fly
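The interests database above can be sketched as follows. This is a hypothetical illustration of the idea, not our module's actual code: the function names and scoring scheme are assumptions, and the third-party sentiment analysis web service is stubbed out as a plain numeric argument.

```python
from collections import defaultdict

def record_interaction(interests, user, topic, sentiment_score):
    """Accumulate a signed sentiment score into the user's per-topic weight.

    sentiment_score stands in for the value returned by the external
    sentiment analysis web service for the content the user interacted with.
    """
    interests[user][topic] += sentiment_score
    return interests

def top_interests(interests, user, n=3):
    """Topics with the highest accumulated weight for a user, used to reorder module output."""
    ranked = sorted(interests[user].items(), key=lambda item: item[1], reverse=True)
    return [topic for topic, _ in ranked[:n]]
```

The ranked topics would then drive the on-the-fly modification of third-party module results, e.g. reordering a view so a user's strongest interests surface first.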
Performance optimization at production server
- Improve loading performance of Drupal modules
- Cloud structure via VPS
- Database performance optimization
Quality of Product
|Category|Deliverable|Link|
|---|---|---|
|Project Management|Team Meeting Minutes|Team Meeting Minutes|
||Client Meeting Minutes|Client Meeting Minutes|
||Supervisor Meeting Minutes|Supervisor Meeting Minutes|
||Sprint Progression Bar|Sprint Completion Bar|
||Sprint Burn Down Chart / Velocity Points|Sprint Burndown Charts|
||Team Satisfaction Metrics|Team Satisfaction Metric|
|User Requirements and Analysis|Use Case Diagram|Use Case Diagrams|
||User Stories|User Stories|
||UI Mock-ups|UI Mock-ups|
|Design|Application Architecture Diagram|Application Architecture Diagram|
||Solution Architecture Diagram|Solution Architecture Diagram|
||Database Diagram|Database Diagram|
||Deployment Diagram|Deployment Diagram|
||Portal Theme Design|Theme A, Theme B|
|Testing|Portal User Manual|User Manual|
||UAT 1 Test Plan|UAT Test Plan|
||UAT 1 Test Script|UAT Test Script|
||UAT Bug Tracker|Bug Tracker|
||UAT 2|Refer to section below|
- Undergoing performance optimization for production server before go-live on 24th November 2011
UAT II Test Plan
With all the functionalities completed, our UAT II aims to test the entire web portal for bugs, usability issues and more. After conducting only an offsite test for UAT I, our team noticed several limitations of offsite testing: there was a lack of control, and we were unable to observe the users while they carried out the test. For UAT II, we decided to take a two-pronged approach and conduct both an onsite and an offsite test. The onsite test will be split into two sessions of 4 testers each to allow better control. Our project sponsor will work hand-in-hand with our team to source testers.
- Successful Login
- Warning Messages for Login Failure
- Social Networking
- Create Group
- Request & Accept Group Membership
- Post Content on Group Wall
- View and Edit User Profiles
- Add Friend & Accept Friend Request
- View Friends list
- Send and Receive Private Message
- View List of Event Details
- Sign-up for Event
- Create Event
- Event Invitation
- Google map location
- Post Thread
- Reply Thread
- View school details
- Search courses
- View course detail
- Private chat
- Group chat
- General chat
- Multi-language support
- Site navigation in foreign language
Offsite Testing Period: 0000hr Friday 28th October to 2359hr Sunday 30th October
Onsite Test Session 1: 28th October Friday 1600hrs to 1800hrs
Onsite Test Session 2: 4th November Friday 1600hrs to 1800hrs
Time required: 2 hours
Total Number of Testers: 16 (8 onsite, 8 offsite)
- Test Preparation
- Prepare UAT test script (test scenarios covering functionalities to be tested)
- Prepare user feedback form
- Prepare user manual for testers’ reference
- Prepare test instructions
- Prepare test accounts
- Internal testing
- Internal testing by all team members
- Amend and edit test cases
- Fix minor bugs/UI
- Test Execution (Offsite)
- Send test instructions, test script and user manual to testers
- Test execution by testers
- Send completed test scripts and user feedback form back to project team
- Test Execution (Onsite)
- Prepare test instructions and test script
- Prepare hardware (laptop) for testers
- Test execution by testers
- Observation by team
- Results Collection
- Collate all test results and identify Failed test cases
- Review and Code Fix Planning
- Review failed test cases and identify bugs
- Review priority and assign bug fix schedule
- Code fixing
UAT Schedule - Planned vs Actual
|Phase|Activity|Planned Completion Date|Actual Completion Date|Comment|
|---|---|---|---|---|
|UAT II|Test Preparation|27 October 2011|27 October 2011|Revised test scripts after internal testing|
||Test Execution 1 (Onsite 1 and Offsite)|30 October 2011|30 October 2011|Volunteers for the offsite test were limited to close friends, to increase reliability|
||Test Review|31st October 2011|31st October 2011|No test case failures, only comments on the UI|
||Code Fix; UI Fix|3 November 2011|3 November 2011|No bugs found, only minor edits to the UI|
||Test Execution 2 (Onsite 2)|4th November 2011|4th November 2011|100% pass rate|
UAT Bug Tracker
Program bugs are recorded and reviewed periodically. Urgent bug review meetings are scheduled depending on the number of bugs and the urgency of the bugs to be resolved.
UAT II Actual Test Details
As mentioned, we aimed to keep the test environment as close as possible to actual usage in production. As seen in the table of testers' profiles below, almost 90% of the testers are students aged between 18 and 25. Click to see the Testers' Profiles
Test Accounts Allocation
Test accounts were created for each tester individually to streamline the test execution process. Every tester was given three sets of accounts with different roles, plus a preassigned group name and event name, to complete all planned test cases. Click to see a sample list of the Testers' Accounts
UAT II Test Cases
Click to see the Test Cases
UAT II Test Results & Reflections
UAT II can be considered an overall success. In UAT I, we conducted an offsite test and found that the pool of testers was rather small. At the same time, there were many delays in UAT I, as testers did not have to adhere strictly to the test instructions and we had no control over this. We therefore conducted both an onsite and an offsite test in UAT II. The offsite UAT had 8 testers and took place over a window period; the onsite UAT also had 8 testers, split into two sessions of 4 testers each.
Overall, UAT II was a success and a big improvement over UAT I. Firstly, we allocated sufficient time for UAT preparation: with experience from UAT I, we set aside enough time to prepare the test scripts and instructions, and catered buffer time for unexpected delays. Secondly, splitting the onsite testing into two small sessions gave our group better control during test execution and allowed us to observe the testers more closely. Thirdly, we achieved a 100% pass rate with zero failed test cases reported by our testers, which indicates that the team has done well in keeping the portal bug-free.
The onsite UAT gave our team a chance to observe users navigating our web portal, allowing us to better understand usability issues that might have passed unnoticed in an offsite test. We observed that certain steps took users longer to complete, and found that this was often because important links were not prominent enough. The onsite UAT II also gave us the chance to talk to users face-to-face to understand how they felt about the web portal and their suggestions for improvement.
UAT II was therefore an overall success, and our group is now refining portions of the UI based on user feedback. One area for improvement: our team overlooked hardware issues that might arise. We encountered problems with the network connection during our first onsite UAT session, resulting in a delay in completing the UAT. We rectified the problem for the next onsite session and eliminated the unnecessary delays.
Individual Reflections
TAN Hui Wen Hayden
The journey from the initiation of the project to the current stage has definitely been enriching. Drupal and the Scrum methodology were both foreign to me, but as time passed I started to gain a better understanding of how the Scrum process is carried out and how Drupal works. Drupal is the more technical knowledge to pick up, while Scrum is more closely related to project management. Project management was relatively easy in the first couple of sprints, as we could easily meet the two-week sprint duration. As the semester progressed, schoolwork became one of the main causes of problems such as unfinished tasks and last-minute work. However, the more we failed, the more we learnt. On reflection, I realised that one hardly ever learns without encountering issues.
Problems and issues are inevitable in the course of a project. One key to resolving them is proper communication. It is good practice to keep both the project sponsor and supervisor abreast of developments in the project: change requests from the sponsor are better managed, expectations between the project team and sponsor are better aligned, and constant communication with the project supervisor keeps the team properly guided and on track. Communication among team members is most important of all, because successful teamwork is only achieved when every member's expectations are aligned and each member understands the abilities, strengths and weaknesses of the others.
The FYP has been a great journey, as I have had the opportunity to work with my wonderful teammates. My key takeaway is that the most important things keeping a team going are perseverance and the dedication to produce excellent work. As a team, we have continuously worked on alternative solutions to deliver the product and frequently revised the user interface and the structure of the site. We feel a sense of pride as we gradually build up the system and improve its various features. I have also learnt to think both as a developer and as a user, which gives me insight into the power and user-friendliness of our application. We are motivated to deliver the best product we can; this FYP is a chance for us to merge technical know-how with our business sense to produce an integrated product.
Heng Jia Qi
I have learnt a lot from the FYP. As the designer, I first had to discuss with the client to understand what appeals to him and what user interface he has in mind. Communicating design ideas to the client was quite a task, because his ideas are ever-changing and easily influenced by other designs. I realised that designs and user interfaces are very subjective, and over the course of the FYP I learnt how to communicate my design ideas to the client so that his idea and mine merge into one finalised user interface design. Next, our client was not able to provide photos and images suitable for the portal, so I had to search extensively through royalty-free image galleries for suitable images and personally create some designs. I learnt that it is important to discuss with the client, right from the start, which project materials he can provide, and to review those materials early, so as to avoid unnecessary last-minute surprises.
Overall, managing the client's expectations is the most valuable lesson learnt. My team and I have learnt to write requirements out explicitly to limit changes in the client's expectations and requirements. We have also learnt to work towards a middle ground between the client's expectations and our own beliefs and skills. It is therefore important to balance the client's expectations against our team's schedule and abilities.
Jinson XU Guang Zu
After working three months on the project, I have learnt that it is very important to manage the scope and plan the tasks to be completed in each sprint. To do that, we need to understand the team's strengths and weaknesses and allocate each task to the best-suited member. To further improve our productivity, we need to constantly review our schedule and prioritize tasks based on the client's requirements. It is also important not to over-promise, and to make sure everything can be delivered on time. As the project manager, I have learnt to lead the team to overcome challenges. One of my tasks is to motivate the team when we face challenges and to initiate discussions on how we can solve the problems. It has been a great opportunity for me to practice project management skills while embarking on a real project.
Click on Return to go back to the team's home page