IS480 Team wiki: 2016T2 Team Sirius Final Wiki
- 1 Project Progress Summary
- 2 Project Management
- 3 Quality of Project
- 4 Reflection
- Current sprint: Sprint 13
- Sprint period: 31 March 2016 to 14 April 2016
- Major milestone: Final Presentation
The team delivered 100% of the agreed-on scope as of 8 April 2016, and has run experiments on 8 websites using Sirius AB.
Tester for Sirius AB
"Sirius is, in its current form, very well thought of in terms of the various functions for A/B Testing, Engagement, Click, Page Scroll and Page Dwell, as well as the reporting tools. I really like the click selection tool. We are looking at a go-to-market product that, with more iterations and improvements to the UXID, can be a great solution in the marketing tech field." -- Aaron LEE Kwang Siong, Manager, Web & Digital Media - Office of the Dean (SMU SIS)
There were several challenges split into different milestones.
Configuring variations and goals on our platform and then injecting them into the desired page through our snippet code
Visual goal selection
There needs to be an easy way for users to configure goals, and that is through the visual goal selection. The team had to spend a lot of time studying an open-source library, Selector Gadget, in order to customise it to suit the needs of the system. The effort required was underestimated, but the team still managed to deliver this function on time.
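The core idea behind such a tool is turning a clicked element into a CSS selector that the snippet can later use to locate the same element. A minimal sketch of that idea, in the spirit of Selector Gadget (hypothetical code, not the actual Sirius AB implementation):

```javascript
// Given the chain of nodes from the outermost ancestor down to the clicked
// element, build a CSS selector that identifies it: prefer an id when one
// exists (ids are unique, so earlier steps can be discarded), otherwise
// accumulate tag/class steps.
function buildSelector(path) {
  const steps = [];
  for (const node of path) {            // path: outermost ancestor first
    if (node.id) {
      steps.length = 0;                 // id anchors the selector on its own
      steps.push('#' + node.id);
    } else if (node.className) {
      steps.push(node.tag + '.' + node.className.split(/\s+/).join('.'));
    } else {
      steps.push(node.tag);
    }
  }
  return steps.join(' > ');
}

// e.g. buildSelector([{tag: 'div', id: 'content'},
//                     {tag: 'ul', className: 'nav'},
//                     {tag: 'li'}])
// → '#content > ul.nav > li'
```

The real library does considerably more work, such as generalising the selector so it matches a whole group of similar elements rather than one node.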
The system has to support admin account(s), which must be able to manage all experiments and user accounts. This was challenging as the authentication library used, Devise, does not handle user management. Hence, the team had to spend more time delving deeper into Devise in order to override existing functions and provide user management features. As a result, the team had to reprioritise the planned scope and drop modules accordingly (as explained in our change management).
Following the scrum methodology has helped the team learn to adapt quickly to changes in the project scope while still delivering a potentially shippable product at the end of each sprint. As every user story was crafted to suit the needs of the business, the team could be sure that no unnecessary features were built. Sprint reviews at the end of each sprint gave the team a tighter feedback loop, and any needed changes were highlighted there. Consequently, scope creep was minimised, since the product owner had to prioritise tasks at every sprint review.
Practicing Continuous Integration (CI) has enabled the team to quickly track down integration bugs since small change sets are tested each time. The team never had to face last minute chaos during sprint reviews and supervisor meetings as the team could do frequent code pushes. CI has also enabled automated testing, helping us focus on developing functional and quality code.
Midterm vs Final Scope
Team Sirius has completed all the modules that were promised to the sponsor. The sponsor was happy and satisfied with the scope delivered by the team. Due to the high complexity and the limited time left, the team decided to drop the Visual Variation Editor during Sprint 11. The team consulted both the supervisor and the product owner, and both agreed that the team should focus on improving the completed features rather than start on a module that might not be completable in time.
Midterm vs Final Project Schedule
There are no major changes to project schedule since the Midterm presentation. The team conducted its User Test 3 on 16-17 March. The buffer time in Sprint 12 was used to fix bugs and improve on the UI based on the feedback given by our sponsor and User Test 3 testers. Handover to our sponsor was completed on 8 April before the team's exams. As mentioned, Visual Variation Editor was dropped due to its high complexity and limited amount of time left. Lastly, the team has achieved its X-Factor of getting 8 websites to run experiments on our system.
Formula: Average of accepted story points over 3 sprints
Explanation: The average accepted story points come to around 9 points, for Sprint 11. Note that Sprints 10, 12 and 13 covered bug fixing and UI improvement activities, where each bug is logged as a story with 0 story points, and UI improvements do not carry story points. Hence, there are no story points completed/accepted for those sprints.
- Planned: Total planned story points over number of days in a sprint
- Actual: Actual story points completed each day in a sprint
Burndown charts from some sprints since Midterm are highlighted here.
See our Midterm Wiki page for more details on sprints prior to Midterm.
| Risk Type | Risk Event | Likelihood | Impact | Mitigation |
| --- | --- | --- | --- | --- |
| Technical Risk | Unable to develop WYSIWYG editor due to its high complexity | High | Medium | Active feasibility checks and notify sponsor if there is a need to rescope the function |
Explanation: This risk management was conducted after the Midterm Presentation as mentioned above. We worked with the sponsor to scope down and prioritise a smaller subset of the WYSIWYG editor features. While we agreed at the end of sprint 10 to scope down and prioritise the move/rearrange feature, after further investigation during sprint 11, we found that it was not feasible to complete with the remaining time. We eventually negotiated with the sponsor to drop this function.
See our Risks page for the full list of potential risks.
Unaccepted Stories of Each Sprint
Explanation: If there are any bugs found at any point in time during the sprint, the bug is logged as a story with 0 points. If the bug impedes progress or prevents the team from delivering a shippable product, it will be prioritised and fixed immediately. If it does not impede progress, it will be prioritised together with the product backlog at the start of the next sprint. See our Stories page for each sprint's user stories.
1. Experiments Preview
As covered previously in the Midterm, the Same-Origin Policy applied by the browser creates several problems for our experiment previews. While the issues with references to CSS files, JS files, and images have been resolved, we noted that the self-hosted webfonts were not working. In particular, if a website is using self-hosted webfonts from a server that does not explicitly set the Cross-Origin Resource Sharing (CORS) headers to permit use from another domain, the browser will refuse to load the webfont.
We fixed this issue by proxying all the webfonts through our server, so that the previewed HTML page and all of these assets are considered part of the same origin; the browser then stops checking for the CORS headers. This is done in a two-pronged approach. First, webfonts referenced by relative URLs resolve against the preview page's own origin, so our server can serve them through the proxy directly.
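The proxying itself amounts to a URL rewrite. A minimal sketch, assuming a hypothetical font-proxy endpoint on the preview server (the real endpoint name and parameters are not part of this write-up):

```javascript
// Rewrite a webfont URL to point at our own origin instead. The server
// fetches the font on the browser's behalf and serves it back, so the
// browser sees a same-origin request and never applies the CORS check.
// (Hypothetical endpoint; illustration only.)
function toProxyUrl(fontUrl, proxyBase) {
  // Pass the original URL as a query parameter so the server knows
  // what to fetch.
  return proxyBase + '?url=' + encodeURIComponent(fontUrl);
}

// e.g. toProxyUrl('https://example.com/fonts/brand.woff2',
//                 'https://preview.sirius.lol/font-proxy')
// → 'https://preview.sirius.lol/font-proxy?url=https%3A%2F%2Fexample.com%2Ffonts%2Fbrand.woff2'
```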
Second, webfonts referenced by absolute URLs are handled by directly manipulating the CSS @font-face rules, which live in the browser's rather obscure CSS Object Model. Unfortunately, not only is the CSSOM not well-documented, there seems to be a bug in Chrome that stops us from modifying the @font-face rule in-place, so we have to remove the rule and re-insert the one with the modified URL at the same position.
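The remove-and-re-insert workaround can be sketched as follows (simplified illustration, not the actual Sirius AB code; the `toProxy` callback is assumed to map a font URL onto our proxy). Only `rewriteUrls` is pure; `proxyFontFaces` touches browser-only CSSOM APIs:

```javascript
// Pure helper: rewrite every url(...) inside a CSS rule's text.
function rewriteUrls(cssText, toProxy) {
  return cssText.replace(/url\((['"]?)([^'")]+)\1\)/g,
    (m, quote, url) => 'url(' + quote + toProxy(url) + quote + ')');
}

// Browser-only part: walk every stylesheet, find @font-face rules, and
// swap each one for a copy whose URLs point at the proxy.
function proxyFontFaces(toProxy) {
  for (const sheet of document.styleSheets) {
    let rules;
    try { rules = sheet.cssRules; }     // cross-origin sheets throw here
    catch (e) { continue; }
    for (let i = 0; i < rules.length; i++) {
      const rule = rules[i];
      if (rule.type === CSSRule.FONT_FACE_RULE) {
        const rewritten = rewriteUrls(rule.cssText, toProxy);
        sheet.deleteRule(i);            // editing the rule in place fails
        sheet.insertRule(rewritten, i); // in Chrome, so delete and
      }                                 // re-insert at the same index
    }
  }
}
```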
This covers most of the edge cases for the experiment preview. However, one limitation remains: webfonts still cannot be loaded for sites using TypeKit or similar subscription services, where usage is metered on a per-pageview or per-domain basis. Those services apply techniques that check the requesting domain before sending the webfont to the requesting browser, such as TypeKit checking the Referer header, but naturally preview.sirius.lol is not in anyone's set of declared domains. As a result, those services return an error instead of the webfont file. We cannot fix this on our end, since the Referer header is set by the browser as a core part of the web security model and is not something we can control.
2. Flash of Unvaried Content
An early request from our sponsor was that they wanted to avoid the momentary 'jump' or 'flash' that can happen when the variation is being applied to the page. This is especially acute for variations that take longer to take effect, such as those that add or replace images. They wanted to have the option to trade off page load time to ensure that the visitor doesn't see that sudden change in the page.
When we tried moving our variation code to run synchronously at the end of the page <body>, instead of asynchronously after the page's Document Object Model (DOM) was loaded, we found that the 'flash' still persisted. It appears that browsers perform optimisations to improve the page load experience, and start painting the page even before the entire page's HTML has been parsed.
However, once we started hiding the page body, we found that the more 'dynamic' pages with photo carousels and dynamically laid out content were not rendering properly. It appears that they had issues detecting the page dimensions when the page body was hidden with display: none;, as that CSS directive instructs the browser to remove the element from the layout flow entirely.

Hence, we added a workaround that uses a different method of hiding the page: instead of setting the display: none; directive, we keep the page "visible" and in the layout flow, but move the entire <html> element off the top-left of the canvas, such that it does not render in the viewport at all. This works well: the page renders correctly and unseen until it finishes loading, after which we move it back to the correct position.
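In snippet terms, the off-canvas hiding can be sketched like this (hypothetical helper names, illustration only; the real snippet also has to decide when to reveal the page, e.g. once variations are applied, or after a safety timeout so a failed script never leaves the page hidden):

```javascript
// The styles that move the page off-canvas, kept as a plain object so the
// same set of properties can be applied and later reverted.
const OFFSCREEN = { position: 'absolute', left: '-99999px', top: '-99999px' };

// Unlike `display: none;`, the <html> element stays in the layout flow, so
// carousels and other layout-dependent scripts still measure correct
// dimensions; the page simply renders outside the viewport.
function hidePage() {
  Object.assign(document.documentElement.style, OFFSCREEN);
}

// Restore the page once the variations have been applied.
function showPage() {
  for (const prop of Object.keys(OFFSCREEN)) {
    document.documentElement.style[prop] = '';
  }
}
```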
| Category | Deliverable | Link |
| --- | --- | --- |
| Project Management | Meeting Minutes | Internal, Supervisor & Sponsor Meeting Minutes |
| | Project Schedule | Project Schedule |
| | Risk Management | Risk Management |
| Requirements | Project Scope | Project Scope |
| | User Stories | User Stories |
| Analysis | Persona and Scenario | Persona & Scenario |
| | Market Research | Market Research |
| | Architectural Design | Architectural Design |
| Design | Prototypes | Mid & High Fidelity Prototypes |
| Testing | User Test Plan & Results | User Test Plan & Results |
| Project Handover | Introduction Slides | Delivered via private folder on Google Drive |
| | User Manual | User Manual |
| | Developer Manual | Delivered via private folder on Google Drive |
| | Style Guide | Delivered via private folder on Google Drive |
| | Source Code | Delivered via private folder on Google Drive |
Usability: While Web A/B Testing may be a known term, there are still terms within this concept that potential users may not understand. Hence, to make Sirius AB more usable, we conducted multiple user tests and gathered valuable feedback. By understanding our users' behaviour, we made sure that help was there when needed: on our platform, help buttons explain certain terms and walk the user through certain features.
This site is running 'live' on an EC2 instance inside the sponsor's Amazon Web Services account.
User Test 1
User Test 2
User Test 3
Venue: SMU SIS IS480 Lab
Date: 16 Mar 2016, Thursday and 17 Mar 2016, Friday
Duration: ~60 minutes
Number of Participants: 29
User Test: Tester Instructions
User Test Results: Test Results
UI Fixes based on User Test 3
The journey so far has been tough but a great learning experience for Team Sirius. We have learnt about each other's strengths and weaknesses, and worked towards complementing each other to produce quality work. Although we want to produce our best work in the interest of our sponsors, we also need to ensure that no one burns out. We are really proud of where we have gotten so far!
Supervisor Testimonial
Mr Prakash:
The world has entered a digital era and businesses are experiencing a paradigm change in the ways they reach out to customers and make sales. With every business aspiring to go online, it is imperative that businesses know what will appeal to their customers, and the solution is A/B testing and data analytics. This project adds value to conventional A/B testing by bringing more flexibility to businesses in the ways an experiment is configured, and therefore I see good potential for the work done by Team Sirius to be adopted by industry. All the best!
Team Sirius produces good quality work, and provides regular updates on the progress of product development. The team is technically strong, and is willing to explore ways of improving the user experience design of the product.
This is one of the most motivated and competent FYP teams from SMU I have ever worked with. The problem statement was an extremely challenging one. The functions and quality of the developed product have far exceeded our expectations!
It was a very enriching experience and I learnt to really step out of my comfort zone to speak to people about web A/B testing. It also helped to have understanding teammates, supervisor, and sponsors.
I have learnt that software estimation is more difficult than I expected, both in terms of estimating the time and effort required for user stories, as well as how much we can get done in a sprint.
I have learnt to appreciate how scrum and its sprint reviews enable a better change management process for a real project, where real business requirements were provided. This has helped the team better manage its workload while delivering quality work and value to our sponsors.
I have learnt that user experience is important for users to understand and operate a system well. Thorough user testing must be conducted to create the best user experience for our users.