IS480 Team wiki: 2016T2 Team Sirius Final Wiki

From IS480
Sirius logo.png


Project Progress Summary

Sirius presentation slides icon.png Final Slides   Sirius deployed link icon.png Deployed Site   Sirius poster icon.png Poster   Sirius pitch video icon.png Pitch Video

Project Progress

  • Current sprint: Sprint 13
  • Sprint period: 31 March 2016 to 14 April 2016
  • Major milestone: Final Presentation

Project Highlights

Sirius map.png

Before the team accepted this project from our sponsor, the team was warned of its high technical complexity. Nonetheless, the team accepted the project and took about 4 weeks to familiarise themselves with Ruby on Rails and JavaScript. Along the way, new requirements were given to the team. These were prioritised in collaboration with the Product Owner: modules like Admin and Mobile Device Experiments were added and completed, while Multi-page Experiments, Multi-variant Experiments and the Visual Variation Editor were deprioritised and later dropped.

We have also run experiments for 8 websites using Sirius AB. The team delivered 100% of the agreed-on scope as of 8 April 2016.

Tester for Sirius AB
"Sirius, is in its current form, very well thought of in terms of the various functions for A/B Testing, Engagement, Click, Page Scroll and Page Dwell as well as the reporting tools. I really like the click selection tool. We are looking at a go to market product that with more iterations and improvements to the UXID can be a great solution in the marketing tech. field." -- Aaron LEE Kwang Siong, Manager, Web & Digital Media - Office of the Dean (SMU SIS)

Project Challenges

There were several challenges, spread across the project's milestones.

Configuring variations and goals on our platform and then injecting them into the desired page through our snippet code
This was the first pain point the team had to deal with. The team had to prove that this could be done by the Acceptance milestone in order to proceed with the project. To ensure that this milestone was met, the team set aside enough time to familiarise themselves with Ruby on Rails and JavaScript before implementation started.
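Conceptually, the injection boils down to reading the experiment's configured variation and applying it to the target page's DOM. A minimal sketch of that idea (the function name and the shape of the variation object are illustrative, not the actual snippet code):

```javascript
// Hypothetical sketch: apply one configured variation to a page.
// "variation" would come from the experiment configuration on our platform.
function applyVariation(doc, variation) {
  var el = doc.querySelector(variation.selector);
  if (el) {
    el.innerHTML = variation.html; // swap in the variant content
  }
  return !!el; // report whether the target element was found
}
```

In the real snippet this would run for each variation in the experiment, after the snippet code is loaded into the visitor's page.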

Visual goal selection
Users need an easy way to configure goals, and the visual goal selection provides that. The team had to spend a lot of time studying an open-source library, Selector Gadget, in order to customise it to suit the needs of the system. The effort required was underestimated, but the team still managed to deliver this function on time.

Admin module
The system has to support admin account(s), which must be able to manage all experiments and user accounts. This was challenging as the authentication library used, Devise, does not handle user management. Hence, the team had to spend more time delving deeper into Devise in order to override existing functions and provide user management features. As a result, the team had to reprioritise the planned scope and drop modules accordingly (as explained in our change management).

Project Achievements

Following the Scrum methodology has helped the team learn to adapt quickly to changes in the project scope while still delivering a potentially shippable product at the end of each sprint. As every user story was crafted to suit the needs of the business, the team could be sure that no unnecessary features were built. Sprint reviews at the end of each sprint gave the team a tighter feedback loop, and any need for changes was highlighted then. Consequently, scope creep was minimised, since the product owner had to prioritise tasks at every sprint review.

Practicing Continuous Integration (CI) has enabled the team to quickly track down integration bugs since small change sets are tested each time. The team never had to face last minute chaos during sprint reviews and supervisor meetings as the team could do frequent code pushes. CI has also enabled automated testing, helping us focus on developing functional and quality code.

Project Management

Project Status

Team Sirius has delivered 100% of the agreed-on project scope to our sponsor.
Sirius project final status.jpeg

Midterm vs Final Scope

Midterm Final
Sirius scope v8.png

Siriu scope version v9.png

Team Sirius has completed all the modules that were promised to the sponsor, and the sponsor was happy and satisfied with the scope delivered. Due to the high complexity and the limited time left, the team decided to drop the Visual Variation Editor during Sprint 11. The team consulted both the supervisor and the product owner, and both agreed that the team should focus on improving the completed features rather than start on a module that might not be completable in time.

See our Project Scope page for more details on previous changes

Midterm vs Final Project Schedule

Sirius midterms schedule.png
Sirius final schedule.png

There are no major changes to the project schedule since the Midterm presentation. The team conducted User Test 3 on 16-17 March. The buffer time in Sprint 12 was used to fix bugs and improve the UI based on the feedback given by our sponsor and the User Test 3 testers. Handover to our sponsor was completed on 8 April, before the team's exams. As mentioned, the Visual Variation Editor was dropped due to its high complexity and the limited amount of time left. Lastly, the team achieved its X-Factor of getting 8 websites to run experiments on our system.

Project Metrics

Team Velocity

Formula: Average of accepted stories points of 3 sprints
Sirius sprint velocity !3.png
Explanation: The average accepted story points is around 9 for Sprint 11. Note that Sprints 10, 12 and 13 covered bug fixing and UI improvement activities, where each bug is logged as a story with 0 story points and UI improvements carry no story points. Hence, no story points were completed/accepted in those sprints.
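The velocity formula above can be illustrated with a small helper (the story-point figures in the test are made up for illustration, not the team's actual numbers):

```javascript
// Illustrative computation of team velocity: the average of accepted
// story points over the most recent 3 sprints.
function teamVelocity(acceptedPoints) {
  var last3 = acceptedPoints.slice(-3); // keep only the last 3 sprints
  var total = last3.reduce(function (sum, p) { return sum + p; }, 0);
  return total / last3.length;
}
```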

Sprint Burndown


  • Planned: Total planned story points over number of days in a sprint
  • Actual: Actual story points completed each day in a sprint

Burndown charts from some sprints since Midterm are highlighted here.
See our Midterm Wiki page for more details on sprints prior to Midterm.

Sirius burndown sprint 10.png
Explanation: There were no story points cleared in this sprint because we were fixing bugs and addressing points raised during Midterm presentation.

Sirius burndown sprint 11.png
Explanation: All planned bug fixes and stories were completed on time and before User Test 3.

Risk Management

Risk Type: Technical Risk
Risk Event: Unable to develop the WYSIWYG editor due to its high complexity
Likelihood: High
Impact: Medium
Mitigation: Conduct active feasibility checks and notify the sponsor if there is a need to rescope the function

Explanation: This risk was managed after the Midterm Presentation, as mentioned above. We worked with the sponsor to scope down and prioritise a smaller subset of the WYSIWYG editor features. While we agreed at the end of Sprint 10 to scope down and prioritise the move/rearrange feature, further investigation during Sprint 11 showed that it was not feasible to complete in the remaining time. We eventually negotiated with the sponsor to drop this function.
See our Risks page for the full list of potential risks

Unaccepted Stories of Each Sprint

Sirius midterm scrum.png
Explanation: Any bug found during a sprint is logged as a story with 0 points. If the bug impedes progress or prevents the team from delivering a shippable product, it is prioritised and fixed immediately; otherwise, it is prioritised together with the product backlog at the start of the next sprint. See our Stories page for each sprint's user stories.

Technical Complexity

1. Experiments Preview

As covered previously in the Midterm, the Same-Origin Policy applied by the browser creates several problems for our experiment previews. While the issues with references to CSS files, JS files, and images have been resolved, we noted that the self-hosted webfonts were not working. In particular, if a website is using self-hosted webfonts from a server that does not explicitly set the Cross-Origin Resource Sharing (CORS) headers to permit use from another domain, the browser will refuse to load the webfont.

We fixed this issue by proxying all the webfonts through our server, so that the previewed HTML page and all of these assets are considered part of the same origin; the browser then stops checking for the CORS headers. This is done in a two-pronged approach.

First, webfonts referenced by relative URLs are handled by proxying the CSS file, so that the webfont relative URLs resolve against the proxied CSS file's URL. We accomplish this by injecting additional JavaScript when loading the preview, which goes through the entire page and changes the URL of all CSS stylesheet <link> tags.
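A minimal sketch of this stylesheet-rewriting step (the proxy URL and function name are illustrative assumptions, not the actual implementation):

```javascript
// Hypothetical sketch: route a stylesheet through our proxy so that
// relative @font-face URLs inside it resolve against our origin.
function proxiedCssUrl(href, proxyBase) {
  return proxyBase + encodeURIComponent(href);
}

// In the preview, injected JavaScript would then repoint every
// stylesheet <link> tag at the proxy (DOM usage shown for context):
//   document.querySelectorAll('link[rel="stylesheet"]').forEach(function (link) {
//     link.href = proxiedCssUrl(link.href, 'https://preview.example/proxy/css?url=');
//   });
```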

Second, webfonts referenced by absolute URLs are handled by directly manipulating the CSS @font-face rules, which live in the browser's rather obscure CSS Object Model (CSSOM). Unfortunately, not only is the CSSOM poorly documented, but there also seems to be a bug in Chrome that stops us from modifying the @font-face rule in place, so we have to remove the rule and re-insert a copy with the modified URL at the same position.
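A sketch of this remove-and-reinsert workaround (the proxy URL scheme and function name are illustrative assumptions):

```javascript
// Hypothetical sketch: rewrite every absolute url(...) in a @font-face
// rule's cssText to point at our proxy instead.
function proxyFontFaceCss(cssText, proxyBase) {
  return cssText.replace(/url\((['"]?)(https?:\/\/[^'")]+)\1\)/g, function (_, q, url) {
    return 'url(' + q + proxyBase + encodeURIComponent(url) + q + ')';
  });
}

// Applied against a stylesheet's CSSOM (shown for context). Because
// in-place edits to @font-face rules fail in Chrome, the rule is
// deleted and a rewritten copy is reinserted at the same index:
//   for (var i = 0; i < sheet.cssRules.length; i++) {
//     var rule = sheet.cssRules[i];
//     if (rule.type === CSSRule.FONT_FACE_RULE) {
//       var rewritten = proxyFontFaceCss(rule.cssText, proxyBase);
//       sheet.deleteRule(i);
//       sheet.insertRule(rewritten, i);
//     }
//   }
```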

This covers most of the edge cases for the experiment preview. However, there remains one limitation: the webfonts still cannot be loaded for sites using TypeKit or similar subscription services, where usage is metered on a per-pageview or per-domain basis. Those services check the requesting domain before sending the webfont to the requesting browser, such as TypeKit checking the Referer header, but naturally preview.sirius.lol is not on anyone's set of declared domains. As a result, those services return an error instead of the webfont file. This cannot be fixed on our end: the Referer header is part of the core web security model and not something we can control.

2. Flash of Unvaried Content

An early request from our sponsor was that they wanted to avoid the momentary 'jump' or 'flash' that can happen when the variation is being applied to the page. This is especially acute for variations that take longer to take effect, such as those that add or replace images. They wanted to have the option to trade off page load time to ensure that the visitor doesn't see that sudden change in the page.

When we tried moving our variation code to run synchronously at the end of the page <body>, instead of asynchronously after the page's Document Object Model (DOM) was loaded, we found that the 'flash' was still persisting. It appears that browsers do some optimisation to improve the page load experience, and start painting the page even before the entire page's HTML is parsed.

The workaround we implemented was to return our snippet to the <head> tag, and then hide the entire page body as early as possible, even before the browser starts parsing our JavaScript support libraries. This ensured that the visitor is less likely to see anything on the page at all, until we unhide the page body after the variations' changes are fully loaded.

However, once we started hiding the page body, we found that the more 'dynamic' pages with photo carousels and dynamically laid out pages were not rendering properly. It appears that they had some issues detecting the page dimensions when the page body was hidden with display: none;, as that CSS directive instructs the browser to remove the element from the layout flow entirely.

Hence, we added an additional workaround that uses a different method of hiding the page: instead of setting the display: none; directive, we keep the page "visible" and in the layout flow, but move the entire <html> element off the top-left of the viewport, such that it does not render on screen at all. This works well: the page renders correctly and unseen until it finishes loading, at which point we move it back to its correct position.
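A minimal sketch of this off-screen hiding technique (function names and the offset value are illustrative, not the actual snippet code):

```javascript
// Hypothetical sketch: hide the page by moving <html> off-screen.
// Unlike display: none;, the page stays in the layout flow, so
// carousels and dynamically laid out pages can still measure
// their dimensions while hidden.
function hidePage(doc) {
  var html = doc.documentElement;
  html.style.position = 'absolute';
  html.style.left = '-99999px'; // move the whole page out of the viewport
  html.style.top = '-99999px';
}

function showPage(doc) {
  var html = doc.documentElement;
  html.style.position = ''; // restore normal flow once variations are applied
  html.style.left = '';
  html.style.top = '';
}
```

hidePage would run as early as possible in the <head>, and showPage after the variations' changes are fully loaded.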

Quality of Project

Project Deliverables

Stage | Specification | Modules
Project Management | Meeting Minutes | Internal, Supervisor & Sponsor Meeting Minutes
 | Project Schedule | Project Schedule
 | Metrics | Project Metrics
 | Risk Management | Risk Management
Requirements | Project Scope | Project Scope
 | User Stories | User Stories
Analysis | Persona and Scenario | Persona & Scenario
 | Market Research | Market Research
 | Architectural Design | Architectural Design
Design | Prototypes | Mid & High Fidelity Prototypes
Testing | User Test Plan & Results | User Test Plan & Results
Project Handover | Introduction Slides | Delivered via private folder on Google Drive
 | User Manual | User Manual
 | Developer Manual | Delivered via private folder on Google Drive
 | Style Guide | Delivered via private folder on Google Drive
 | Source Code | Delivered via private folder on Google Drive


Performance: Our JavaScript snippet, containing the supporting libraries, our snippet code, and the experiment configuration data, is served to every website visitor. To reduce the impact on website load time, we enabled GZip compression on the Nginx web server that sits in front of the app server. This brought the typical snippet size down from 121 KB to 42 KB (a 65% decrease in file size).
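An illustrative Nginx configuration fragment for this kind of compression (the directive values here are assumptions for illustration, not the deployed settings):

```nginx
# Enable GZip compression for the JavaScript snippet and other text assets.
gzip on;
gzip_types application/javascript text/css;  # compress JS and CSS responses
gzip_min_length 1024;                        # skip tiny responses
gzip_comp_level 6;                           # balance CPU cost vs. ratio
```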

Also, our JavaScript snippet is designed to further minimise the impact on page load time by default. When the snippet is loaded by the browser in the page <head>, it is executed synchronously on the rendering critical path. Therefore, we ensure that only the absolutely required work, checking for the debug-mode flag and attaching two event handlers, is done then. All other work is deferred until the page's DOM is fully loaded, at which point the remaining code runs asynchronously without blocking page load.
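The deferral pattern described above can be sketched as follows (the flag name and function names are illustrative assumptions, not the actual snippet code):

```javascript
// Hypothetical sketch: only the cheap work runs synchronously in <head>;
// the heavy lifting waits for DOMContentLoaded.
function initSnippet(win, runExperiments) {
  // Synchronous, on the critical path: check the debug flag...
  var debug = /sirius_debug=1/.test(win.location.search);
  // ...and attach a handler; everything else is deferred.
  win.addEventListener('DOMContentLoaded', function () {
    runExperiments(debug); // heavy work runs only after the DOM is loaded
  });
}
```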

Maintainability: To ensure that our code remains maintainable after handover, we follow the standard JavaScript and Ruby coding conventions, as well as the community style guides. We configured RuboCop, a Ruby static code analyser that checks our source code against the community Ruby style guide. Comments are included for code that requires an explanation of why it was done that way. We also ensure that our Git commit messages are meaningful and consistent with best practices for Git commit messages.

Usability: While web A/B testing may be a familiar term, there are concepts within it that potential users may not understand. Hence, to make Sirius AB more usable, we conducted multiple user tests and gathered valuable feedback. By understanding our users' behaviour, we made sure that help is available when needed. On our platform, help buttons explain certain terms and walk the user through certain features.

Tooltips on mouseover
Help messages show up after clicking on "?"
Step by step Visual Goal Selector instruction


Deployed site: http://sirius.lol (User Manual)

This site is running 'live' on an EC2 instance inside the sponsor's Amazon Web Services account.


User Test 1

Venue: IDA
Date: 30 Oct 2015, Friday
Time: 3.00pm
Duration: ~25 minutes
Number of Participants: 2
User Test: Tester Instructions
User Test Results: Test Results

User Test 2

Venue: IDA
Date: 4 Feb 2016, Thursday
Time: 9.30am
Duration: ~45 minutes
Number of Participants: 4
User Test: Tester Instructions
User Test Results: Test Results

User Test 3

Venue: SMU SIS IS480 Lab
Date: 16 Mar 2016, Thursday and 17 Mar 2016, Friday
Time: 11.30am
Duration: ~60 minutes
Number of Participants: 29
User Test: Tester Instructions
User Test Results: Test Results

UI Fixes based on User Test 3

Sirius ut3 change1.pngSirius ut3 change2.png Sirius ut3 change3.png


Team Reflection

Team Sirius

The journey so far has been tough but a great learning experience for Team Sirius. We have learnt about each other’s strengths and weaknesses, and worked towards complementing each other, producing quality work. Although we want to produce the best in the interest of our sponsors, we also need to ensure that no one burns out. We are really proud of where we have gotten so far!

Supervisor Testimonial

Mr Prakash:
Team Sirius with Supervisor, Mr Prakash.

The world has entered a digital era and businesses are experiencing a paradigm change in the ways they reach out to customers and make sales. With every business aspiring to go online it is imperative that businesses know what will appeal to their customers, and the solution is A/B testing and data analytics. This project adds value to conventional A/B testing by bringing in more flexibility for businesses in the ways an experiment is configured, and therefore I see good potential in the work done by Team Sirius being adopted by industry. All the best!

Sponsor Testimonial

Team Sirius with Sponsor, Eyung and Product Owner, Jean

Team Sirius produces good quality work, and provides regular updates on the progress of the product development. The team is technically strong, and is willing to explore ways of improving the user experience design of the product.

This is one of the most motivated and competent FYP teams from SMU I have ever worked with. The problem statement was an extremely challenging one. The functions and quality of the developed product have far exceeded our expectations!

Individual Reflection

It was a very enriching experience and I learnt to really step out of my comfort zone to speak to people about web A/B testing. It also helped to have understanding teammates, supervisor, and sponsors.

Chiang Fong:
I have learnt that software estimation is more difficult than I expected, both in terms of estimating the time and effort required for user stories, as well as how much we can get done in a sprint.

I have learnt to appreciate how Scrum and its sprint reviews enable a better change management process for a real project, where real business requirements were provided. This has helped the team better manage their workload and at the same time deliver quality work and value to our sponsors.

I started off wanting to attain a greater understanding of the Rails platform, but my role as the frontend lead has instead made me appreciate the sheer complexity of interactions between JavaScript, CSS and HTML in the modern web browser. The number of things that can be done on the client side just keeps increasing.

Lai Ho:
I have learnt that user experience is important for users to understand and operate the system well. Thorough user testing must be conducted to create the best user experience for our users.