
IS480 Team wiki: 2016T2 Team Sirius User Tests



User Test 3

Venue: Singapore Management University
Date: 16 and 17 Mar 2016, Wednesday and Thursday
Time: 1.00pm to 6.00pm
Duration: ~60 minutes
Number of Participant(s): 29
User Test: Instruction here

Objectives

  • To gather feedback on the usability of the user interface from existing users
  • To identify usability issues users might face

Scope for User Testing 3

  • Login/Logout
  • Create an experiment
  • Add a new goal to an existing experiment
  • Interpret results from the experiment

Participants

We have recruited 29 students with programming background as our participants.

Goals and Results

1. Participants should be able to log in and identify the number of experiments running.
   Goal reached. All participants were able to complete the task.
2. Participants should be able to create a new experiment.
   Goal reached. All participants were able to complete the task, although all took longer than expected.
3. Participants should be able to add new goals to an existing experiment.
   Goal reached. All participants were able to complete the task. However, some participants felt that the layout and the number of steps required were an issue.
4. Participants should be able to understand the experiment results.
   Goal reached. All participants were able to complete the task. However, some participants found that the labels used made the results hard to understand.

Key Findings

Overall Findings Summary

Quantitative Analysis
  • The experiment list page is full of experiments; no filter is provided.
  • Participants felt there was no feedback indicating that an experiment must be stopped before making changes to it.
  • Participants were confused by the goal selector's colours.
  • Participants were confused by the results tab; they were unable to find the statistical significance.
  • Some of the buttons placed deliberately to help users complete the task unaided went unused.

Possible changes before handover
  • Revise the goal selector and "How it works".
  • Prompt the user if the experiment has not been stopped before making edits.
  • Provide a filter to separate experiments by status.
  • Revise the results tab; rename "Raw Data" to "Statistical Significance".
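The "Statistical Significance" figure participants could not find is conventionally computed with a two-proportion z-test over the conversion counts of the original and the variation. A minimal sketch of that calculation, assuming the tool compares conversion rates this way (the counts and the 95% threshold below are illustrative assumptions, not the team's actual implementation):

```javascript
// Sketch: two-proportion z-test for an A/B experiment.
// Inputs are conversion counts and visit counts per version.
function zTest(convA, visitsA, convB, visitsB) {
  const pA = convA / visitsA;                           // Original's rate
  const pB = convB / visitsB;                           // Variation's rate
  const pPool = (convA + convB) / (visitsA + visitsB);  // pooled rate
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / visitsA + 1 / visitsB));
  return (pB - pA) / se;                                // z-score
}

// |z| > 1.96 corresponds to significance at the 95% confidence level.
function isSignificant(z) {
  return Math.abs(z) > 1.96;
}

const z = zTest(120, 1000, 165, 1000); // hypothetical counts
```

Surfacing this single number (or a "significant / not significant" verdict) directly in the results tab would address the confusion noted above.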

Detailed Findings

Participants should be able to login and identify the number of experiments running

Goal #1 was reached

[Screenshot: User Test 3, Task 1]

  • 82% of the participants rated the usability of this function as very easy.
  • Most of the participants were able to identify the number of experiments running.
  • However, 18% of the participants found that the ‘total number of experiments running’ label was not obvious enough.

Participants should be able to create a new experiment

Goal #2 was partially reached

[Screenshot: User Test 3, Task 2]

  • 42% of the participants rated the usability of this function as easy or above.
  • Some participants found the layout simple, which made it easy to make selections and understand how to create an experiment.
  • However, many participants found the CSS selector hard to use, as precision was required to select the elements.
  • Some participants were unable to locate “How does it work?” and were therefore unable to complete the task without guidance.

[Screenshot: User Test 3, Task 2 (part 2)]

  • 86% of the participants did not understand the difference between the ‘Original’ and ‘Variation’ tabs.
  • They may not have understood the concept behind A/B testing tools.
  • Most participants said that ‘Original’ is the previous layout of the page and ‘Variation’ is the edited layout after adding the jQuery code.
  • Some participants assumed that ‘Original’ was version A and ‘Variation’ was version B, which is technically correct.
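The ‘Original’/‘Variation’ split that confused participants is the core of A/B testing: each visitor is consistently assigned one version, and only the ‘Variation’ bucket receives the saved jQuery changes. A minimal sketch of deterministic bucketing (the hash function and the 50/50 split are assumptions for illustration, not the team's implementation):

```javascript
// Sketch: deterministic A/B bucketing. The same visitor id always maps
// to the same version, so the experiment stays consistent across loads.
function bucket(visitorId) {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple rolling hash
  }
  return hash % 2 === 0 ? 'Original' : 'Variation';
}
```

A visitor bucketed into 'Variation' would then have the experiment's jQuery snippet applied to the page; 'Original' visitors see the unmodified layout.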
Participants should be able to add new goals to an existing experiment

Goal #3 was partially reached

[Screenshot: User Test 3, Task 3]

  • 58% of the participants rated the usability of this function as neutral or below.
  • Some participants found that the instructions given were neither comprehensive nor intuitive.
  • Some participants commented that they did not know they had to stop the experiment before editing it; no prompt or instruction was given.
  • No confirmation message was given after adding a new goal to an existing experiment.
Participants should be able to understand the experiment results

Goal #4 was partially reached
  • 71% of the participants rated the usability of this function as neutral or below.
  • Some participants found the terms used confusing.
  • Some participants found there was a lack of guidance to help them understand the terms and the graphs displayed.
  • Some participants commented that inconsistencies in the figures shown (e.g. rounding) confused them.


User Test 2

Venue: IDA
Date: 4 Feb 2016, Thursday
Time: 9.30am
Duration: ~45 minutes
Number of Participant(s): 4
User Test: Instruction here

Objectives

  • To gather feedback on the usability of the user interface from existing users
  • To identify usability issues users might face

Scope for User Testing 2

  • Login/Logout
  • Create an experiment
  • Add a new goal to an existing experiment
  • Interpret results from the experiment

Participants

We have recruited four developers from IDA as our participants.

Goals and Results

1. Participants should be able to log in and log out successfully without any form of guidance.
   Goal reached. All participants were able to complete the task.
2. Participants should be able to create a new experiment.
   Goal reached. All participants were able to complete the task, although all took longer than expected.
3. Participants should be able to add new goals to an existing experiment.
   Goal reached. All participants were able to complete the task. However, some participants felt that the layout and the number of steps required were an issue.
4. Participants should be able to understand the experiment results.
   Goal reached. All participants were able to complete the task. However, some participants found that the labels used made the results hard to understand.

Key Findings

Overall Findings Summary

Quantitative Analysis
  • The labels used were inconsistent or confusing.
  • Participants felt the site was dull and could be more ‘lively’.
  • Participants took a long time to create, edit, and view the results of an existing experiment.
  • Some of the buttons placed deliberately to help users complete the task unaided went unused.

Possible changes before User Testing 3
  • Revise the red box and its label on the login page.
  • Revise the labels on the landing page shown after the user has logged in.
  • Fix the unnatural spacing in the goal form partial.
  • Reorder goal types to match the help text.
  • Fix the awkward snippet-code placement.
  • Make the step help icon more noticeable.
  • Update the weightage button.
  • Add a modal dialog at the last step to start the experiment immediately.
  • Fix the SG help trigger showing when the goal type is not 'click'.
  • Source free public-domain icons.
  • Implement step-click navigation.
  • Fix total visits showing up for every goal.
  • Make the create-experiment buttons more concise.
  • Remove the 'click to select' text on the snippet-code dropdown.
  • Narrow the update-weightage modal (currently too wide).
  • Make the experiment list table header capitalisation consistent.
  • Reviewer wanted: total visits over total conversions; pie-chart opportunity?
  • Add feedback upon experiment creation/edit (state changes in general).
  • Use a livelier colour scheme.
  • Surface validation errors that occur further down the page (currently the user needs to scroll down to see them).
  • Implement a no-experiment view.

Detailed Findings

Participants should be able to login and logout successfully without any form of guidance

Goal #1 was reached
  • All participants rated the usability of this function as easy.
  • However, participants were distracted by the red box (labelled ‘You need to sign in or sign up before continuing’). They therefore took extra steps to work out what the label meant before trying to log in.
Participants should be able to create a new experiment

Goal #2 was partially reached
  • All participants rated the usability of this function as neutral.
  • All participants took numerous steps to complete the task.
  • All participants had problems editing the weightage. They could not locate where to edit it, and one commented that the popup could be improved, mainly in its layout and labels.
  • Some participants did not notice the help button that would have helped them complete the task. The button is not obvious, and the terms used in the help menu differ from those in the selection.
  • All participants could not locate the sidebar of the web page given for the experiment: because the page is responsive, the sidebar is hidden when the screen is narrow, and the slide button (intended to let the participant view the full page) was not obvious until the participants were told about it.
  • Some participants were unclear about the CSS selector. They seemed unsure of the colours of the selection and of its main function.
Participants should be able to add new goals to an existing experiment

Goal #3 was partially reached
  • All participants rated the usability of this function as neutral.
  • Some participants commented that there was too much vertical spacing for the non-click goal on the configuration variation page.
  • Some participants commented that there were too many steps to add a new goal.


Participants should be able to understand the experiment results

Goal #4 was partially reached
  • All participants rated the usability of this function as neutral.
  • All participants took a longer time to interpret the result.
  • Some participants could not find the statistical significance figure.
  • Some participants felt that the terms used (e.g. tab labels, chart labels, scroll goal and page goal) were confusing. The charts could also be improved (e.g. a chart with a single variable is meaningless; the layout of the charts).


User Test 1

Venue: IDA
Date: 30 Oct 2015, Friday
Time: 3.00pm
Duration: ~25 minutes
Number of Participant(s): 2
User Test: Instruction here

Objectives

  • To gather feedback on the usability of the user interface from existing users
  • To identify usability issues users might face

Scope for User Testing 1

  • Create an experiment
  • Edit an existing experiment
  • Start/Stop an experiment
  • Preview an existing experiment

Participants

We have recruited our product owner and sponsor from IDA as our participants.

Goals and Results

1. Participants should be able to complete task #1 without any guidance from our test facilitator.
   Goal partially reached. Both participants were confused about the process (navigating from page to page) and unsure about what the inputs were for.
2. Participants should be able to edit an existing experiment.
   Goal reached. Both participants were able to complete the task. One participant gave a lower rating and took longer to complete the task.
3. Participants should be able to understand the experiment results.
   Goal reached. Both participants were able to complete the task. However, both rated the usability as difficult because the labels were confusing.
4. Participants should be able to find goals from existing experiments.
   Goal reached. Both participants were able to complete the task.
5. Participants should be able to start and stop the experiment easily.
   Goal reached. Both participants were able to complete the task.

Key Findings

Overall Findings Summary

Quantitative Analysis
  • When adding a new experiment, one participant was unsure whether to click the ‘load’ or the ‘next’ button to proceed to the next step.
  • Participants were confused by the inputs for “jQuery code” and “CSS selector”; they did not understand what these inputs were for.
  • Participants took a long time to edit the existing experiment.
  • The header of the result table was confusing; participants found it hard to understand the results.

Possible changes before User Testing 2
  • Revise the experiment-editing page to reduce the number of clicks required.
  • Revise the labels for adding a new experiment, or add more explanatory text/information.
  • Revise the labels on the result page.
  • Make the input placeholders more obvious.
  • Make the “Adding of new experiment” interface clearer about being a multi-step process.

Detailed Findings

Participants should be able to complete task #1 without any guidance from our test facilitator

Goal #1 was partially reached
  • All participants rated the usability of this function as very easy to use.
  • However, from our observation, the participants seemed rather confused about the naming of the labels.
  • One participant stated that the preview of the experiment was not obvious. The preview was placed at the bottom of the page, so the participant did not notice that the experiment had loaded; unnecessary scrolling was required.
  • One participant stated that there was a lack of information on the terminologies and parameters used. A first-time user might not understand what he/she is doing. The participant suggested that we change the terminologies or provide more instructions/information to guide the user.
Participants should be able to edit an existing experiment

Goal #2 was reached
  • All participants rated the usability of this function no less than neutral.
Participants should be able to understand the experiment results

Goal #3 was reached
  • All participants rated the usability of this function as difficult to use.
  • All participants seemed confused by the labels used (‘Total completions’ and ‘Total views’) and took a long time trying to understand what the labels meant.
  • One participant navigated between the home page and the result page twice, unsure whether he was on the wrong page.
  • However, all participants still managed to answer the question correctly and complete the task without any guidance from the test facilitator.
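The relationship between the confusing labels is simply a conversion rate: ‘Total completions’ divided by ‘Total views’. A sketch showing that calculation with one fixed rounding rule, so figures stay consistent across pages (the label names and the one-decimal-place choice are assumptions for illustration):

```javascript
// Sketch: conversion rate = total completions / total views, formatted
// with a single rounding rule so figures stay consistent across pages.
function conversionRate(totalCompletions, totalViews) {
  if (totalViews === 0) return '0.0%'; // avoid division by zero
  const pct = (totalCompletions / totalViews) * 100;
  return pct.toFixed(1) + '%';
}
```

Labelling the result page "Conversion rate = completions ÷ views" in one place would likely remove the need for participants to infer it themselves.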
Participants should be able to find goals from existing experiments

Goal #4 was reached
  • All participants completed the task quickly and rated the usability of the application no less than neutral.
Participants should be able to start and stop the experiment easily

Goal #5 was reached
  • All participants rated the usability of this function as very easy to use.
  • All participants used the start and stop buttons on the home page.
  • All participants did not use the stop button on the “view experiment” page.