IS480 Team wiki: 2015T1 4Sight User Testing 3

Revision as of 22:01, 23 November 2015 by Amabel.lau.2012


  • Location: Clearvision @ 6 Nutmeg Road
  • Date: 16 November 2015 (Wednesday)
  • Time: 15:00


1. Determine the usability of the admin and analytics modules
2. Verify that the features built are in line with user requirements (admin and analytics modules)


  • Number of participants: 3
  • Roles of participants: admin and marketing team members


Each user was given a list of tasks to complete. Click here to download the tasklist. The tasklist also includes a questionnaire to gather feedback from users.

Summary of Survey Results

Click here to download the raw data and analysis.

Based on users' feedback, we have made the following enhancements:

1. Limited listings available for selection

Before After

UT3 enhancement1Before.png

UT3 enhancement1After.png

During the testing, one of the users pointed out that the options for the listing were too limited. She mentioned that the marketing team often refers to efforts from more than three months back to assess their effectiveness, so it would be more useful to conveniently view a full year's worth of data. Based on this feedback, we improved the listing to show one year's worth of data counting back from the current month.
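The "one year back from the current month" listing described above can be sketched as follows. This is only an illustration of the behaviour, not the team's actual code; the function name and label format are assumptions:

```python
from datetime import date

def last_twelve_months(today: date) -> list:
    """Return month labels ('Mon YYYY') for the current month and the
    eleven months before it, newest first."""
    labels = []
    year, month = today.year, today.month
    for _ in range(12):
        labels.append(date(year, month, 1).strftime("%b %Y"))
        month -= 1
        if month == 0:          # wrap from January back to December
            month, year = 12, year - 1
    return labels

# For the testing date, last_twelve_months(date(2015, 11, 16))
# runs from "Nov 2015" down to "Dec 2014".
```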

2. Difficult to view the conversion rate chart

Before After

UT3 enhancement2Before.png

UT3 enhancement2After.png

Both marketing members commented that the conversion rate chart was difficult to read: they could not tell at a glance which bars belonged to which marketing channel, and the bars for leads, converts and rate were so cramped together that they could not be differentiated. They also raised the concern that viewing the data will become even harder as the number of marketing channels grows. Based on this feedback, we added a chart zoom function (see After) that allows users to zoom in to a particular section of the chart. To address the concern about growing channels, users can narrow the view with the filter feature that was implemented before this user testing.
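For reference, the rate plotted alongside the leads and converts bars is conventionally the proportion of leads that converted; a minimal sketch of that computation (the function name and zero-lead handling are assumptions, not the team's actual code):

```python
def conversion_rate(leads: int, converts: int) -> float:
    """Conversion rate as a percentage of leads that converted,
    guarding against division by zero for channels with no leads."""
    if leads == 0:
        return 0.0
    return round(converts / leads * 100, 1)

# e.g. 8 converts from 40 leads gives a rate of 20.0 (%)
```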

3. Filter marketing channel for a specified time period

Before After

UT3 enhancement3Before.png

UT3 enhancement3After.png

Users pointed out that the current filter feature could be enhanced by allowing them to choose a time period. This will be helpful when saving filters, as they might only be interested in marketing channels with spending in a particular year and month.

4. Manage marketing channels

UT3 enhancement4.png

Users requested a page to manage their marketing channels. As a marketing channel can be purchased at different points in time within a year (e.g. Facebook can be purchased in Jan 2015, May 2015, Dec 2015, and so on), the list of marketing channels grows over time. A page to manage the marketing channels is useful for keeping track of only the relevant ones.

5. Differentiate marketing channels in dropdown selection

Before After

UT3 enhancement5Before.png

UT3 enhancement5After.png

Users pointed out that the current dropdown for marketing channels does not differentiate spending on the same marketing channel across different months. The team therefore modified the dropdown to address this concern.
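One straightforward way to make the entries distinguishable, in line with the After screenshot, is to append the spending month to each channel's label. A hypothetical sketch (the function name and label format are assumptions):

```python
from datetime import date

def channel_label(channel: str, year: int, month: int) -> str:
    """Build a dropdown label that tells apart purchases of the same
    channel made in different months, e.g. 'Facebook (May 2015)'."""
    return "{} ({})".format(channel, date(year, month, 1).strftime("%b %Y"))

# channel_label("Facebook", 2015, 5) -> "Facebook (May 2015)"
```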

User Study Summary

  • Overall, users liked the improvements and enhancements made to the analytics dashboard compared with the version they tested in User Testing 2.
  • The admin module received positive comments from users; no changes were required.
  • Across the entire list of tasks, usability scores did not vary drastically from user to user. All tasks were rated at least 4 out of 5.