IS480 Team wiki: 2011T2 Imateam User Testing
Latest revision as of 12:40, 23 April 2012
"Lack of documentation is becoming a problem for acceptance."
User Testing (Midterm)
User Testing (Midterm) Methodology
Participants
A pilot test will be conducted internally before the actual test begins. We will select a total of 30 participants for the actual user test. The participants' responsibility will be to attempt to complete a set of representative task scenarios presented to them as efficiently and promptly as possible, and to provide feedback on the usability and acceptability of the user interface. The participants will be directed to give honest opinions about the usability of the application, and to take part in post-session subjective questionnaires and debriefing. The best candidates to be engaged are:
Procedure
Participants will be divided into 3 categories: IMAPAC staff, SMU professors and fellow students. IMAPAC staff will take on the role of Admin users. Our user tests will be held at Seminar Room 2-2 in Singapore Management University over 2 days. A computer with the web application loaded will be set up in the testing environment. The facilitator(s), seated in the same room, will monitor each participant's interaction with the web application. Note takers and a data logger will monitor the sessions from a corner of the room to record quantitative and qualitative data. The test sessions will be photographed.
The debrief session will be taken seriously, as it allows participants to convey their exact feelings about the web application. After the debrief session, the testers will thank the participants for their involvement.
Materials/Documents
The following materials have been developed for record and evaluation purposes:
- Consent Form
- User Manual
- Questionnaires
- Test Cases
User Testing (Midterm) Analysis
Data Collected
Results Analysis

We derived the following information from the data that we collected:
Post-Midterm UT Changes
Changes Made
Design
User Testing(Final)
User Testing (Final) Methodology
Participants
A pilot test was conducted internally before the actual test began. We selected a total of 30 participants for the actual user test. The participants' responsibility was to attempt to complete a set of representative task scenarios presented to them as efficiently and promptly as possible. Their test results were recorded by the facilitators, who were seated next to them. The best candidates engaged were:
Procedure
Participants were divided into 3 categories: IMAPAC staff, SMU professors and fellow students. Our user tests were held at Singapore Management University over 3 days. A computer with the web application loaded was set up in the testing environment. The facilitator(s), seated in the same room, monitored each participant's interaction with the web application. Note takers and a data logger monitored the sessions from a corner of the room to record quantitative and qualitative data.
Materials/Documents
The following materials have been developed for record and evaluation purposes:
- Consent Form
- User Manuals
- Test Cases
User Testing (Final) Analysis
Data Collected
After receiving feedback from Prof. Richard Davis at the midterm presentation that quantitative data is more important, we decided to focus on quantitative feedback for our Final UAT. However, we did not forgo qualitative data entirely, as we felt it still held some value.

Data we collected during the user testing sessions included:
- Qualitative performance measures:
  - User feedback and comments
- Quantitative performance measures:
  - No. of passed and failed test cases for each user

Results Analysis

We derived the following information from the quantitative data that we collected:
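A minimal sketch of how such quantitative results could be tallied per user and overall. The participant names and pass/fail outcomes below are invented for illustration only; they are not the team's actual UAT data.

```python
# Hypothetical test-case outcomes per participant (True = passed).
results = {
    "User 01": [True, True, False, True, True],
    "User 02": [True, False, False, True, True],
    "User 03": [True, True, True, True, True],
}

# Per-user summary: passed, failed, and pass rate.
for user, outcomes in results.items():
    passed = sum(outcomes)
    failed = len(outcomes) - passed
    rate = passed / len(outcomes) * 100
    print(f"{user}: {passed} passed, {failed} failed ({rate:.0f}%)")

# Overall pass rate across all participants and test cases.
overall = (sum(sum(o) for o in results.values())
           / sum(len(o) for o in results.values()) * 100)
print(f"Overall pass rate: {overall:.0f}%")
```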
Post-Final UT Changes
Changes Made
- User Testing (Attendee)
- User Testing (Admin/Organizer/Speaker)
- User Testing (with sponsors)