2013-14 Term 1 G2 D'PENZ

From Interaction Design and Prototyping


Assignment Deliverables Self Assessment Comparison to other Teams
A3. Low-fidelity Prototype Deliverables DateTime: 12th September 2013 0146hrs
  1. Problems and solutions are believable and based on the IS480 project.
  2. Personas are clear and are of important roles in the company.
  3. The scenarios show the real problems staff face while doing their tasks in the company, and clearly illustrate the users' tasks. In addition, we included a scenario for the proposed system as if it were implemented.
  4. Flow diagrams match the scenario
  5. Prototype images are clear and there are captions in every image for better explanation

We are glad that we have met all the requirements of the assignment; however, we feel that we can do a better job and improve on it.

Things we can improve:

  1. Better prototype presentation, such as videos, which are more interactive
  2. Think of alternative designs, which might lead to better ideas
  3. Add more personas, which might help us uncover problems in the as-is system

Work is commented based on the grading rubric. Be as objective as you can on how you feel you did well and where you think you can improve.

DateTime: 13/09/2013 0820hrs

Carpe Diem

They used 3 personas which are clear and descriptive. They included comic strips for their scenarios, which we feel explain the scenarios better.

A4. Heuristic Evaluation (End of Iteration 1)

Deliverables DateTime: 19th September 1105hrs
  1. We have compiled all the problems from our evaluators
  2. Other than the solutions provided by the evaluators, we came up with our own solutions as well, and provided solutions for all of the problems
  3. Generally, we feel that our results and descriptions are clear and easy to read. Self-evaluation was done as well.
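The wiki does not show the format we used to compile the evaluators' findings, but the grouping step can be sketched. Below is a minimal, hypothetical example (the heuristic names follow Nielsen's well-known list; the findings and severities are invented for illustration) of collecting problems under each heuristic and surfacing the worst severity first:

```python
from collections import defaultdict

# Hypothetical findings from evaluators: (heuristic, severity 0-4, description).
findings = [
    ("Visibility of system status", 3, "No progress indicator when saving a form"),
    ("Error prevention", 2, "Delete has no confirmation dialog"),
    ("Visibility of system status", 1, "Upload success message disappears too fast"),
]

# Group problems under each heuristic.
by_heuristic = defaultdict(list)
for heuristic, severity, desc in findings:
    by_heuristic[heuristic].append((severity, desc))

# Report each heuristic with its worst severity, most severe problems first.
for heuristic, items in sorted(by_heuristic.items()):
    worst = max(sev for sev, _ in items)
    print(f"{heuristic} (worst severity {worst}):")
    for sev, desc in sorted(items, reverse=True):
        print(f"  [{sev}] {desc}")
```

Sorting each heuristic's problems by severity makes it easy to decide which solutions to prioritise.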


DateTime: 19/09/13

The Eagles

We find that it was thoughtful of them to provide an overview of the heuristic evaluation with the graphs and charts.
A5. High-fidelity Prototype 1: A Skeleton and a Plan Deliverables DateTime: 26th September 1130hrs
  1. Scenarios and Flow: Our flow diagram is consistent with our scenario and shows the users' goals. We have covered how the user interacts with the system.
  2. Plan: We completed all work items within a day of the start date. Each item is assigned to only one person. Something we need to work on is making our plan show how we leave room for the unexpected.
  3. Changes: Changes to prototype are clear and well-justified, having tackled all the feedback from heuristic evaluation.
  4. Prototype: All major views of the system are present, and we have linked to our application on appspot. Screenshots are consistent with our workflow. Perhaps, we could have done up other less major views to give an overview of the other functions our system has.
  5. General: All components are in place accordingly and easy to read.
DateTime: 26th September 1155hrs


Their prototype looks good and we can see the effort taken to do so. Scenarios provide enough details. They did up alternate flow diagrams as well. The plan was detailed and shows each work item being assigned to one person only. Changes to prototype included both 'before' and 'after' to help us understand it better.
A6. High-fidelity Prototype 2: Meat on the Bones Deliverables DateTime: 03 October @ 02:00hrs

As week 7 comes to an end, all of us are tired from the workload of not only IDP but our other modules as well. We did not manage to complete all the items allocated for this week due to the heavy amount of work, but we have reallocated them to next week, which we believe will allow us to exercise more control over the deliverables. In this week's assignment, we have highlighted the necessary changes to our implementation plan and the changes made. Our major views are fleshed out with controls and our progress is charted accurately.

  1. Progress: As mentioned, most of the deliverables are on time and changes have been made and well indicated in the implementation plan.
  2. Screenshots: We have updated the UI according to the deliverables and added in the necessary screenshots for the respective updates. Also, we have updated the illustrations to help users understand the flow of the screenshots.
  3. Scenarios and Flow: Changes are made to the explanation and write-up, for greater clarity.

Team YOLO - Bill Splitter

We were thoroughly impressed not only by the fact that they developed such a nice interface with PhoneGap, but also by the rigour of their implementation plan.
A7. High-fidelity Prototype 3: Ready for Testing Deliverables 17 Oct 2013 12:18 AM

  1. Prototype: We polished the interface of our procurement system, keeping in mind the feedback that we have received. However, much can be done to improve the experience for our clients.
  2. Deliverables: We have made changes to the prototype that are consistent with the implementation plan.
  3. Changes: Changes are well justified and illustrated; screenshots of our changes are specifically and neatly displayed.
  4. Progress: Progress is in line with our schedule.
  5. General: Changes and progress updates are easy to read with the aid of screenshots.

Overall, this particular assignment spanned the longest duration of any so far.


The Eagles

The UI looks very polished and the changes they made seem to be consistent with the original implementation plan.
A8. Laboratory Test

(End of Iteration 2)

Deliverables DateTime: 24th October 2013 1242hrs
  1. Goals: Our goals fit our client's problems and solution; we kept the goals simple to make things clear and understandable.
  2. Tasks, Data, & Documents: We feel that we have done well in this area: the tasks are related to our project, conditions are clearly labelled, and all documents were prepared for all testers.
  3. Results: The data collected are consistent. Instead of only 3 testers, we went on to find one more tester for our project test. We used tables to present our results to make them clearer.
  4. Changes: We adopted the changes we felt were useful and changed our prototype accordingly; this was clearly documented. Changes are consistent with the results given by our testers.
  5. General: We feel that we have done well and made all the documents easy to read.

However, we feel that we could do much more if we had the time; the test was really useful. Given more time, we would want to have more testers for our prototype and gather more results. Also, we could come up with more scenarios for a more comprehensive check of our system, and gather all the questions into the help tab we built for our prototype.
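The wiki does not reproduce the results tables themselves, but the kind of summary they contain can be sketched. Here is a minimal, hypothetical example (the task names and timings are invented, not taken from the actual lab test) of summarising task completion times across the four testers:

```python
import statistics

# Hypothetical task completion times (seconds) from the four lab testers.
times = {
    "Create new procurement project": [95, 110, 80, 102],
    "Update project status":          [40, 55, 48, 61],
    "Generate cost report":           [130, 150, 125, 160],
}

# A simple results table: mean and spread per task across testers.
print(f"{'Task':35} {'mean (s)':>9} {'stdev':>7}")
for task, t in times.items():
    print(f"{task:35} {statistics.mean(t):9.1f} {statistics.stdev(t):7.1f}")
```

A large spread on a task is a hint that some testers struggled with it, which is where a fourth tester adds real value over three.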


27/10/2013 Diversity

  1. We like how they did the Goals section; it is very clear what they are going to test and how the user should react to the different scenarios.
  2. They consolidated the results with graphs, which makes the results easier to compare and understand.
A9. Web Experiment 1: Setup Deliverables DateTime:
31st Oct 2013 1055hrs
  1. Changes to our prototype are well-justified based on testers' feedback. We can aim to be more ambitious with our changes as the current ones are simple small changes.
  2. The experiment purpose fits with the project's problem and solution. Variables are consistent with the experiment purpose and scope. They could be tied to the scenarios more.
  3. Experiment walk-through is clear and well-illustrated. It is integrated with Usability Hub.

As our prototype was hosted on an internal server, we were limited in the tools available. Thus, we decided to use Usability Hub, which requires only screenshots. Our changes could have been more ambitious, but we felt that it was important to test small changes as well, because, as discussed in class, even small changes can have drastic effects on user responses.

DateTime:31 Oct 2013 1817hrs


We love the way they used Usability Hub to set up the nav flow user test. We initially felt that the test wasn't that useful; however, seeing how they used it, we are rethinking this and may add a nav flow user test as well. In their experiment walk-through, they used diagrams to explain the walk-through, which is very easy and clear to understand.
A10. Web Experiment 2: Analysis Deliverables DateTime: November 11, 2013 @ 0915hrs
  1. Participants: We have at least 20 participants; in fact we retrieved more than 40 responses from them, as we have 4 web tests (2 for the help button and 2 for the search function).
  2. Results: Our results include a statistical analysis.
  3. Conclusions & Changes: We feel that our conclusion is sound and reasonable, as the results support the changes we decided to make or keep.
  4. General: In general, our web test results are clear and simple. For the prototype, we included the screenshots we put up for the 5 tests, which helps readers better understand the changes.
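The wiki does not say which statistical analysis was used, but with two variants of a control (e.g. the help button) and success/failure responses, a standard choice is a two-proportion z-test. Below is a minimal sketch; the variant names and counts are hypothetical, not the actual experiment data:

```python
import math

# Hypothetical click-through counts for two help-button placements
# (names and numbers are ours, not from the actual experiment).
success_a, n_a = 14, 22   # variant A: help button in the top nav bar
success_b, n_b = 8, 21    # variant B: help button inside a side menu

# Two-proportion z-test: is the difference in success rates significant?
p_a, p_b = success_a / n_a, success_b / n_b
p_pool = (success_a + success_b) / (n_a + n_b)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_a - p_b) / se

# Two-sided p-value from the standard normal CDF (via math.erf).
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
print(f"z = {z:.2f}, p = {p_value:.3f}")
```

With samples this small the test is low-powered, which is one reason gathering more participants, as we noted below, would strengthen the conclusions.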

Compared to other market research assignments, this assignment was more difficult as there was a stronger need to justify our choices. Gathering participants for a long survey was not any easier, which made us realise the importance of crafting a survey that is relevant and concise in order not to waste their time. The results were encouraging, as they tallied with our original hypothesis.

Due to the heavy workload and project presentations, we were unable to find more time to polish our web test results. We could improve the results tab to include more analysis, and get more participants to do our web test.


Team Diversity

The experiments were well conducted. The sample of people surveyed was large and the analysis was thorough.

A11. Poster Session (End of Iteration 3)



Project Name ING Bank's Procurement Workflow Management System
Design Brief IS480 D'PENZ Currently, all information related to projects undertaken by Procurement is stored in documents ranging from paper-based forms to Excel spreadsheets. This makes it difficult for them to access important information quickly and to track the progress of projects. They are unable to make use of the data on hand to analyse potential cost savings for the bank.

The new system will:

  • lead users through the entire procurement process
  • track the progress of each project
  • manage KPIs
  • generate reports
  • analyse supplier performance
Problem No Common Platform for Projects

There is no common platform to store all of the information and track project status. Thus, when the various parties involved need to look up the information, they have to ask the party holding the Form at that moment to scan a copy or email them the details needed. This causes a lot of inconvenience; with a common platform, they could access the required information on their own. Also, as details are updated on the Form first, the information on the Calendar is not real-time or accurate.

Solution We will create a workflow management system which everyone can access to see all information related to the project that they are involved with, and also update details directly.

G2 Deliverables

Iteration 1
  • A2 Observations
  • A3 Personas
  • A3, A5 Scenarios
  • A3 Alternative Designs
  • A3 Paper Prototype
  • A3, A5 Flow Diagram
  • A4 Heuristic Evaluation
Iteration 2
  • A5 Implementation Plan
  • A8 Lab Test
Iteration 3
  • A9 Web Experiment Setup
  • A10 Web Experiment Analysis
  • A11 Poster
  • A11 Video

High-fidelity Prototypes

Runnable 1
  Name: Main App
  Type: Stand-alone application
  Platform: Android 4.0.3 on Samsung Galaxy SII
  Toolkits/Frameworks Used: Android SDK r22
  Major Releases: Iteration 2, Iteration 3
  GitHub repository
Runnable 2
  Name: Admin App
  Type: Web application (Chrome)
  Platform: Microsoft Windows 7
  Toolkits/Frameworks Used: jQuery v1.10.2, jQueryUI v1.10.3
  Major Releases: Iteration 2, Iteration 3
  GitHub repository