Outcomes
Users were asked to watch the video and provide feedback. They could pause, rewind, scrub and re-watch, and were encouraged to ask questions at any time.
Most users watched the video straight through. One person watched it twice and a few stopped at the credits, but viewing was largely linear.
I measured user feedback through observation, in-person questions and an online questionnaire. The data from these measures can be found here.
The feedback I was after covered the concept (was it clear, easy to understand, fun and engaging?) and the prototype format (video and audio quality).
Reflections
The video format for the prototype was a fairly quick way to get feedback on the concept without having to code anything. It was more detailed and controlled than simply describing my idea to someone, and it also allowed for engaging visuals.
The video prototype produced a number of pieces of useful feedback on the concept: things I hadn't considered (e.g. how do you start/stop the game?) and extensions of existing ideas (e.g. power-ups).
Testing in class provided a number of willing participants, but I know this is not representative of testing prototypes with 'real' users. The testing protocol I laid out worked well, although I was much more informal with users than I have been in other usability testing situations, largely because they were classmates who shared the background to the concept (i.e. we all knew it would be a game mash-up or a wearable).
Effectiveness
The prototype was very effective and achieved the goals I set out: to assess whether the concept is entertaining and engaging, to assess whether the gameplay is clear and logical, and to facilitate feedback on the concept. All three of these goals were met.
Constraints
The video format meant that users could not interact with the concept, only take in information. As a result, I had to show how users would interact within the video prototype itself (i.e. showing feet stepping on the mat). Several pieces of feedback confirmed this was clear, so I think this constraint was overcome.
The testing session itself was an artificial environment, which meant that users came in already prepared with background knowledge (a negative) but were very willing to give feedback (a positive).
Implications
Changes to my concept
- need a start/stop/exit button (or method)
- walkthrough/game instructions might be helpful
- extra 'fun' could be added through power-ups (tuna) or traps (catnip)
Future prototypes
- Interactive Prototype 1 (IP1)
  - add a start/stop/exit button on screen
  - provide verbal instructions
- Further prototypes
  - start/stop/exit could be implemented by another method, e.g. jumping or a double tap on the mat (see the sketch after this list)
  - think about the best way to display instructions, maybe a walkthrough/video/intro
  - add in the extra 'fun' bits, e.g. power-ups and traps (maybe in IP3)
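To make the double-tap idea concrete, here is a minimal sketch in Python. It assumes the interactive prototype can feed mat step events into a handler; the DoubleTapToggle class, the on_step method and the 0.4-second window are all my assumptions for illustration, not decisions I have made about the concept.

```python
import time

# Two steps on the mat within DOUBLE_TAP_WINDOW seconds toggle the game
# between running and stopped; a single step is treated as normal gameplay.
DOUBLE_TAP_WINDOW = 0.4  # seconds; would need tuning with real users


class DoubleTapToggle:
    def __init__(self, window=DOUBLE_TAP_WINDOW):
        self.window = window
        self.last_tap = None
        self.running = False

    def on_step(self, timestamp=None):
        """Call whenever the mat registers a step; returns True if the
        game state toggled (start/stop) on this step."""
        now = timestamp if timestamp is not None else time.monotonic()
        toggled = False
        if self.last_tap is not None and (now - self.last_tap) <= self.window:
            self.running = not self.running
            toggled = True
            self.last_tap = None  # reset so a third step doesn't re-toggle
        else:
            self.last_tap = now
        return toggled


if __name__ == "__main__":
    # Simulated step events: two slow steps, then a quick double tap.
    toggle = DoubleTapToggle()
    print(toggle.on_step(timestamp=0.0))   # False - single step, nothing toggles
    print(toggle.on_step(timestamp=1.0))   # False - too slow to be a double tap
    print(toggle.on_step(timestamp=1.2))   # True  - double tap, game toggles
    print(toggle.running)                  # True
```

Whether a window this short feels natural when stepping on a physical mat is exactly the kind of thing I would want to test in a later prototype.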
Future testing sessions
- Run testing sessions in addition to the in-class session
- Be more formal in the in-class session (i.e. treat participants as real users who know nothing about the background) and print out the testing script