Saturday, 19 September 2015

Interactive Prototype I: Testing

Outcomes

Users were asked to use a modified keyboard to play a basic version of the game and provide feedback.  The coloured dots on the screen corresponded to coloured dots on a keyboard (arrow keys).
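
Since the keyboard mapping is the core of how the prototype is played, here is a minimal sketch of the mechanic in Python (purely illustrative, not the prototype's actual code): a key press is checked against the next dot in the path. The key-to-colour assignments and the sample path are assumptions.

    # Minimal sketch (not the prototype's code): arrow keys map to dot colours and
    # each press is checked against the next dot in the path.
    KEY_TO_COLOUR = {
        "up": "red",
        "down": "blue",
        "left": "green",
        "right": "purple",
    }

    def press(key, path, position):
        """Advance the cat one dot if the pressed key matches the next colour."""
        next_colour = path[position]
        if KEY_TO_COLOUR.get(key) == next_colour:
            return position + 1, True   # correct press: cat moves to the next dot
        return position, False          # wrong press: nothing happens (no feedback yet)

    if __name__ == "__main__":
        path = ["red", "blue", "blue", "green", "purple"]   # example level
        pos = 0
        for key in ["up", "down", "left", "down", "right", "left", "right"]:
            pos, ok = press(key, path, pos)
            print(key, "->", "correct" if ok else "wrong", "| position:", pos)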

Most users played the game multiple times as expected.  One user wanted to get the best time possible and tried to manipulate the keyboard in different ways to achieve this rather than interacting with the game as intended: they tried mashing the keys and pressing random keys.  Another user could not see the difference between the colours on screen and had to stop partway through their first go.

I measured user feedback through observation, in-person questions and an online questionnaire. The data from these measures can be found here.

The feedback I was after was about the movement of the different elements on screen (how the dots and the cat moved in relation to each other) and how the timer affected gameplay.

Reflections

The interactive format for the prototype was a good way to get more detailed feedback on how the game is played.  It allowed me to see users play the game and see what obstacles they came up against.  It became apparent quite quickly that the colour similarity was a problem; it even stopped one user from completing a level.  Other feedback I received about the movement confirmed that it wasn't clear what moved when (and what the next colour was), but didn't provide much of a solution to this.  I then discussed this problem further with tutors.

Testing with a user who wasn't from class was really useful; I found my discussion with them quite different as they were less limited by what they knew to be possible.  It was also more validating to get feedback from someone who knew less about the concept and its background to start with.  I would like to expand on this for the next test.  The testing protocol and questionnaire worked quite well, but it seems the questionnaire might be a bit too long.

Effectiveness

The prototype worked really well: I was able to answer my specific questions about movement and apply the feedback I received.  The main problem with the prototype was that the colours I had selected were too similar, which caused problems for some users.  One user was unable to complete the session at all as they could not tell the colours apart; others had some problems but could still complete the level.

Constraints

The constraints of the testing session meant that even though I knew early on that I had a problem with the colours, I was unable to change them for other users.  It would have been better if I could have changed the colours as soon as I knew they were a problem, so users could concentrate on other elements like movement.  Changing this on the fly would have taken time that I did not have in the session.

Implications

Changes to my concept

  • make colours more different from each other and add patterns for easier visual identification (and accessibility) 

Future prototypes

  • Interactive Prototype 2 (IP2)
    • make colours more different from each other and add patterns for easier visual identification (and accessibility)
    • move controls further away from each other
    • add audio feedback to wrong control pressed
    • change cat/dot layout to make it clearer which dot is next
    • fix bugs in start/stop/pause
  • Later prototypes
    • on-screen feedback for wrong control pressed
    • add movement to cat
    • change timer from count up to count down
    • make background change colour with timer (sunshine to night); a rough sketch of both timer changes follows this list
    • make start/stop easier (enter or other control)
    • add in the extra bits (power-ups, levels, etc) (maybe IP3)
    • add high scores/previous score
    • add exit button
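
To pin down what the count-down timer and the day-to-night background would mean, here is a rough Python sketch (illustrative only, not code from the prototype); the 30-second limit and the two RGB colours are placeholder values.

    # Rough sketch of the count-down timer and day-to-night background ideas,
    # not the prototype's actual code.
    DAY = (255, 220, 100)    # placeholder warm "sunshine" yellow
    NIGHT = (20, 20, 60)     # placeholder dark night blue

    def time_left(limit, elapsed):
        """Count down instead of up: remaining seconds, never below zero."""
        return max(0.0, limit - elapsed)

    def background(limit, elapsed):
        """Blend the background from day to night as the timer runs out."""
        t = min(1.0, elapsed / limit)   # 0.0 at the start, 1.0 at game over
        return tuple(round(d + (n - d) * t) for d, n in zip(DAY, NIGHT))

    if __name__ == "__main__":
        LIMIT = 30.0  # placeholder level length in seconds
        for elapsed in (0, 10, 20, 30):
            print(f"{time_left(LIMIT, elapsed):>4.0f}s left, background RGB {background(LIMIT, elapsed)}")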

Future testing sessions

  • Reduce length of questionnaire

Friday, 18 September 2015

Interactive Prototype I: Testing Data

Here is the raw data from the testing sessions run for Interactive Prototype I of Meow Meow Cat.  The data is in two parts: the observations recorded during the session and the questionnaire users filled out.

Testing session: 15 September (week 8 workshop A)

Users:
  • 6 total
  • 4 male, 2 female
  • 1 not from class

Observations

User #1
  • played three full games
  • played two other games with errors (game over message at wrong point, I don't think the game reset in-between)
  • wanted to beat his previous time when he played
  • times: 24, 15, 13 seconds
User #2
  • I reset game in between goes due to problems with previous session
  • played three games
  • found colours hard to see
  • second go, mashed controls with fist (i.e. pressing all colours at once) to get better time
  • third go randomly pressed keys to get better time
  • times: 24, 17, 6 seconds
User #3
  • played twice
  • had problems telling red and purple colours apart
  • times: 23, 19 seconds
User #4
  • played twice
  • asked "what feedback is there if you press the wrong colour?"
  • times: 24, 22 seconds
User #5
  • played once
  • was unable to tell the difference between colours (blue and purple), had to guess
  • suggested adding pattern to colours to help
  • time: 31 seconds
User #6
  • played four times
  • tried to beat score, stopped playing when couldn't beat score (i.e. when there was no more challenge)
  • commented that it was fun to play
  • times: 17, 13, 9, 9 seconds

Observation overview

  • Limitations of prototype included the modified keyboard; users had to re-learn the keyboard input
  • Survey took a long time, possibly too many questions
  • Game played 15 times
  • Playing as intended:
    • Lowest time: 9 seconds
    • Longest time: 24 seconds
  • Playing in other manner:
    • Lowest time: 6 seconds (by pressing random keys as quickly as possible)
    • Longest time: 31 seconds (user who had to guess colours because they were too similar to differentiate)
  • Average time:
    • 17.73 seconds (all goes)
    • 17.67 seconds (excluding the goes where keys were mashed, pressed randomly, or colours had to be guessed)
    • no real difference between the two averages (the calculation is reproduced in the sketch after this list)
  • Problems with colours being too similar, making them hard to tell apart
    • could change colours
    • could add pattern to colours
  • Start/Stop/Reset buttons a little buggy - they had to be pressed in the correct order to work
  • All users who played multiple games improved on their previous time
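
As a quick check of the average times listed above, the goes recorded in the observations reproduce both figures (small Python check below).  The three goes excluded from the second average are User #2's mashed and random goes (17 and 6 seconds) and User #5's guessing go (31 seconds).

    # Quick check of the averages quoted above, using the times in the observations.
    all_times = [24, 15, 13,      # User #1
                 24, 17, 6,       # User #2
                 23, 19,          # User #3
                 24, 22,          # User #4
                 31,              # User #5
                 17, 13, 9, 9]    # User #6

    not_as_intended = [17, 6, 31]     # mashed, random and guessing goes
    as_intended = list(all_times)
    for t in not_as_intended:
        as_intended.remove(t)

    print(f"All goes:           {sum(all_times) / len(all_times):.2f} s")        # 17.73
    print(f"Played as intended: {sum(as_intended) / len(as_intended):.2f} s")    # 17.67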

Questionnaire results

Total responses: 6
All questions except the last open 'Any comments?' question were compulsory.

Q1. Describe what you did in this session

  • I played the game three or four times.
  • Played the game until I could not determine the difference between two of the colours. I then had to stop as I could not play the game.
  • Used the colour keys to move the cat along the path of coloured dots
  • I pressed the colours that came up on the screen.
  • Make sure the colour of each direction key, and try to remember it. play game as fast as possible.
  • Pressed the coloured dots in the correct order to advance the cat and try to beat my record time.

Q2. How would you describe the instructions for this testing session?
Scale '1: Clear - it was easy to understand what to do' to '5: Confusing - I did not understand what to do'

5 users: 1/5 on scale
1 user: 2/5 on scale

Q3. Describe the movement that happened during the game. 

  • The cat stayed still and the colours moved.
  • when I press the right key, the dot will move below the cat
  • The cat moved from left to right across the game screen and the coloured dots disappeared
  • the dots moved towards the cat/the cat moved along the dots. I'm not sure. I pressed the colour that matched the next dot on the screen.
  • the cat moved forward (right) one dot at a time
  • The colors shifted left each time I pressed them.

Q4. Apart from the cat and the dots, did you notice any other elements on the screen? 
  • I noticed the time and the buttons at top, but during gameplay I didn't pay any attention to them - just concentrated at looking at the dot keys and the dots on screen
  • not many, just the those functional buttons
  • not after I started playing.
  • There was a yellow background. There was a start, stop, reset button at the top. There was a timer up the top left of the screen.
  • The time on the top of the screen.
  • The Timer in the top left hand side of the game. The Buttons in the top right were also visible
Q5. If you noticed the timer, did it affect how you played the game?
3 options

3 users: Yes, it affected how I played
1 user: No, it did not affect how I played
2 users: No, I did not notice the timer

Q6. Describe how the timer affected how you played.  
  • I didn't really look at the timer during the game - only after to see if I beat my previous game's time. perhaps there could be audio reminder every 5 or ten second intervals? Might be worth trying, hopefully it wouldn't be too annoying, but will add some element of pressure!
  • I tried to beat my previous times.
  • I didn't pay attention to it until the end of the game.
  • I want to finish the game as soon as possible
  • I did not notice it. So it did not affect the way I played
  • Added stress, but also made me want to continue playing to see how fast I could complete the course
Q7. Which elements on the screen moved? 
  • the dots are moving
  • The dots moved and the timer counted.
  • the dots
  • The colours moved.
  • The colors.
  • Only the dots. It seems like the cat is moving but i am pretty sure it remains stationary
Q8. Any other comments?
  • It was sometimes a little confusing which colour needed to be pressed as. If that cat was currently on green, I wanted to press green but I actually had to press blue which was the next colour in the sequence. I was able to learn to press the correct button after a few plays but it was not immediately obvious.
  • add a key press check function, maybe, because user can press all direction, and the game will be finished very quick
  • For some reason, I kept wanting to press the color under the cat rather than the color to the right of that color. I think it would be easier to press the color under that cat because it's not between to other colors, and it's more noticeable because it's under the cat. I think if I played this game more, I would more easily press the colors instinctively rather than having to look down to see where colors were. I think this will be easier in your next prototype because you won't have the correlated instinct of up/down/left/right keys. Overall, I really liked this game! Good work!
  • The first comment is about colour. I could not tell the difference between two of the colours and as such I could not finished the game. Change the colours or add a pattern to make it easier to determine which is which. Secondly I would add a space between the colour I am on and the next colour. I was confused at times. If the colour I was on was a different size or shape I would automatically assume it was the colour I was on and look to the next number.
  • It would be good if there was a sound when you hit the wrong colour, I found it hard to know if I had pressed the wrong button, or just not pressed anything.

Questionnaire overview

  • Users understood that they were moving the cat along the coloured path even though it was the path that was actually moving
  • Users found my instructions for the testing easy to understand (Q2) but I should have reversed the scale (so 5/5 was 'completely understand' not 1/5)
  • When asked what they did, users understood the gameplay of the cat moving along the coloured path 
  • When asked to describe the movement, some (3-4/6) thought the cat moved (what I intended to be perceived), some (2-3/6) thought the dots moved (what actually happened) but when asked what element/s moved, all users reported the dots moved.
    • the confusion about what moved might be reduced if the cat had some movement, even if it didn't physically move across the screen, e.g. 'jumped' up and down on the dots
    • users knew that the only 'moving' element was the dots but about half perceived the cat moving (what I wanted to happen).
  • Half the users (3/6) noticed the timer and used it to compete against themselves even though they were not told to, and they didn't know their times were being recorded.  One other user noticed the time but reported it did not affect how they played.
  • Players liked the pressure of the timer to compete against, but didn't notice it during gameplay, only at the end of the level.
  • Give users some feedback when they press the wrong button
    • maybe sound or text/image on screen
  • Physically distance the inputs for the next prototype to make it impossible (or at least much harder) to randomly press inputs or press more than one at a time
  • Colours (accessibility) need to be fixed
  • The game started with the cat off the dots: when you pressed the first colour the dot moved underneath the cat, then you pressed the next colour, and so on.  Once you had pressed a couple of colours it was unclear/easy to get mixed up about which colour to press next - was it the colour under the cat or the colour next to (right of) the cat?
    • could change the size of dots once they are pressed (i.e. the colours under the cat and to the left of it)
    • could add a marker (like a caret) underneath the dot you are to press next (mocked up in the sketch after this list)
    • could animate the cat so it looks more like it is jumping to the next dot
    • add more of a space between the dots I have pressed and the dots I haven't pressed yet
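
To make the caret option above more concrete, here is a small text mock-up in Python (illustrative only, not how the game draws anything); the example path and the three-character spacing are assumptions.

    # Text mock-up of the "marker under the next dot" option: pressed dots sit to
    # the left, a caret points at the dot to press next.
    def render(path, position):
        """Return a two-line text mock-up of the dot path with a caret marker."""
        dots = "  ".join(c[0].upper() for c in path)   # first letter of each colour
        caret = " " * (position * 3) + "^"             # each dot occupies 3 characters
        return dots + "\n" + caret

    if __name__ == "__main__":
        path = ["red", "blue", "green", "purple", "blue"]   # example level
        print(render(path, 2))   # the cat has pressed two dots; green is next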

Tuesday, 15 September 2015

Week 8: Makey Makey I

We started using Makey Makeys this week in class.  We did a few activities to simulate different games and their controls, to get us thinking about the different ways we can use them and what problems we might come across.  Our tutor mentioned at the start that one of the biggest things to remember was to ground it, and I still forgot half a dozen times during the workshop.  My IP1 will plug straight into a Makey Makey because of how it was set up (using the keyboard arrows).  My challenge will be to make it as physical as possible with a floor-mat set-up, and hopefully to add more functionality to the prototype and fix some of the bugs from IP1.

One of the most interesting things we did was play multi-person Pacman where 4 people were each responsible for a direction.  We had to communicate a lot to make this work but it made it a lot more fun than regular Pacman.  Here are a few of the different set-ups I tried today.

Set-up for space invaders.  Alfoil bracelet grounds it.

Set-up for piano, using the letter controls on the back of the board (W, A, S, D, F, G).

Playing the online piano.

Set-up to play my game, success!

Week 8: Physical interactions

This week in class we looked at three existing digital experiences (email, Twitter and Super Mario Brothers) and had to come up with 5 new ways of physically interacting with them.



Email

Core interactions

  • new message
  • write message
  • send message
  • view inbox
  • move or delete message

Ideas

  1. Sandbox
    movements in sand associated with specific actions
  2. Physical drawers
    drawers on a desk, each associated with an action; interactive elements (blocks) do something when you transfer them between drawers
  3. Coloured balls
    similar to drawers but colour indicates action; when viewing an email, the colour you choose decides action (red for delete)
  4. Physical box with coloured lights
    box is scrollable and has digital display
  5. Physical letter
    uses the tactile sensation of writing a letter that then scans image-to-text and shreds original (like a fax/shredder)


Twitter

Core interactions

  • scroll newsfeed
  • compose tweet
  • send tweet
  • quote or retweet
  • follow someone

Ideas

  1. Floor mat
    uses a floor mat that you would stand on/tap to compose tweet; layout would be like old mobile phone keypad
  2. Rolodex
    spin it to scroll through the newsfeed; can scroll forward and backward; touch sensors on the 'cards' would perform simple actions like retweet
  3. Deck of cards
    to promote slow, deliberate reading; hold card up to sensor to load next tweet in feed
  4. Chips/box
  5. Morse code
    morse code tapper, switches for actions



Super Mario Brothers

Core interactions

  • move forward and back
  • jump

Ideas

  • Bike
    pedal forward and back; get off saddle to jump
  • Semaphore flags
    different combinations for different moves
  • Box of water
    sensors on box, splash to indicate action
  • Foot pads
    two foot sensors (left and right) to simulate movement; handrail touch to move backwards; jump to jump
  • Row boat
    left row for back, right row for forward, row both to jump

Summary

This was a really interesting exercise; I was definitely running out of idea steam by the time I got to five.  I also felt like I was repeating elements of my ideas, and it was hard to think outside the box.  Super Mario was much easier as the interactions were simpler; email and Twitter had so many behaviours that I found my ideas centred on one or two interactions rather than the full interaction of these applications.

Sunday, 6 September 2015

Week 7: Reviewing video prototype testing

Question types and traps

Q1. Do you understand the concept and gameplay?
Scale '1: No, did not understand at all' to '5: Yes, understood the whole concept'
  • Type: Quantitative
  • Traps: This was a compound question.  Understanding the concept and understanding how the game works are two different things.  I should have separated this question.

Q2. Are there any parts you did not understand?
  • Type: Qualitative
  • Traps: Not specifically, but it builds on the previous question, which should have been separated.

Q3. Did the video format help with your understanding of the concept?
Scale '1: No, it didn't help' to '5: Yes, it did help'
  • Type: Quantitative
  • Traps: This was a bit leading.  Maybe "How did you find the video format of the prototype?"

Q4. What would you improve about the game?
  • Type: Qualitative
  • Traps: No

Q5. Do you agree with this statement?  The concept is engaging.
Scale '1: Strongly disagree' to '5: Strongly agree'
  • Type: Quantitative
  • Traps: Could be leading.  Might have been better as a semantic differential, e.g. "Rate the concept (engaging to boring)" or similar.

Q6. Was the audio of a quality that you could understand what was being said?
  • Type: Qualitative
  • Traps: Probably a bit wordy; "could you hear what was said" would have been simpler.

Q7. Were my instructions clear?
  • Type: Qualitative
  • Traps: Leading question; could again be a semantic differential, which would also have made it quantitative, e.g. "Rate the instructions (clear to confusing)".

Q8. Any other comments?
  • Type: Qualitative
  • Traps: No

Friday, 4 September 2015

Week 6: Class Responsibility Collaboration (CRC)

In this week's lecture we looked at Class Responsibility Collaboration (CRC) cards.  The first step was to identify the objects.  For my game, they are: stage, player (cat), path of dots, sun, start/stop, timer.


I then created a card for each of these and looked at the attributes (variables) and abilities (functions). 




This process raised an interesting question for my game: what moves?  I had to quickly sketch it up to understand the relationship between the moving objects.  Using this, I was able to refine my goal for the next prototype (IP1): how do the movements work, and how do they work together?

So far, I have (a rough sketch of these rules in code follows this list):
  • the player moves along the path (right) of coloured dots through keyboard inputs 
  • the path is longer than the screen so the path has to move as well
  • the path moves to the left as the player moves (i.e. it does not move independently: if the player doesn't move, the path doesn't move)
  • sun starts near the middle and moves to the end of the path
  • how fast the sun moves depends on the level
  • the sun's movement is a fixed speed and time
  • if the sun reaches the end before the player reaches the sun = game over
  • if the sun goes off screen but has not reached end yet, timer informs player of time left
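
To check that these rules hang together, here is a rough simulation sketch in Python (illustrative only, not the prototype's code).  The path length, the sun's starting position and its speed are made-up numbers, and the sketch ticks once per key press even though in the concept the sun moves with time.

    # Rough simulation of the movement rules listed above, not the prototype's code.
    # Distances are in "dots"; all numbers are placeholders.
    def play(path_length, sun_start, sun_speed, presses):
        """Outcome: the player wins by reaching the sun, the game is over if the
        sun reaches the end of the path first."""
        player = 0            # the cat's position along the path
        sun = sun_start       # the sun starts partway along the path
        for correct in presses:
            if correct:
                player += 1   # the cat moves right one dot (drawn as the path scrolling left)
            sun = min(path_length, sun + sun_speed)   # level-dependent speed
            if player >= sun:
                return "win: reached the sun for a nap"
            if sun >= path_length:
                return "game over: the sun reached the end of the path"
        return "still playing"

    if __name__ == "__main__":
        # 20 correct presses; the sun starts at dot 10 of 20 and moves 0.5 dots per tick
        print(play(path_length=20, sun_start=10, sun_speed=0.5, presses=[True] * 20))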



Writing out all the movements, it feels a little complicated.  It might also be confusing to have the sun go off the screen.

Current:
Goal is to reach sun (location) for nap before sun reaches end of path (second location)

Suggested change:
Goal is to reach end of path (location) before sun (day) becomes night (timer)

I am going to ruminate a bit more on this change and discuss it at our next workshop for more feedback.


Restaurant Dining Experience

What is our restaurant? A generic chain restaurant with table service

P.O.V. of waiter

Existing experience

  • welcome
  • take to table
  • take order
  • deliver order
  • clean table
  • take payment

External factors

  • customers
  • cooking staff
  • other waiters
  • restaurant business
  • management

Internal factors

  • tired/sick
  • emotional state
  • workload
  • pay
  • work satisfaction

What could be enhanced/augmented/supported with technology?

  • ideas (reduce role of waiter)
    • ipad ordering - reduce number of tasks
    • strip lighting to seat
    • deliver order via device like sushi train or buzzer to collect food or robotic cart
    • app payment
    • self cleaning tables
  • electronic ordering to reduce errors and tie order with table

How would introducing technology change experience?

  • less for staff to remember
  • fewer order errors
  • wireless means the order could go straight to the kitchen without the waiter physically going there, and orders are already 'rung up' when the customer goes to pay (a minimal sketch of this idea follows this list)
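
A minimal sketch of what tying an order to a table could look like as a single shared record (Python, illustrative only, not something we designed in the workshop); the field names, statuses and prices are all placeholders.

    # One order record tied to a table, shared by the waiter (entry), kitchen
    # (status) and till (payment), so nothing has to be re-keyed.
    from dataclasses import dataclass, field

    @dataclass
    class Order:
        table: int
        items: list = field(default_factory=list)   # (dish, price) pairs
        status: str = "entered"                     # entered -> cooking -> served -> paid

        def add(self, dish, price):
            self.items.append((dish, price))

        def total(self):
            return sum(price for _, price in self.items)

    if __name__ == "__main__":
        order = Order(table=12)
        order.add("pad thai", 18.50)       # waiter enters the order at the table
        order.add("green curry", 21.00)
        order.status = "cooking"           # kitchen updates the same record
        print(f"Table {order.table}: ${order.total():.2f} ({order.status})")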

What experience scenarios might you test?

  • a prototype that lets wait staff record orders on an iPad but doesn't have other functionality included, like sending orders to the kitchen
  • waiters could interact with the interface to record orders, and different interfaces could be tested to find out which elements work/don't work

P.O.V. of customer

Existing experience

  • arrive and are seated
  • browse menu
  • order
  • wait for food/drinks
  • eat
  • maybe order dessert
  • pay
  • leave

External factors

  • weather
  • traffic
  • parking
  • time of day
  • people you are with
  • ability to make a booking or just turn up
  • time to get seated
  • busyness of the restaurant
  • time to order
  • time for food to arrive
  • experience of noise (children, loud music)

Internal factors

  • how hungry you are
  • how much time you have
  • how much money you have/want to spend
  • dietary needs

What could be enhanced/augmented/supported with technology?

  • mobile app to increase efficiency to place an order
  • show image or dish and ingredients
  • incorporate payment
  • vote on dishes
  • give quick review
  • app: allow the restaurant to input all meals with ingredients, so users could filter by what they don't like or what they are allergic to

How would introducing technology change experience?

  • more accuracy on ingredients (e.g. a vegetarian dish that actually uses fish sauce)
  • allows users to view and choose at leisure rather than asking the waiter lots of questions
  • ease of choosing a restaurant you know will have something you can eat

What experience scenarios might you test?

  • two options: a paper menu with symbols (vege, gluten free) and an app that allows users to filter
  • test user opinions on ease of ordering and whether it improves their experience

P.O.V. of chef

Existing experience

  • get to work
  • prep
  • cook
  • some clean
  • some ordering of ingredients
  • manage multiple meals at once

External factors

  • arriving at work on time
  • number of customers
  • number of waitstaff
  • changes to menu
  • special orders
  • availability of ingredients

Internal factors

  • tired/sick
  • emotional state
  • workload
  • pay
  • work satisfaction
  • experience

What could be enhanced/augmented/supported with technology?

  • electronic ordering
  • streamline cooking workflow
  • table tell if diners are finished one meal and ready for next
  • screen display with orders, timing and order status (e.g. prep, pre-cooking, cooking, plating)

How would introducing technology change experience?

  • streamlined workflow
  • quicker turn around on orders
  • greater transparency of time needed and order status
  • easier to serve a whole table at once
  • reporting 
  • can track meal inventory (and when they will run out)

What experience scenarios might you test?

  • small-setting test with a reduced number of customers and a reduced menu
  • increase the load to see if there is a point of busyness where it is no longer helpful