I have a Google Docs questionnaire that asks fairly broad things like "How long did the game take?", "Who won?", "Were any of the rules too obscure or confusing?", etc., but I was wondering what kinds of questions you all ask of your playtesters?
The only question I tend to ask that isn't "standard" is...
"If you would not purchase this game for yourself, would you consider purchasing it as a gift for someone else?"
I have a few playtesters that pretty much stick to card games played with a normal poker deck; as such, no matter how much they like the game they always say "I would never buy this game" :)
I also push for detailed feedback; yes/no answers may help a little, but knowing why they responded as they did is invaluable.
Some questions I ask:
How long did it take to play?
What rules did you have to look up or not understand?
Please clearly write your name as you would like to see it in the rules.
Was there anything you felt was over/under powered?
Other suggestions for improvement...
Did you have fun?
What would make the game more fun?
Apologies for the wall of text. I tend to over-answer and over-explain myself (though with the best of intentions). I don't have a lot of experience getting feedback from play testers, but I have had several classes that have dealt with surveys and what makes them effective, as well as personal experience designing surveys and interpreting results, so here are my thoughts. Hopefully they're of some use - take them for what they're worth:
I think the key to a good survey is focus. More than anything, I'd suggest making a prioritized list of the information you want to find out from your playtesters. I'm sure you've got a catch-all field for additional comments at the end, so use that to your advantage. Don't be anxious about having overlooked something - responders will use that catch-all field to let you know if you have. In fact, I find I'm actually more likely to write something in the catch-all field at the end if the survey is made up of specific, detailed questions than if it's full of broad or open-ended ones.
Getting a good indication of people's true gut reactions is crucial. I'd ask testers something like, "On a scale of 1 to 5, 5 being best, how would you rate the quality of your experience with this game overall?" before I asked them anything else, because you want that initial gut reaction. If you ask people to analyse/justify their initial reaction, they often over-analyse things and actually give less accurate responses. If you're doing an electronic survey, I'd consider going as far as asking this on a separate page so that people can't second-guess themselves and go back to change it. (You can find an interesting study on this phenomenon here, if you're interested - if you can't access the full write-up, the abstract has all you really need: http://psycnet.apa.org/index.cfm?fa=search.displayRecord&uid=1991-17498-001 )
How easy a survey is to respond to will dictate, more than anything else, how many responses you get and how complete they are. Targeted, specific questions are actually easier for people to answer than broad, general ones. Having people rate things on a scale from 1 to 5 (with 5 being best) is a great way to do it, because people are more likely to give you an accurate reflection of their first impression. It's much easier to rate the speed of setup, how smoothly the turns progressed, etc., than it is to answer a question like "Can you think of anything that slowed down the game or phases of the game that could be more streamlined?" For questions that don't lend themselves to a rating scale, people should ideally be able to answer with one or two words.
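One nice side effect of rating-scale questions is that the results are trivial to aggregate. As a minimal sketch (the question names and scores below are hypothetical, not from any actual survey), you could tally each question's average like this:

```python
# Hypothetical 1-to-5 survey responses, one dict per playtester.
# Question names and scores are invented for illustration only.
responses = [
    {"setup_speed": 4, "turn_flow": 3, "overall": 5},
    {"setup_speed": 5, "turn_flow": 2, "overall": 4},
    {"setup_speed": 3, "turn_flow": 3, "overall": 4},
]

def average_ratings(responses):
    """Return the mean rating for each question across all responses."""
    scores_by_question = {}
    for response in responses:
        for question, score in response.items():
            scores_by_question.setdefault(question, []).append(score)
    return {q: sum(s) / len(s) for q, s in scores_by_question.items()}

print(average_ratings(responses))
```

A low average on one question (here, "turn_flow") points you straight at the part of the game worth probing in follow-up conversation, which open-ended answers rarely do so cleanly.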
Also, don't fall into the trap of trying to make the responders do your job for you. While you want to ask for suggestions (in the catch-all and perhaps about specific elements), trying to identify potential problems should take priority. Questions like "How could this game be made better?" are hard to answer, so most people will leave them blank. And while it's worth asking just in case someone does come up with a great suggestion, most of the ones you do get aren't that useful. You're the expert on game design (or at the very least, the expert on your game), and your playtesters usually aren't. Playtesters are experts in an area you aren't: their experiences playing your game. If you construct a survey that centers on your playtesters' expertise, you'll get a more accurate picture of their experiences and you'll be better able to apply your own expertise (tweaking your game so that other people have a better experience playing it). Like I said, there's no harm in asking for suggestions - you'll just get better results by recognizing your (and your playtesters') areas of expertise and playing to them as much as possible.
Here are some questions I thought of that might yield useful responses (some are open ended, some lend themselves to a rating scale, some could go both ways, depending on how they're phrased). They just came off the top of my head, so this list is by no means definitive (or even balanced), but hopefully there's something in here that will be of some help:
Hey, no problem on the wall of text. A lot of it was pretty helpful. Some of the things you've suggested (asking how long it took to play or how many times they played) I have included in my own questionnaire, but I think the advice on asking more specific questions and letting the catch-all at the end do more work was really helpful. I'll definitely be revising my survey before the next time I have people play.
I like to bluntly ask 'what didn't you like about this game', or 'what didn't feel right'.
Go into the conversation with the attitude that something must be wrong, and you want them to tell you what it was - that way they're more likely to think of something than they would be if you ask whether anything was wrong.
Fantastic post man. I will DEFINITELY be using this when AtomPunk is rolled out to playtesters. Thank you so much, really appreciate the advice :)
I feel like such a dork!
I've just been giving my testers a piece of paper labeled "Notes and Observations" and letting them put down whatever they feel. This has worked well for finding the hitches in rules and such, but after seeing this I'm so excited to give a much more directed feedback form.
Excuse me while I go make my form...
So, after going over everybody's suggestions, I threw together a Google Docs survey that you are all more than welcome to use as-is or as a starting point for your own surveys.
If you think some things could be rearranged or modified, or if you have any questions about why I did it the way I did, please voice those thoughts!
Everybody is more than welcome to use this for their own projects: https://docs.google.com/document/d/1RcnsV6TEKSUMejDfP0G59LT0AmEy7K6nzXwU...
Ask is the key word here; questionnaires can only ask about things you already know are a problem or might be a problem, and problems tend to crop up in unseen ways. I prefer to interview the players so that my questioning can be guided by the answers I get.
I start with something like, "What didn't you like about X?"
Are we talking about blind testers, or about in-person testers? I never give questionnaires to in-person playtesters. What's important to me is watching what they actually do during the game. In that respect I follow Jakob Nielsen, who says that what people say they do or would do isn't what they actually do. If you want to ask questions in person, try some variation of Edward de Bono's Six Hats method. It's simple and fairly specific, but gets at several different angles.
Just a note to say that I cross-posted some of your excellent responses to this question in the dev forums on BGG. I figured that others could benefit from the super great words :)
There's a whole bunch of questions that you can ask, and they will pertain to the kind of information that you're looking for, but I've found that a good jumping off point is:
Did you have fun?
Did you feel engaged the whole time?
Did you feel like you had a chance to win the game for the duration?
Would you play it again/would you recommend others to play it?
Did you feel like your choices mattered and affected your play?
I recently needed to put together a survey for blind playtests of my new game. Of course, I couldn't put everything on the survey that I wanted to ask, but I followed a lot of the ideas in this thread.
One more comment on the price question... I actually find that this is a good way to get more game feedback than you might initially think. If playtesters respond with a lower price point than you were going for, push them on why. I've had folks tell me that the game was too simple and/or not engaging enough, and thus they put a lower price tag on it.
In one way it makes no logical sense... you could make a tic-tac-toe set with diamond 'X's and pure-gold 'O's. It wouldn't be a cheap game, but it is simple. But you've got to look past that and engage with their actual reasoning behind the price point.