Playtesting is important, but it’s often hard to find the time. It takes real effort to recruit playtesters, prepare a build, run the sessions and analyse the feedback. That can be difficult to prioritise alongside everything else busy game developers have to do – marketing, social media, and… actually making the game.
What if playtests could be quick, easy, and cheap? What if you could get an immediate steer on whether your game is on the right track, and what the biggest problems are?
Pop-up playtests offer a quick method for getting low-cost information about the player experience. They are most effective at highlighting the biggest issues with your game.
Spotting and fixing these obvious issues early gives you more opportunity to find more detailed and nuanced results with more formal playtests.
However, they also throw out a lot of the protections that more formal playtests have built in, and increase the risk of drawing misleading or unhelpful conclusions from your playtest.
In this article you’ll learn how to make the most of pop-up playtests, how to avoid the traps that mislead game developers, and how to get the best possible data from your study.
Why do we need to run pop-up playtests?
Playtesting is hard to organise, and so it doesn’t happen enough. This means our games suffer – less playtesting means less iteration, discovering problems too late, and underwhelming releases.
I’ve spent over 25,000 hours with players during playtests, and I’ve seen that a pop-up playtest is better than no playtest at all. Today, we’ll learn how to make the most of them.
What is a pop-up playtest?
A typical formal playtest involves finding the right players, designing tasks for them to do, gathering data and analysing it to inform game design decisions. Games user researchers (that’s me!) do this as a full-time job, and most game developers understand the value of running these formal playtests throughout the development of their game.
Formal playtests take time to prepare and have a process behind them to ensure the results are reliable. This is what game user researchers do to a professional standard every day.
A pop-up playtest throws a lot of the process out – finding some people, putting them in front of your game, and seeing what happens. Minimal prep, minimal time, just instant feedback. And sometimes that’s all we have time for.
Just you, watching someone play your game.
And maybe asking them a question or two at the end.
Other names for this process are ‘guerrilla’ research, or quick-and-dirty playtests.
What tech do I need?
The tech for pop-up research can be very low-key – you just want to record what happens in the game, and what people are saying as they play.
For a mobile game, you can use built-in screen recording, such as the screen recorder in iOS. Hit the button to start it recording, and then hand the phone to the player – easy.
If you’re running playtests remotely over video conferencing software, Zoom (and others) have meeting recording built in – so you just need to get the player to share their screen, and start the meeting recording.
If you’re testing on a PC or laptop, OBS (Open Broadcaster Software) is a free tool that lets you record your screen and voice – the Windows Game Bar does it too. Start recording, hand your laptop to the player, and you’re away!
🚀 Do this now: Decide whether you’re running these playtests in person or remotely, and get your recording tech set up.
What are the risks with pop-up playtests?
There’s a reason why playtesting (and “games user research”) is a discipline – it’s easy to get accidentally misled by feedback from players and draw wrong conclusions about your game.
Some traps include:
- Recruiting unrepresentative players, and balancing the game too easy or too hard for your real players
- Focusing too much on what players say about the game, and not enough on their behaviour
- Helping players too much, and biasing their experience artificially
- Believing players who tell you “I’d buy this game”
Here are four principles that are easy to incorporate and will improve the quality of your pop-up playtests. They will help you avoid those common traps!
Four ways to make your pop-up playtests more effective
Decide your playtest objectives
You might learn something if you just put your game in front of players and ask “what do you think?”. But you’ll get much more useful results if you first think about what you want to learn from the playtest.
Consider the most important things you need to learn to make your game a success, with prompt questions like “What parts of the game am I most uncertain about?” or “What do I absolutely need players to understand about this game?”.
From that you can write a short list of objectives – things you want to learn from your playtest.
Some examples of objectives:
- “Do players understand their goal from this level?”
- “Has the tutorial taught players how to use their special move?”
- “Will players recognise the correct strategy for this puzzle?”
Use this list of objectives to decide which tasks to set players, which questions to ask them, and what to watch out for during the playtest.
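If it helps to make that mapping explicit, you can jot the plan down as structured data before the session. Here’s a minimal sketch in Python – the objectives, tasks, questions and behaviours are hypothetical examples, not a prescribed format:

```python
# Hypothetical playtest plan: each objective is paired with the task
# players will be set, the question to ask afterwards, and the
# behaviours to watch for during the session.
playtest_plan = [
    {
        "objective": "Do players understand their goal from this level?",
        "task": "Play level 1 from a fresh save, with no hints from us",
        "ask_after": "In your own words, what were you trying to do?",
        "watch_for": ["wandering without progress", "re-reading the objective text"],
    },
    {
        "objective": "Has the tutorial taught players how to use their special move?",
        "task": "Complete the tutorial, then fight the first miniboss",
        "ask_after": "How do you trigger your special move?",
        "watch_for": ["special move never used", "button mashing"],
    },
]

# Quick pre-session check: every objective should have all three parts filled in.
for item in playtest_plan:
    assert item["task"] and item["ask_after"] and item["watch_for"]
    print(item["objective"])
```

Even if the plan never leaves a notebook, writing it in this shape forces each objective to come with a concrete task and something observable to watch for.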
Some objectives are almost impossible to answer reliably with pop-up research. Complex questions, such as balancing the game or measuring opinions, require reliable quantitative methods like surveys – and so are far less suitable than “do players get stuck in this level?”.
🚀 Do this now: Think about what are the riskiest or most important parts of your game. Write them out in a list, ready to inspire your playtest objectives.
Find the right players
It matters who takes part in your playtests. If they are game developers, their feedback is going to be very solution focused. If they are your friends they are going to be too kind. And if they don’t normally play this type of game their experience isn’t going to be typical of your real players.
Finding the right players takes time (here’s a guide on how to get started with finding playtesters).
For your pop-up playtest, make sure you’re consciously asking “how can I find people who are similar to my real players?”.
Use that to decide where to go and look for players – perhaps existing gaming communities or a local gaming cafe are a good start. Many communities of game players can be found on Reddit (but be careful of game development communities – game developers are not your audience!)
Then consider “how will the type of people who playtest for me influence their feedback?”.
Use this to filter the feedback you hear: weigh each comment against the background and player type of the person it came from, and use that to decide how much to prioritise it. This can often highlight interesting threads worth exploring with more rigour in a more formal playtest.
🚀 Do this now: Make a shortlist of places you could look for players, and for each consider “what influence will players from here have on their feedback?”. Then pick the two or three most appropriate places to go and run your playtest.
Watch people play
It’s tempting to rely on surveys or feedback on a Discord channel to get data from playtests, because it feels convenient. But surveys rely on players self-reporting what happened to them, which misses a huge amount of valuable information.
Players can’t tell you about features they didn’t discover or didn’t understand. They can’t tell you what they missed. And they can’t give you enough detail about why a problem happened for you to fix it.
To get the right level of detail, you need to watch people play.
Watching just two or three people play your game is often much more useful than a survey of hundreds of players. It gives far richer information and puts you in a much better position to make informed changes to your game, rather than guessing what players meant on a survey.
🚀 Do this now: Go out into the places you shortlisted, and watch three people play your game!
Analyse what you see and hear
After you watch people play your game, and speak to them about it, you’ll have a bunch of raw data to draw upon. It’s crucial to actually analyse this feedback, rather than act on their suggestions immediately.
A good rule of thumb is to trust what you observed. If you saw a player do something, or saw that a player didn’t understand something, you know that is an objective fact, and it’s safe to take action to fix it.
You have to be more careful with ‘feedback’ that players report themselves. Opinions that the game is too hard, or that they didn’t enjoy certain bits, are a clue that you should investigate that area further – but don’t necessarily trust their interpretation of what should change.
For example, “this game is hard” might be fine if you want the game to be difficult. And “you should add a new weapon” is a clue that something is wrong – but the right fix might be completely different from adding a new weapon, and needs exploring.
Trust the player behaviour you see yourself, and use what players say as a clue to investigate further – but don’t act immediately on their feedback.
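If you capture notes in a simple structured form during the session, this split is easy to keep honest. A minimal sketch in Python – the note format and example notes are hypothetical, not from any particular tool:

```python
# Split tagged playtest notes so observed behaviour is prioritised over
# self-reported feedback. The note format and examples are hypothetical.

def split_notes(notes):
    """Return (observations, feedback): act on the first, investigate the second."""
    observations = [n["text"] for n in notes if n["tag"] == "observation"]
    feedback = [n["text"] for n in notes if n["tag"] == "feedback"]
    return observations, feedback

notes = [
    {"tag": "observation", "text": "Player never found the dash button"},
    {"tag": "feedback", "text": "You should add a new weapon"},
    {"tag": "observation", "text": "Player quit level 3 after three failed attempts"},
]

observations, feedback = split_notes(notes)
print("Fix first:", observations)
print("Investigate:", feedback)
```

Work through the observations first; keep the feedback list as prompts for areas to probe in a later, more formal playtest.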
🚀 Do this now: Separate your playtest data into ‘observations’ and ‘feedback’. Prioritise working on issues flagged by the ‘observation’ data.
Get serious about playtesting
Pop-up playtesting is a good first toe in the water, and will start to show value immediately. But pretty soon you’ll want to take playtesting more seriously: being ready to run in-person or remote playtests at the drop of a hat, write reliable surveys, and run high-quality playtests throughout development.
I’ve made the playtest kit for you.
It is a repeatable playtest process, making it simple to run professional quality playtests, iterate your game, and make a game that players will love.
Learn more about the playtest kit at playtestkit.com, and sign up for the free email course at the bottom of the page to get started gathering playtesters today.
Ready to run better playtests?
Every month, get the latest articles on how to run efficient, high-quality playtests that de-risk game development.
Plus get my free course on how to get your first 100 playtesters for teams without much budget or time.