How to Recruit Participants for a Card Sort Study
Getting responses is the hardest part of card sorting. Here are the most effective ways to find participants — from free methods to paid panels.
You've set up your card sort study. The cards are ready. Now you need people to sort them — and that's usually the hardest part.
Here's what actually works for getting participants, from free approaches to paid panels, and how many you really need.
Method 1: Your Existing Users (Free, Highest Quality)
Start here. Your actual users already think about your product the way your audience does. They bring real mental models, not guesses. That makes their sorting data far more useful than what you'd get from strangers.
To reach them, send a short email with a clear ask: "Help us improve navigation — takes about 5 minutes." You can also drop an in-app banner linking straight to the study, or post in any community spaces where your users hang out (forums, Slack groups, Discord).
Small incentives go a long way. Feature previews, account credits, even a simple discount code — anything that feels relevant to the product. From a list of 500-1,000 users, you can reasonably expect 20-50 responses, and that number climbs noticeably when you include an incentive.
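Those ballpark numbers imply a response rate of roughly 4-5%, higher with an incentive. If it helps to sanity-check your own list size, here's a minimal sketch; the base rate and incentive multiplier are illustrative assumptions, not guarantees:

```python
# Rough response projection for an email recruit. The 4% base rate and
# 1.5x incentive boost are illustrative assumptions based on the
# ballpark figures above, not measured benchmarks.
def expected_responses(list_size, base_rate=0.04, incentive_boost=1.5):
    """Estimate responses with and without a small incentive."""
    without = round(list_size * base_rate)
    with_incentive = round(list_size * base_rate * incentive_boost)
    return without, with_incentive

print(expected_responses(500))   # (20, 30)
print(expected_responses(1000))  # (40, 60)
```

Treat the output as a planning number, not a forecast: actual rates depend heavily on how engaged your list is.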
Method 2: Colleagues and Internal Teams (Free, Context-Dependent)
This works well for B2B products, internal tools, and intranet redesigns — situations where your coworkers actually are the target users. They already know the terminology and the business context, which is exactly what you need.
Share the study link in a relevant Slack channel with a quick explanation of what you're doing and why. DM 10-15 people directly and ask for five minutes. You can even run a live sorting session during a team meeting if that's easier.
One important caveat: don't recruit colleagues for consumer products. Your team's mental model of your product is almost certainly different from your customers'. You'll end up with data that reflects how your company thinks, not how your users think.
Method 3: Social Networks (Free, Variable Results)
Social media can get you 10-50 responses within a day or two, but quality depends heavily on whether your followers actually match your target audience.
LinkedIn works best when you frame it as a professional research request — be specific about the study and the time commitment. Twitter/X can work if you include context and use relevant hashtags (#UXResearch, #ProductDesign, or whatever fits your domain). Reddit is surprisingly effective if you find the right subreddit — post in r/personalfinance for a fintech app, r/healthcare for medical tools, and so on. Facebook Groups centered around professional or interest-based communities can also pull decent numbers.
LinkedIn tends to get you 10-30 responses per 500 connections. A well-placed Reddit post in an active community can pull 30-50.
Method 4: Prolific Research Panel (Paid, Professionally Reliable)
If you have a small budget, Prolific is the best option for paid recruitment. Their participants are used to taking part in research, which means they actually pay attention and give thoughtful responses. Completion rates are typically above 90%.
What makes Prolific worth it:
- Participants who are familiar with research tasks and take them seriously
- Demographic filters so you can target by location, age, profession, or domain expertise
- Built-in quality controls and participant ratings
- Studies usually fill within 2-4 hours of launch
It costs about $1.50-2.50 per completed response including platform fees. For 25 participants, you're looking at $40-60 total — reasonable for most projects.
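The budget math is simple enough to sketch in a couple of lines, using the per-response cost range quoted above (fees included; actual Prolific pricing varies with study length and participant region):

```python
# Quick budget check for a paid panel run. The $1.50-2.50 per-response
# range is the estimate quoted above, fees included; real pricing
# depends on study length and targeting.
def panel_budget(n_participants, cost_low=1.50, cost_high=2.50):
    """Return (low, high) total cost in dollars."""
    return n_participants * cost_low, n_participants * cost_high

low, high = panel_budget(25)
print(f"${low:.2f} - ${high:.2f}")  # $37.50 - $62.50
```

For the 20-30 participants most studies need, the total stays well under $100.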
Method 5: Premium Research Panels (Paid, High-Cost)
Platforms like UserTesting charge $30-100+ per participant. That's hard to justify for a straightforward card sort. These platforms are great for complex usability testing with think-aloud protocols, but for card sorting specifically, Prolific gets you comparable quality at a fraction of the price.
Save premium panels for when you need very specific screening criteria or recorded think-aloud sessions during the sort.
Method 6: Specialist Recruitment Agencies (Paid, Niche-Required)
You'll only need a recruitment agency if you're targeting very specific, hard-to-reach people — doctors, C-suite executives, rare technical specialists. Expect to pay $50-150+ per participant and wait 1-2 weeks.
For most card sorting studies, Prolific's demographic filters cover plenty of professional backgrounds at much lower cost and faster turnaround.
How Many Participants Do You Actually Need?
Card sorting patterns tend to stabilize around 20-30 participants. After that, you're unlikely to see the groupings shift much. Here's a rough guide:
| Study Goal | Participants | What You Get |
|---|---|---|
| Quick pilot to test your study setup | 5 | Basic patterns start to emerge |
| Early pattern exploration | 15-20 | Moderate confidence in groupings |
| Solid similarity matrix | 20-30 | High confidence — enough for most projects |
| Academic or publication-quality data | 30-50 | Statistically robust results |
Tips That Actually Help
Send a reminder. A single follow-up email 3-5 days after your initial message can roughly double your response rate. Most people who are willing to participate just forgot or got busy.
Beyond that, a few things make a real difference:
- Tell people exactly how long it takes. "5 minutes" in the subject line or message gets way more clicks than a vague ask.
- Explain briefly why their input matters — people are more likely to finish when they understand the purpose.
- Make sure your link works on mobile. A lot of people will open it on their phone.
- Send a thank-you afterward. It's polite, and it helps if you ever need to recruit from the same group again.
On the quality control side:
- Test the study yourself before sending it out.
- Watch your completion rates as responses come in — if they're below 70%, your study might be too long or confusing.
- Close recruitment once you hit 25-30 responses.
- Keep an eye out for anyone who finishes suspiciously fast (under 2 minutes usually means they weren't really paying attention).
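If your tool lets you export responses, those checks are easy to automate. A minimal sketch, assuming a hypothetical export format (the field names here are made up for illustration, not from any specific card-sorting tool):

```python
from datetime import timedelta

# Hypothetical response records; "completed" and "duration" are
# assumed field names, not a real tool's export schema.
responses = [
    {"id": "r1", "completed": True,  "duration": timedelta(minutes=6)},
    {"id": "r2", "completed": False, "duration": timedelta(minutes=1)},
    {"id": "r3", "completed": True,  "duration": timedelta(seconds=90)},
    {"id": "r4", "completed": True,  "duration": timedelta(minutes=4)},
]

MIN_DURATION = timedelta(minutes=2)  # faster than this usually means low attention
TARGET_COMPLETION_RATE = 0.70        # below this, the study may be too long or confusing

completed = [r for r in responses if r["completed"]]
completion_rate = len(completed) / len(responses)
speeders = [r["id"] for r in completed if r["duration"] < MIN_DURATION]

print(f"completion rate: {completion_rate:.0%}")  # 75%
print(f"flag for review: {speeders}")             # ['r3']
```

Flagged responses don't have to be discarded automatically; a quick manual look at their groupings is usually enough to decide.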
Recruitment Checklist
- Tested the study yourself and confirmed everything works
- Recruitment message includes a specific time estimate and clear context
- Target participants actually match your real user base
- Study link works without requiring sign-up or registration
- Decided on incentives (money, credits, or just a thank-you)
- Follow-up reminder scheduled for 3-5 days after launch
- Set a target for how many participants you need and when to stop
Further Reading
- What is Card Sorting? Complete Guide
- Card Sorting (UX Glossary)
- Information Architecture (UX Glossary)
- How To Run Your First Card Sort Study
Frequently Asked Questions
How many participants do I need for reliable card sorting results? For most studies, 20-30 participants will give you stable, reliable patterns. Once you go past 30, the core groupings rarely change — you're just spending more money for marginal gains. Unless you're publishing academic research, 25 is a solid target.
Should I pay participants for card sorting studies? It depends on your situation. Paid participants through platforms like Prolific tend to complete the study more reliably and give more thoughtful responses. If you have $40-60 to spend, it's usually worth it for faster, cleaner data. But if you have access to your actual users, free recruitment can work just as well.
Can I use colleagues as participants for card sorting studies? Yes — if they're representative of your actual users. For B2B tools, internal systems, or enterprise software, your coworkers are often the perfect participants. But for consumer products, skip this. Your team thinks about your product differently than your customers do, and that bias will show up in the data.
How long should I leave card sort studies open for recruitment? Most of your responses will come in within the first 48 hours. If you're recruiting from your own user base, give it 5-7 days and send one reminder. If you're using Prolific, your study will usually fill within a few hours.
What makes Prolific better than other crowdsourcing platforms for card sorting? Prolific's participants are there specifically for research, so they tend to be more attentive and complete studies at much higher rates (90%+) than you'd see on general platforms like MTurk (more like 60-70%). You pay a bit more per person, but fewer wasted responses means better value overall.