Researchers working in commercial settings face a unique challenge: how do we ensure quality research output that paying clients can act on?

The implications of such a challenge are far-reaching. We are responsible for the quality of the research itself and for the recommendations that stem from it. Conversations around quality often focus on methodology: is it the right fit? Does it answer the key questions? Are there biases, gaps or other flaws that we can eliminate? One aspect that can sometimes be overlooked, however, is the participant experience, yet we know it can make a huge difference to the quality of responses and hence to the overall outcomes of a project.

It was this that prompted an investigation into the participant experience at large, and its impact on the research process and its outcomes. Andrew Cannon, executive director of the Global Research Business Network (GRBN), recruited researchers and partner companies keen to explore the participant experience in greater depth. GRBN is a global network of research clients and agencies that works to promote and advance the business of research by developing and supporting strong, autonomous national research associations. The project included qualitative, quantitative and community-based work, speaking with new and experienced participants in the US and the UK.

Researchers, along with our trusted field partners, act — in some ways — as the buffers between businesses and the people they are interested in speaking with. We advocate for participants by evaluating the fairness of requests for information, as well as their assigned workload. We aim to properly incentivise people for their work, despite stringent budgets. We seek to allow their truths to be freely heard in the most honest and accurate of ways.

Often, we take on this role of ‘participant advocate’ instinctively. We understand and respect that our work depends largely on the cooperation of the people we have sought out for our studies, so we protect them as best we can. Some would argue, though, that the participant experience deserves more of our consideration than the checks and balances built into our processes by default. The participant experience is, in many ways, the crux of what designates our work as quality, or not, and, by extension, of clients’ ability to rely on our work to support their decision-making.

Some of the standard practices that impact participants the most are those that have evolved the least over time. While we have made strides in terms of how we collect information (the technology has moved to more mobile-friendly options that align naturally with consumer behavior, including easy-to-use applications for video uploading and remote interviewing), we have still kept some practices in place that may undermine progress.

The qualitative aspect of the work was undertaken by Katrina Noelle of KNow Research, Kerry Hecht Labsuirs of Echo Qualitative Project Support, and myself (and the team) at MindSpark Research International. Additional support by way of recruitment and facility services was provided by Liz Diez of Acumen Research, as well as Dub, Tango Card, Netquest and WatchLab.

Benefits of dual-method approach

Through a dual-method approach, speaking to veteran research participants online and asking ‘fresh’ respondents to take part in a mock focus group that they then gave feedback on, our team of researchers gathered evidence about which parts of our work affect participants most.

A particularly valuable piece of advice concerned setting positive expectations for participants. When asked to write their own version of a study invitation, nearly all participants highlighted the fun aspects of the research and mentioned how participants’ voices could be heard and valued. This is a clear sign that we need to move past purely financial incentives and include information about the aspects of our work that make people want to get involved.

The recruitment process itself was sometimes felt to alienate participants. Several, both veteran and fresh, described the feeling of a recruiter (read: the screener) trying to ‘trick’ them, which set a negative tone and prompted a desire to trick us back. Additionally, some of the veteran respondents described going through numerous, detailed questionnaires on the phone, only to be repeatedly rejected for reasons they were not made privy to. This was felt to create an unnecessary game between the recruiter and the potential participant, where winning means answering the questions in the ‘right’ way rather than establishing a potential participant’s relevance and fit for a particular project.

As a result, we recommend a more transparent approach to recruitment: when possible, share the objectives of the study along with the profile you’re looking for, without making participants jump through hoops to ‘prove themselves’. Most participants simply want to be helpful, and we would do well to begin relationships with as much transparency as possible.

Additionally, much was discussed about pushing participants past their level of comfort — particularly by asking for depth about issues and decisions where there is no depth to be found. Some described feeling pressured by researchers to go into endless detail about packaging or purchase choices. When they had already reached the end of their capacity for detail, they made up the rest.

In this case, we suggest a more human-centred approach to questioning. Given all that behavioural economics and psychology have taught us about people’s tendency to rationalise emotional decisions and the limits of their recall for tiny details, let’s focus on the pieces of the puzzle that are most valuable to us, so that participants can contribute honestly.

The output from this initiative has ignited a conversation about much-needed changes to the work we do. The resulting conversations and their implications are being curated into a handbook that will include tips for researchers and clients alike. While some of the tips will be relatively easy to adopt, others point to areas of improvement that have been sorely overlooked and may stir further debate, especially on the qualitative side. Above all, it will demand an attitude and cultural change within our industry. It will demand that we no longer pay lip-service to participant engagement, but deliver positive participant experiences through our practices and processes.

These topics, and many more — including adjustments to standard screening practices, moderation techniques and the value of staying in touch once research is finished — are included in the GRBN participant engagement handbook, ENGAGE: 101 Tips To Improve The Research Participant User Experience, which is available at https://grbnnews.com/grbninitiatives/engage-handbook.