Greater pressure on budgets, and constant claims that automation and ‘big data’ are going to reduce or replace primary research, mean you’d be forgiven for feeling that the future looks bleak. But I remain optimistic. I believe that technology is pushing us towards greater integration of qual and quant, and enabling exciting new hybrid approaches.

Integration boosts credibility

It was always very easy in the past for qual and quant to exist alongside, but apart from, each other. Researchers tend to have an affinity for one more than the other, and so the two disciplines often end up siloed. But, if I reflect on my own experience, I am certain that the best projects I ever worked on were those which integrated qual and quant approaches.

I think that’s because, when conducted in isolation, qual and quant each face their own credibility problem. When presenting qualitative research findings — especially those that the client doesn’t want to hear — it can be easy for the listener to dismiss the research as “just the opinions of the 20 people who turned up”. Quant’s problem, on the other hand, is that it can be too impersonal. When a client is presented with a chart showing the aggregated opinions of 1,000 people, it can be just as easy to dismiss it as “numbers on a slide”.

However, when a client is presented with a clear story in the quant, backed up by quotes or footage of real people, it becomes impossible to ignore. Likewise when the themes arising from the focus groups are shown to have the support of 100 other participants. Quant provides scale to qual. But qual makes quant human.

In the past, integration of qual and quant meant conducting both and then combining the findings in the report, but new technologies are enabling projects which are combined from the outset. We are starting to see a convergence of survey tools, mobile ethnography and online community platforms. Platforms like CrowdLab, Revelation and Dub support mixing and matching quant surveys, media uploads and moderated discussion all within the same project.

On a combined project like this, participants might be asked to complete a survey to rate different concepts, and the next day be automatically routed into follow-up forums to discuss their favourite concepts with those who shared their opinion. Because these are digital methods, they can be more practically scaled to larger sample sizes, meaning the approach can be truly qual/quant. In other words, they can provide both a definitive answer on people’s preferences and a deep understanding of the reasons behind them.

Another example might be to ask participants to record a diary of their coffee-drinking behaviour: capturing photos of their purchases on their smartphones and recording their thoughts on the customer service.

Even starting with the sample size of a typical focus group, over the course of a week this would be expected to build into a large data set — virtually demanding we review the occasions through a quantitative lens (Which shops were visited most often? What did people tend to buy?). But we would be equally able to flip it around and build qualitative case studies on the behaviour of individual participants across the whole week (What patterns emerged? What were the motivations for these trips?).

Variety of choices

We might also decide to conduct thematic coding of the photos for the types of drinks bought — turning the media into something that can ultimately be charted — or we might review each photo with the eyes of a moderator or semiotician (What themes can we deduce from the decoration of these coffee shops? To what extent are people ordering coffee to accompany something else?).

But there’s no reason why you couldn’t scale this exact same study up to 500 participants, which takes us into some very new territory indeed. We can still explore this data qualitatively, but it would also have true quantitative robustness. Our coming challenge as researchers is to figure out how to manage this new type of data; to uncover insights at large scale without losing sight of the nuance.

Tools tailored for qual

Companies are already responding to this. Tools like Voxpopme, LivingLens and Big Sofa can make large volumes of media content manageable for qualitative analysis. Fascinating new platforms like ReMesh use AI to create the illusion of having a one-to-one conversation with hundreds of people at once, making quant feel like conducting qual.

Many survey tools are starting to support qualitative probing directly off the back of a quantitative questionnaire — meaning that interesting responses or diary entries can be used as the immediate jumping off point for qualitative discussion.

It seems clear to me that the future of research is a far more integrated one, where hybrid qual and quant projects are very much the norm. This will require new skills and more frequent collaborative working. But it should lead to more credible research findings — and ultimately our clients are not interested in whether a project is qual or quant, they just want the right answer.