There are a number of aspects to carrying out media content analysis, but when it comes to methodology, the debate essentially comes down to three schools of thought: an automated, a human, or a blended approach. The link below connects to an interesting discussion on this subject between two highly experienced and well-respected analysis industry figures in the UK: Mark Westaby (Metrica & Spectrum) and Mike Daniels (Report International). http://www.research-live.com/features/tracking-online-word-of-mouth-the-people-vs-machines-debate/4000156.article
Mark argues the case for automation, while Mike supports the pro-human side of the debate. But rather than commenting on or rehashing their discussion, this posting presents (very) short versions of some of the most common arguments in the man-versus-machine debate.
Perhaps the primary argument for an automated approach to media content analysis rests on the very reasonable claim that computers are superior to people when it comes to processing consistency, which means fewer quality-control issues than with human analysis. Computers are also arguably cheaper to “run” than people, so automating an analysis task brings – or should bring – very substantial cost benefits. And lastly, computers are fast and can cope with vast flows of incoming data – certainly more than any person or team of people could reasonably be expected to process in the same time frame. So automated systems can deliver information in a more timely manner, and do so regardless of the size of the initial data set to be studied.
However, the downside to fully automated research solutions is that, to date at least, they’ve been nowhere near as accurate as their human counterparts when it comes to subjectively assessing the meaning of various types of content. And this is a big problem if an organization is relying on this kind of qualitative research to inform strategic planning or tactical actions. Poor research can mean poor decisions, flawed plans, and undesirable business outcomes. And if that happens, any cost savings achieved by taking on an automated service suddenly seem insignificant against the broader cost of failure.
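To make the accuracy problem concrete, here is a deliberately minimal sketch (all keyword lists and the headline are hypothetical, not taken from any real system) of the kind of keyword-counting sentiment scoring that purely statistical tools can reduce to, and of how it misreads tone that a human would catch instantly:

```python
# Hypothetical sketch: why naive automated sentiment scoring struggles.
# A simple keyword counter treats language as a statistical artefact,
# so negation, doubt, and context are invisible to it.

POSITIVE = {"great", "success", "praised", "wins"}
NEGATIVE = {"failure", "criticised", "scandal", "loses"}

def naive_sentiment(text: str) -> int:
    """Score an article by counting positive vs negative keywords."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

# A human reader sees this headline as negative coverage; the keyword
# counter scores it as positive because the word "success" appears.
headline = "Analysts doubt the product will ever be a success"
print(naive_sentiment(headline))  # → 1 (scored positive, despite the tone)
```

Real commercial systems are of course far more sophisticated than this toy, but the underlying challenge – meaning is contextual, not just lexical – is the same one driving the debate.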
The Human Touch
In contrast to computers, people are inherently good at processing language: at understanding and contextualizing what’s being said about a subject when we read about it. Certainly we’re better at this particular task than computers, whose programs often attempt to understand language as a statistical rather than a social/cultural construct.
But depending on where in the world we live, and our level of expertise, we’re also an expensive resource. And when it comes to carrying out repetitive tasks, we humans are also terribly slow in comparison to our binary-oriented counterparts. Plus, we tend to make way more clerical mistakes. So why do people persist with using humans to do content analysis?
I believe it’s because we can deliver high levels of interpretative accuracy, and we’re ‘wired’ in ways that give us the ability to develop useful insights from what we read, hear and see.
The bottom line is that with technology at its current level, when you absolutely have to get it right and be presented with valuable insights, a good human will beat a good computer every time. Now if only we could be more affordable!
To close the cost-gap between computers and humans, and to make it feasible for organizations to continue to receive human-prepared content research, many analysis houses have off-shored their processing to low-labour-cost centres around the world. This allows them to offer at least some human processing at a comparable cost to many advanced automated solutions.
Unfortunately, off-shoring one’s human analysis business – especially to teams of readers whose native language is not English – may create issues when English-language content is being analysed. These include a lack of in-depth understanding of the discussion being played out in the media, and an inability to pick up on specific, critical cultural and contextual nuances embedded in the content. Less common are the security concerns associated with having foreign nationals carry out sensitive government and commercial competitive analytics, but these do arise from time to time.
However, purely in terms of cost, off-shoring analysis makes sense: it provides human resources at cost levels probably comparable to those of the powerful computers needed to process large volumes of complex content. And let’s not forget that, apart from the computers themselves, analysis houses relying on large-scale automation also have to factor in the expense of the software packages they run, the people required to keep the systems humming, the data “pipes” needed to feed media content to them, and the specialized spaces needed to house the whole installation.
A Blended Approach
Blending refers to a research approach that applies both computer and human resources to the task of content analytics – and I don’t just mean giving each human reader a desktop device loaded with a spreadsheet, a simple database, and a word-processor! Real blending involves the development and use of so-called expert systems: highly specialised computer software that can learn, and that complements and multiplies the effectiveness of its human research ‘partners’.
If this all works as intended, man and machine team up to deliver research outcomes that are at once fast, consistent, accurate, insightful, and cost-effective. In short, a blended approach has the potential to deliver all the benefits of both automated and human approaches, with few or none of the downsides that they exhibit separately.
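One common shape for such a blended workflow is confidence-based triage: the machine classifies everything, and only the items it is unsure about are escalated to human analysts. The sketch below is purely illustrative – the threshold, the classifier, and all the sample items are assumptions, not a description of any particular vendor’s system:

```python
# Hypothetical sketch of a blended workflow: an automated classifier
# scores each item with a confidence value; only low-confidence items
# are routed to human analysts. All names and values are illustrative.

CONFIDENCE_THRESHOLD = 0.8  # assumption: tuned per study in practice

def machine_classify(text: str) -> tuple[str, float]:
    """Stand-in for an automated classifier returning (label, confidence)."""
    # Toy rule: clear-cut keywords get a confident label; everything
    # else gets a low-confidence guess.
    if "recall" in text.lower():
        return ("negative", 0.95)
    return ("neutral", 0.55)

def blended_pipeline(items: list[str]) -> dict:
    auto, human_queue = [], []
    for item in items:
        label, confidence = machine_classify(item)
        if confidence >= CONFIDENCE_THRESHOLD:
            auto.append((item, label))    # machine result accepted as-is
        else:
            human_queue.append(item)      # escalated to a human reader
    return {"auto": auto, "human_queue": human_queue}

result = blended_pipeline([
    "Carmaker announces recall of 40,000 vehicles",
    "Company hosts annual community fun run",
])
print(len(result["human_queue"]))  # → 1 item escalated for human review
```

The appeal of this division of labour is that the machine absorbs the high-volume, clear-cut material, while human judgement is spent only where interpretation is genuinely needed.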
This might seem like an ideal solution from most clients’ perspectives (and it probably is), so the question arises: why isn’t advanced blended analytics practiced more often?
The answer is simple. From the research company’s point of view, creating a proven high-quality blended analysis capacity is a real pain in the P&L.
Leading-edge blended research systems are expensive to build, can take a long time to design and perfect, and require access to skill-sets not usually found in your typical media analysis house, PR agency, or media monitoring company. And all that serves as a substantial barrier to entry into the blended research space for many research houses.
But because blended analysis systems can deliver such substantial benefits to professional communicators, it’s worth your while to explore this methodological option whenever and wherever you can find it.
More soon ...