Today's NYTimes has an article about the huge disparities in U.S. immigration courts, with some surprising data on how widely asylum decisions vary between court locations and between judges within the same court. In a system that is meant to apply uniform judgements across cases, there is some serious variability in decision-making: "...Colombians had an 88 percent chance of winning asylum from one judge in the Miami immigration court and a 5 percent chance from another judge in the same court." And: "...someone who has fled China in fear of persecution and asks for asylum in immigration court in Orlando, Fla., has an excellent — 76 percent — chance of success, while the same refugee would have a 7 percent chance in Atlanta."
We mention this here as a warning to those who do PR measurement and depend on human judgements for their data or analysis. What if, for instance, you were doing media content analysis and your human coders showed as much variability as those judges? Good luck getting useful results there.
Of course, everyone doing media analysis guards against that sort of bias by doing intercoder reliability assessment, right? Right?
Wrong, apparently. In fact, CARMA claims: "CARMA Asia Pacific is the only commercial media analysis firm that carries out intercoder reliability assessment."
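For those wondering what such an assessment actually involves: in its simplest form, you have two coders code the same sample of items and then compute a chance-corrected agreement statistic such as Cohen's kappa. Here is a minimal sketch in Python; the tone codes below are made up purely for illustration and are not drawn from any firm's actual data or methodology.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders who rated the same items."""
    n = len(coder_a)

    # Observed agreement: share of items both coders labeled identically.
    p_observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

    # Expected chance agreement, from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_expected = sum(
        (freq_a[label] / n) * (freq_b[label] / n)
        for label in set(freq_a) | set(freq_b)
    )

    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical tone codes assigned by two coders to the same ten articles.
coder_1 = ["pos", "pos", "neg", "neu", "pos", "neg", "neg", "neu", "pos", "neg"]
coder_2 = ["pos", "neg", "neg", "neu", "pos", "neg", "pos", "neu", "pos", "neg"]

print(f"Cohen's kappa: {cohens_kappa(coder_1, coder_2):.2f}")  # about 0.69
```

A kappa near 1 means the coders are essentially interchangeable; a kappa near 0 means their agreement is no better than chance, which is roughly where those Miami judges would land.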
Doesn't that make you wonder about the reliability of your data? --WTP
Just to set the record straight... Carma is NOT the only firm that does intercoder reliability testing. KDPaine & Partners does it, as does Report International. They may be the only one in Asia Pacific, but I believe Mary McNamara also does it in New Zealand, which means they can still lay claim to Asia, we suppose.
Posted by: KDPaine | May 31, 2007 at 11:41 AM
Cormex Research in Canada has been conducting intercoder reliability testing as well.
Posted by: Andrew Laing | June 04, 2007 at 01:47 PM
Again ... just for the record, Cubit Media Research, working in Australia, Asia, the US and Europe, has designed an advanced multi-layered data-scrub technique which it applies to all of its work. The benefit of this approach is that it delivers far higher qualitative accuracy and consistency than any other intercoder comparison system, at a fraction of the cost.
Posted by: Warren Weeks | June 04, 2007 at 08:35 PM
Thanks, Katie, for your clarification. Sorry I'm a bit late commenting :( At Report International, we do indeed implement intercoder testing. We have found this especially critical in international programmes, where a key goal is to ensure maximum consistency across very diverse data sets... In fact, it was the results from our initial intercoder tests some 10 years ago that prompted us to enhance the granularity and accuracy of our data by developing and implementing our current coding methodology.
Posted by: Mike Daniels | July 09, 2007 at 04:54 AM