Measurement Mistakes You Don't Want To Make
How To Really Mess Up Your PR Measurement Program
Ten fatal PR research mistakes and how to avoid them.
Despite the best-laid plans, public relations measurement programs can sometimes go awry. You can't always anticipate how everything will go, and your elegant research design rarely plays out quite as planned. Let's face it: unforeseen problems and errors can creep in here and there, and part of your job is to figure out how to get the job done anyway.
But there are certain errors your program just won't survive. These mistakes will ruin your data or analysis and leave you with no option but to learn an expensive lesson and start over. Here are ten fatal research errors you now won't have to learn the hard way (and, yes, these will be on the exam):
1. Clipping systems that miss clips
We won't name names, but you should regularly test your provider. Do what we call a "Pub/Month" check: look back over the stats for the last year and see, on average, how many articles you get in your key publications. If you are below that average for the current month, or if you have zero clips for the month, someone is probably missing your clips.
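If your clip counts live in a spreadsheet export, the Pub/Month check is easy to automate. Here is a minimal sketch; the publication names and counts are illustrative, and the exact threshold (flagging any month below the trailing twelve-month average) is one reasonable choice, not a standard.

```python
# Sketch of the "Pub/Month" check: flag key publications whose clip
# count this month is zero or below their 12-month average.
# All data below is made up for illustration.

monthly_counts = {
    "The New York Times": [4, 6, 5, 3, 7, 5, 4, 6, 5, 4, 6, 5],  # last 12 months
    "Wall Street Journal": [2, 3, 2, 4, 3, 2, 3, 2, 3, 4, 2, 3],
}
current_month = {"The New York Times": 0, "Wall Street Journal": 3}

def pub_month_check(history, current):
    """Return (publication, this_month, average) tuples for publications
    whose current count is zero or below their historical average."""
    flags = []
    for pub, counts in history.items():
        avg = sum(counts) / len(counts)
        got = current.get(pub, 0)
        if got == 0 or got < avg:
            flags.append((pub, got, round(avg, 1)))
    return flags

for pub, got, avg in pub_month_check(monthly_counts, current_month):
    print(f"Check your provider: {pub} has {got} clips vs. a {avg}/month average")
```

Run monthly; anything the function flags is a cue to call your clipping service before the gap pollutes a whole quarter of data.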
2. Dirty data from your content provider
This means errors like not differentiating between nytimes.com and The New York Times. Again, check the data on a monthly basis to make sure it includes what it's supposed to.
3. Bad circulation figures (impressions)
Being off by 10 or even 100 hardly matters, but we've seen cases where providers moved a comma and made the NY Times circulation 14 million instead of 1.4 million. Do a reality check.
4. Corporate articles that end up in product categories and vice versa
Check the categorization monthly, or even weekly for the first six months, to make sure it reflects reality.
5. An unclear definition of tonality
Ask three people what a positive article is and you'll get three different answers. We define it as one that "leaves the reader more likely to do business with, invest in, or go to work for the company." How you define it is your own business; just make sure it's consistent.
6. An unclear understanding of key messages
Again, do a monthly reality check.
7. Not comparing apples to apples in a competitive analysis
This includes errors like looking at your own local coverage but not the local coverage of your competition.
8. Not being clear about the universe of publications
Draw up a written list of search terms as well as a list of the publications (the universe) to be covered.
9. Not having total control of the names and mailing list for your survey
Beware of merging lists: you can end up with two surveys in one household just because the middle initial is left off one name but not the other.
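Catching that middle-initial trap is a matter of normalizing names before you merge. A minimal sketch follows; the record fields and the normalization rule (strip punctuation, lowercase, drop single-letter middle initials) are illustrative assumptions, and a real survey list would need more careful address matching.

```python
import re

def dedupe_key(record):
    """Build a merge key so that 'John Q. Smith' and 'John Smith' at the
    same address collapse to one entry. Single-letter name parts (middle
    initials) are dropped; punctuation and case are normalized."""
    parts = [p for p in re.sub(r"[^\w\s]", "", record["name"]).lower().split()
             if len(p) > 1]  # drops initials like "q"
    return (" ".join(parts), record["address"].lower().strip())

def merge_lists(*lists):
    """Merge several mailing lists, keeping one record per household key."""
    seen, merged = set(), []
    for lst in lists:
        for rec in lst:
            key = dedupe_key(rec)
            if key not in seen:
                seen.add(key)
                merged.append(rec)
    return merged
```

The point is not this particular rule but that some explicit, written rule exists before the merge, so you control exactly when two names count as one respondent.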
10. Not being clear about what social media you want to measure
Are you interested in user reviews, Facebook, MySpace, listservs, blogs, or all of the above?