David Geddes: Glad to be here. The short answer is that the standards process has come along very well; we now have basic standards in place and are encouraging people to pledge to support them.
For a longer answer, let’s start by considering a spectrum of standards and best practices for public relations measurement. (Here I’ll use the terms “standards” and “best practices” as parts of one concept.) At one end of the spectrum are the simplest standards, for example a definition of “mention” in social media analysis and a specification of how mentions are calculated. The main complexity here is how to treat company nicknames such as “Mickey D’s,” stock ticker symbols, and other name variants. This base level standard is analogous to, say, a mechanical standard for threads per centimeter on a bolt.
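To make that base-level standard concrete, here is a minimal sketch of mention counting with name-variant handling. The alias table and the `count_mentions` helper are illustrative assumptions for this example, not part of any published standard:

```python
import re

# Hypothetical alias table: each canonical brand name maps to the name
# variants (nicknames, ticker symbols) that should count as a mention.
ALIASES = {
    "McDonald's": ["mcdonald's", "mcdonalds", "mickey d's", "mcd"],
}

def count_mentions(text, aliases=ALIASES):
    """Count mentions per canonical name, folding all variants together."""
    lowered = text.lower()
    counts = {}
    for canonical, variants in aliases.items():
        total = 0
        for v in variants:
            # Word-boundary lookarounds so "mcd" doesn't fire inside "McDermott".
            total += len(re.findall(r"(?<!\w)" + re.escape(v) + r"(?!\w)", lowered))
        counts[canonical] = total
    return counts

print(count_mentions("Mickey D's (NYSE: MCD) beat estimates; McDonald's shares rose."))
```

The point of the sketch is that without an agreed alias list, two analysts counting “mentions” in the same text can legitimately report different numbers — which is exactly the inconsistency a base-level standard removes.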
At the other end of the spectrum is a hypothetical “killer standard” (or best practice) that links (i) public relations activities, (ii) outputs, (iii) engagement, (iv) cognitive change, often referenced as outtakes, (v) behaviors, and (vi) desired organizational outcomes. There remains active debate both in the profession and in the academic literature about what this type of best practice might look like.
The Coalition decided to address the base level standards first, for two reasons. First, we wanted to build cooperation, communication, and engagement among industry players by starting simply, where there was the greatest likelihood of reaching consensus. This would pave the way for handling more complex standards and best practices. This, by the way, is what the Barcelona Principles achieved earlier: They were the first time that a broad representation of industry players ever agreed to anything, apart from where to go for a drink after the conference sessions. This was no small feat, and the individuals who worked on the principles and organized the European Measurement Summit deserve much credit.
Second, the hypothetical “killer standard” would be built upon base level definitions, guidelines, and standards. Consequently, the base level seemed the right place to start. Furthermore, it is quite possible that this “killer standard” is the level where industry players should be competing for the best approach, and so there will never be a final standard in the true sense of the term.
Today, we have sets of standards for traditional media analysis, for social media analysis, for the communications life cycle (beginning with awareness and extending to advocacy), for return on investment, and for ethics in research and measurement. These have been available to the profession for almost two years for testing, for comment, and for revision. Four major corporations have adopted these standards, and are working internally and with their external agency and measurement partners to apply them. These companies will be reporting on their experience over the course of the year.
In one case, several standards for traditional media analysis were rigorously tested for reliability (Eisenmann, O’Neil, and Geddes, 2014).
From the outset, the Coalition has expected to receive and has welcomed comments and suggestions concerning these standards from businesses, non-profits, NGOs, agencies, measurement firms, and academia.
TMS: What fraction of the standards is complete, and what fraction is still in progress?
D.G.: The simple answer is that we now have a foundation, but we don’t know how many further standards will be developed.
As described by the International Organization for Standardization, standards are developed and adopted in response to the needs expressed by the marketplace. The Coalition started with a logical set of standards, many in the areas of media analysis and social media analysis. This decision was based purely on demand from industry professionals, and does not reflect a greater or lesser importance of these measures of outputs within the greater scheme of public relations measurement.
Further standards will be developed as requested by the industry. For example, should there be standards for employee communications measurement and evaluation? Or for investor relations? I would ask that practitioners who see a need for a specific standard contact the Coalition to discuss the need and next steps.
TMS: What state is the measurement industry now in? What fraction adheres, more or less, to the standards?
D.G.: Let me give you a few examples on the positive side. The quality of applications for measurement awards is higher than ever. Papers that received a Jack Felton Golden Ruler Award from the Institute for Public Relations ten years ago would not be so competitive any longer. Public relations degree programs across the country are devoting more time to public relations research, measurement, and evaluation. Consequently the new generation of public relations professionals has grounding in research design, statistical analysis, and how to plan measurement programs.
On the negative side, Wright et al. (2009) report that, based on an industry survey, AVEs remained the third most-used metric, employed by 35% of industry practitioners. Anecdotally, judges for major industry awards have told me that the lack of adequate measurement against objectives allows screening out 80% of award applications.
Compliance with standards is voluntary and will take time. The Coalition does not have and does not intend to develop any auditing or policing process. Our aspiration is that the marketplace will favor those who clearly comply with standards.
I encourage companies and other organizations to pledge their support for the standards at http://www.instituteforpr.org/researchstandards/public-relations-research-standards-pledge/
TMS: What has the feedback been from organizations? Has it been a valuable help in developing the standards?
D.G.: This is where the customer panel established by the Coalition plays a central role. The four companies that have adopted the standards—General Motors, General Electric, McDonald’s, and Southwest Airlines—report that they are using the standards for internal training, and they’ve expressed an expectation that their external partners adopt the standards. Anecdotally, one customer panel member estimated that bringing suppliers into compliance required about one hour of work with each external partner. We expect to conduct webinars and/or publish articles about the implementation experiences of these companies.
While on this subject, the Coalition would welcome additional members of its customer panel, especially those based outside of the U.S. Interested organizations should contact Frank Ovaitt at the Institute for Public Relations or myself.
TMS: What are the next steps?
D.G.: First, we would like companies, agencies, and measurement firms to learn about the available standards, pledge to support the standards initiative, apply the standards in their work, provide feedback on the standards, and, where needed, suggest new standards for the profession.
For client organizations, we would like to see an expectation that their external partners—agencies, research/measurement partners, and consultants—adhere to the standards. To this end, we are making available standard language for client organizations to include in RFPs, for example.
For agencies and research/measurement firms we would like to see that they adopt the standards, educate their internal teams about the standards, and that they pledge their support for standards. To this end, we are making available standard language for inclusion in proposals and reports concerning adherence to standards.
TMS: How can the ordinary PR pro in the trenches best make use of the new standards? Is there a good way to ease into adoption? Like maybe just use the Transparency Table?
D.G.: The transparency table is a good first step, because the table forces the practitioner to document what she or he is currently doing. Self-assessment is a necessary first step to improvement.
The standards are actually an integrated package. The first big step is the commitment. Once the individuals adopt the definitions and measurement protocols, the subsequent steps should be easier. There are many resources available, including contacting the International Association for the Measurement and Evaluation of Communication (AMEC) or the Institute for Public Relations Measurement Commission.
The standards are most directly relevant, on a day-to-day basis, to team members involved in research, measurement, and evaluation. Consequently, it may be the research group within an agency, or the person there with the most experience in measurement, who will apply the standards.
The role of an account team lead at an agency, for example, would be to insist that the team and its measurement partners, if any, use the standards. An ordinary public relations professional has lots of things to do already.
TMS: We know there is a push to Sign the Pledge. How is this going? To what extent will peer pressure be effective in increasing adoption? Do you have some kind of accreditation program for organizations? Or perhaps a Gold Star Seal of Approval?
D.G.: The opportunity to pledge support is important. We have about 30 signatories already, and we expect more as awareness grows. As the ISO notes, the marketplace drives the adoption of standards. So, in that sense, yes, we expect peer pressure and client expectations to drive the adoption of standards.
The Coalition does not have and does not plan on having an accreditation program, a certification program, or a seal of approval.
TMS: Thanks for speaking with us, David.
D.G.: My pleasure, thank you.
Eisenmann, M., O’Neil, J., & Geddes, D. (2014). An Examination of the Validity, Reliability and Best Practices Related to the Proposed Standards for Traditional Media. Available at: http://www.instituteforpr.org/topics/examination-validity-reliability-best-practices-related-proposed-standards-traditional-media-jackson-sharpe-award-winner/
Wright, D., Gaunt, R., Leggetter, B., et al. (2009). Global Survey of Communications Measurement 2009 – Final Report. London, UK: Association for Measurement and Evaluation of Communication. Available at: http://amecorg.com/wp-content/uploads/2011/08/Global-Survey-Communications_Measurement-20091.pdf
###
Now that the new public relations and social media measurement standards are in place, the next step is to encourage more companies and organizations to use them. The Institute for Public Relations keeps a running tab of the two-dozen-and-growing list of organizations that have pledged to use the standards. There's a form there that your organization can use to join them. Step up.
Public relations and social media measurement standards have received a huge amount of coverage and discussion of late (at least 20 articles here in TMS alone). But never has there been a more practical explanation and example than the sample Media Codebook for media analysis included as part of Eisenmann, O’Neil & Geddes, 2014. (See the detail below.) If you do media analysis, this example codebook is a standards primer, cheatsheet, and guide all rolled into one. If you have any confusion or doubt as to why standards are necessary, or how the heck they are actually applicable to traditional media analysis, here's your handbook.
David Geddes, Ph.D., Julie O’Neil, Ph.D., and Marianne Eisenmann, MBA, received the Jackson-Sharpe Award for the paper this Codebook is part of, their 2014 International Public Relations Research Conference submission, "An Examination of the Validity, Reliability and Best Practices Related to the Proposed Standards for Traditional Media." The Codebook is Figure 1 of the paper. The paper itself is actually the second part of a test of the reliability of the proposed interim traditional media standards, and includes an informative introduction to media analysis and standards.
Below is a detail from the Codebook, just to give a sample of what it includes (click it to see it larger):
---Bill Paarlberg, editor
###
(thanks to Texas Christian University for the image)
Last week at The Conclave, two dozen wise and experienced measurement experts worked on polishing up the social media measurement standards posted at smmstandards.org. They pondered the future: "Now that we have the basic standards in place, do we need to continue to meet?"
The answer, arriving this morning thanks to Business Insider, is Yes: "'People Don't Use Words Any More': A Teenager Tells Us How To Use Emojis Properly." Apparently, emojis are all the rage for texting among the younger set. And among the very high-brow set as well; see Fred Benenson's Emoji Dick, a translation of Moby-Dick into emoji using Amazon's Mechanical Turk.
Here's a leg up for the Conclave in developing measurement standards for emojis: iTunes sells a translator.
-- Bill Paarlberg, editor (Thanks to So Says Miss Brightside for the emoji sample.)
New research implies that a researcher's opinions will bias their analysis of public relations measurement data. Does measurement need standards for data analysis?
One of the fundamental tenets of public relations measurement is that data-informed decisions are more valuable than gut feelings. But what if people cannot be relied upon to correctly analyze the data?
New research shows that a person's ability to analyze data is strongly affected by their politics, even for those people who have demonstrated ability in data analysis. (See "Science Confirms: Politics Wrecks Your Ability to Do Math" in Mother Jones. Or download Kahan et al.'s original research right here.) The implication for public relations and social media research is that those who analyze data may bias their results based on their opinions or preferences for the outcome.
The existence of sources of bias in empirical research is not news. The Dictionary of Public Relations Measurement and Research lists three different types of bias. And we've long known that people tend to read what they want to read in research. What is new here is the implication that even "numbers people" suffer from unconscious distortion of their ability to reason from data.
The public relations profession has recently made great strides with standards for data collection. Now there is good reason to consider standards to ensure accurate and reliable data analysis.
(thanks to DrHansen.com for the image)
Bill Paarlberg, Editor of The Measurement Standard, has been writing about public relations measurement for 20 years. He is editor of the award-winning "Measuring the Networked Nonprofit" by Beth Kanter and Katie Paine, and editor of two other books on measurement by Katie Paine, "Measure What Matters" and "Measuring Public Relationships." Visit Bill Paarlberg's page on LinkedIn.
The Measurement Standard is a publication of News Group International.
On the Institute for Public Relations website are five short videos of IPR head Frank Ovaitt speaking on reaching consensus on public relations standards. This one is called "What Are Standards?" You can also view:
Amidst the recent well-deserved hoopla over progress on setting public relations measurement standards, it's proper to recall that early headway in that direction was made by Don Stacks and the IPR with the Dictionary of Public Relations Measurement and Research, originally published in 2002. Don Stacks and Shannon Bowen have now updated the Dictionary. You can download the Third Edition at the IPR website, which says:
The Dictionary of Public Relations Measurement and Research has become one of the most popular papers the Institute for Public Relations has ever published. This third edition covers an expanded number of terms, addition of social media terms and processes, and adds ethics as a category.
Congrats to Drs. Stacks and Bowen for a job well done. I had a fun and informative time browsing through this new edition. Try it, I bet you'll learn something. -- Bill Paarlberg, editor (thanks for the image to The Dictionary of Regional American English)
The slideshow below by Katie Delahaye Paine (the publisher of The Measurement Standard newsletter) will introduce you to the purpose, process, and results of the new social media measurement standards. For all the details, visit smmstandards.org. See the following articles to learn how you can start using the new standards:
In today's New York Times is an article by Katie Thomas called "Breaking the Seal on Drug Research," concerning recent efforts of academic researchers and activists working to make clinical drug trial data public. "More researchers are insisting on seeing all the data behind all clinical trials for drugs, not just the rosy reports that companies choose to release."
Remind you of anything going on in the world of public relations and social media measurement?
We might not think of members of the standards-setting groups the Coalition and the Conclave as activists, but their work to make measurement more transparent (see especially The Sources and Methods Transparency Table) is a similar effort. If publication bias (the selective publication of research results) and other results-skewing problems are rife in medical research, then there's little doubt our world of public relations research can use some tidying up as well.
Or, as the Times' article states: "Until recently, the idea that companies should routinely hand over detailed data about their clinical trials might have sounded far-fetched. Now, the onus is on the industry to explain why it shouldn’t." This kind of transparency means progress for the measurement industry, and for that we have the current standards-setting movement to thank. -- Bill Paarlberg, editor
(Thanks to socialsquare for the image.)
###
Read our other coverage of the new measurement standards: So What If We Have Standards?, Cocktail Party Measurement Standards Talking Points, and 7 Reasons Why You Should Implement the New Measurement Standards Now!
OK, so you know there have been new standards developed for public relations and social media measurement. What should you do with and about them? Below is your checklist. #1 will bring you up to speed on just what the standards are; the rest show how you can start using them.
1. Read the standards at www.smmstandards.org and at IPR's Standards Center.
Extra credit: Read WOMMA’s Influencer Guidebook and AMEC’s PR Professional's Definitive Guide to Measurement.
Extra extra credit: Post your comments, critiques, and encouragement on these efforts.
2. Issuing an RFP for measurement? Make sure that your RFP includes a requirement to comply with the standards. Have your vendors use The Sources and Methods Transparency Table.
3. Evaluate your current programs. If you already have a measurement system in place, take a hard look at all your programs. Which ones adhere most closely, and which have the farthest to go?
3a. Start with the ones that might need just a bit of tweaking to be in compliance. Maybe your definitions need to be consistent, or you could make use of The Sources and Methods Transparency Table. Make a list of your metrics and compare them to the best practice methodologies.
3b. Now look at your programs that are seriously off the standards track. Perhaps you are still using AVEs, or are measuring outputs rather than outcomes. How can you redesign these programs, or replace them with those that are standards-compliant? See 2. above.
4. Make your organization standards-compliant. Gather the communications staff and hold an educational workshop about the standards and explain why they need to change their metrics. Help them go through 3. above for their own programs.
5. Make your awards program standards-compliant. If you work for or with a professional association, encourage the awards committee to rework the awards criteria to include adherence to measurement standards as part of the “measurement of results” section. If there is a certification program, make sure that questions about the standards are on the exam.
6. Make your classes standards-compliant. If you’re an educator, rework the research syllabus to include a section on standards.
7. Let the world know that you are standards-compliant. Fly your standards standard high by indicating on your reports and organizational boilerplate that you adhere to the standards. Let us know, and we’ll list you on the honor roll at www.smmstandards.org.
(thanks for the image to Steve Harris and Openview Sales Lab)
###
Katie Delahaye Paine is Chairman, KDPaine & Partners, (a Salience Insight company), and Chief Marketing Officer of News Group International. KDP&P delivers custom research to measure brand image, public relationships, and engagement. Katie Paine is a dynamic and experienced speaker on public relations and social media measurement. Click here for the schedule of Katie’s upcoming speaking engagements. Katie and Beth Kanter are authors of the book “Measuring the Networked Nonprofit,” published last year by Wiley.
Read our other coverage of the new measurement standards: So What If We Have Standards?, 6 Ways For You To Use The New Measurement Standards: Here's Your Checklist, and 7 Reasons Why You Should Implement the New Measurement Standards Now!
The exciting new progress in public relations and social media measurement standards will be the hot topic of conversation at business-related cocktail parties in the near future. No worries about keeping up your end, though. Just review the talking points below, and you'll sound like an expert. Bonus: If you're going to have any chance at all at getting that cute new account manager to notice you, you're going to need to know enough about standards to get some positive exposure. So pay attention:
Q: What's up with the new measurement standards?
A: Multiple organizations have just finished an eighteen-month marathon effort at defining and standardizing public relations and social media measurement. (Read the new standards and comment at smmstandards.org and at the IPR's Standards Center.) These are important because until now the measurement marketplace has suffered from inconsistent and highly variable products, techniques, and results. At last the disparate industry groups have cooperated, and at last they have produced the first concrete results at standardizing the marketplace.
Cocktail party sound bite: The new standards will bring consistency to the measurement marketplace.
Q: Sounds great, but what does that mean for my business?
A: By adopting the new standards for your public relations and social media measurement programs you will improve your business outcomes -- and you’ll save time and money on measurement.
Cocktail party sound bite: Standards mean better, less expensive measurement. Which means better PR and better business.
Q: How can standards do that?
A: All the standards stress the importance of starting with clear goal definitions and deriving metrics from the goals. Tying your metrics to business goals enhances your worth to the organization, your credibility, and your overall awesomeness.
Cocktail party sound bite: The new standards will help you measure your impact on business. Counting Likes is bush league.
Q: What does it mean for the market? Are all the vendors adopting them?
A: Several organizations, including agencies like Ketchum and brands such as Southwest Airlines, have already pledged to adopt them. Many measurement vendors, including SocialEyez and Salience Insight, already adhere to all the social media measurement standards. Best thing to do is demand that your agencies and research suppliers comply.
Cocktail party sound bite: If the vendor doesn't abide by the standards, then they're selling apples instead of oranges.
Q: So what does that mean for my job?
A: It means that you will be able to compare results across programs, across vendors, and across agencies. You can now do your job better and faster: Instead of playing mix-and-match with vendors and data and results, you jump ahead to making decisions based on useful data.
Cocktail party sound bite: Wouldn't you rather use your brainpower making decisions with the data, rather than trying to get useful data from incompatible sources?
Q: So, you're saying standards are going to help my career?
A: Management will feel more comfortable investing in measurement if they don’t feel like they are tied to some mysterious sole-source proprietary system. You play this right, and your budget will go up. Heck, there's a promotion in your future.
Cocktail party sound bite: The new standards mean my future's so bright I've got to wear Google Glass. Say, you want to get out of here and go somewhere quiet and discuss standards some more?
--Katie Paine and Bill Paarlberg (Thanks to oprah.com for the image.)
###
The Paine of Measurement
About six months ago in these pages I wrote a piece about the state of measurement standards called “It’s a Bridge, It’s a Bridge!” I made an analogy with the long-anticipated bridge being built across the Piscataqua River between Portsmouth, NH, and Kittery, ME.
I’m thrilled to now be able to tell you that, in terms of social media measurement, we now have the equivalent of a shiny new bridge. (And, as it happens, the actual bridge over the Piscataqua is almost finished.)
It was quite the week for progress in measurement
On June 6th the Conclave released its standards for social media measurement, providing the industry with long-awaited standard terminology and best practices for social media measurement. (Read them at smmstandards.org, and please comment. Public comments are open until July.) It was a herculean effort by a number of very smart and dedicated people. The industry owes them a tremendous round of applause. At the same time Rob Flaherty, CEO of Ketchum, pledged that his agency would adhere to the new Coalition and Conclave standards, and that he would urge his peers in other major agencies to do the same.
Also, at the recent AMEC Madrid Summit, AMEC debuted its PR Professional's Definitive Guide to Measurement, and Don Bartholomew and Richard Bagnall presented their new Framework for Social Media Metrics and Measurement.
The goal for all of these efforts is to improve the quality and value of measurement programs, to help people and organizations “do measurement right.” We are long past the time for debating whether people should or should not measure their results. Unfortunately, in the rush to measure something -- anything -- a lot of really bad metrics and measurement programs were created. Many of the people who leapt onto bandwagons like Klout, AVE, and HITS (How Idiots Track Success) found that, sooner or later, the wagon’s wheels fell off. Generally in the boardroom. So they’ve been seeking better solutions. The efforts in Madrid and elsewhere earlier this month are those solutions.
So what if we have standards?
Of course, once the applause dies down people will ask, “So What?” And since I tell my clients and audiences to ask that question at least three times when looking at any report, it’s appropriate to preempt the skeptics and answer them here.
The most immediate So What? is that a great many people can spend a lot less time debating the proper definitions of terms and appropriate techniques for measuring social media. The standards specifically address frequently debated issues like ROI, influence scores, engagement metrics, etc. These tricky terms will now be much more consistent from project to project and vendor to vendor.
But the bigger So What? will be that clients can now make better and faster decisions. The new standards mean that the process of collecting the right data will become more -- yes -- standardized; less of a stumbling block and more of a commodity. We can now spend our brain power on analyzing the data and using it to make decisions. After all, the real purpose of measurement is to improve programs, and the new standards will get us to those improvements faster.
The next hurdle: Adoption
Of course, these benefits will depend a lot on how broadly and quickly our industry adopts the standards. Now it's time for clients, industry associations, and academia to step up to the plate. If you're responsible for award programs in PR and communications, I urge you to require that all entries adhere to the standards. If you’re a client, make adherence to standards a requirement of any RFP. And if you're an academic, stop teaching the silly stuff and make sure your graduates understand the existing standards. With luck, they’ll be able to advance them in the future.
Wishing you large measures of success,
New standards for social media measurement have recently been developed by a diverse group of organizations, and largely as a result of the Barcelona Principles. Read the standards and about the process of their development at www.smmstandards.org. And please post your comments there; the official comment period is open through July.
Here are seven reasons why you should start using these new measurement standards in your work right away:
###
Any vendor or agency claiming to have a “universal measure” of anything having to do with social media.
The latest examples of this type of villain are Starcom MediaVest Group and ShareThis, who earlier this week announced that they've used the magic of algorithms to blend two proprietary indices into a “universal measure of social activity on the web.”
My favorite response to this was from Michelle Powers Godfrey: “If only I could call a quarter a dollar!” Yep. You know, I’ve been trying to call my Honda Element an Audi TT for years, and somehow it never seems to drive like an Audi.
Just because you call something a standard doesn't make it so!
A couple of years ago we ranted about Klout calling itself “the standard measure of influence." And, when a group of very smart people from companies like Procter & Gamble and GM and McDonald's and SAS and Thomson Reuters and a whole host of professional organizations came together to talk about setting real standards, the one thing they could all agree on was that Klout was not a measure of influence.
Porter Novelli made the same mistake when it launched PN Sonar last month, saying it tracks “the totality of the conversation.” It may well be a wonderful algorithm, but no, it does not track the conversations that go on in my living room, or in Business Class on a flight to London, or around a soccer field. It doesn’t track the text messaging that goes on between my friend’s teenage daughters, or the private conversations that take place on Facebook. Nor does it track emails, direct messages on Twitter, or private messages on LinkedIn. What it does is track about 1% of the “totality” of conversations.
The Conclave came together in large part out of frustration with declarations like the ones Starcom, Porter Novelli, and Klout made. We knew the claims were bogus; we just didn’t have a better alternative. Now we do: it’s called www.smmstandards.org. The Conclave and other groups are working together to hammer out standard language for measurement.
Standards are not set by huddling in back rooms dreaming up proprietary algorithms. Standards are set by hard-working practitioners who take time out of their busy schedules to come together and address in a very thoughtful way the things that are important to our industry.
That is a true universal standard.
###
When Research and Standards Collide
by Katie Delahaye Paine

There were two papers at IPRRC 2013—one by Marianne Eisenmann and Julie O'Neal, and one by Sean Williams—that have a significant impact on the Conclave's pursuit of public relations and social media measurement standards.
Eisenmann and O'Neal: Standards for Media Analysis Prove Unreliable—Or Do They?
The paper with the greatest implications for standards-setting efforts was by Marianne Eisenmann and Julie O’Neal: “Testing the reliability of metrics proposed as standards for traditional media analysis.” The results at first appear to be unpleasant news, but they are open to a couple of different interpretations.
The purpose of the research was to test the reliability of coding decisions made by human coders using the standard definitions proposed by the Institute for Public Relations in its June 2012 paper "Proposed Interim Standards for Metrics in Traditional Media Analysis." That paper specified definitions for:
Eisenmann and O'Neal prepared a detailed set of instructions that specified definitions for each element to be coded. The coding books were reviewed by three PR practitioners experienced in media coding and measurement, who made minor modifications.
Then the researchers collected a systematic random sample of clips about Wal-Mart between July 1, 2011 and June 30, 2012, ultimately analyzing 106 items.
Three graduate students with some experience in public relations were selected to do the actual coding and were trained for about two hours. After the training, the three coders and two researchers independently coded seven stories, and the results were compared to identify discrepancies. The coding book was modified accordingly. A second set of five items was also subjected to a pretest to make sure that the more subjective qualitative items were coded correctly.
After the entire sample was coded, reliability results showed the highest level of agreement among coders on elements like media type, prominence, and shared vs. sole mention. There was low to moderate agreement when coding for the presence of reputational messages, and the lowest agreement of all was on sentiment.
One way to interpret these results is that the standards as currently written yield unreliable results. But I interpret them differently, from the perspective of someone who has managed and trained human coders for more than two decades and has written, reviewed, and tested thousands of coding instructions in that time: developing accurate coding is a complicated process. It requires a code book that has been tested for at least a month, and even the best coders need six months of testing and training before they can be relied upon to code accurately on their own. Training on that scale was obviously not within the scope of this study.
Thus I don't recommend using this study to draw sweeping conclusions about the fate of the standards. That said, I would be seriously remiss if I didn’t report these results.
So, it is possible to draw a couple different conclusions from this study:
1. The definitions of sentiment and the use of latent sentiment analysis are fundamentally flawed, or,
2. The definitions are correct, but doing the study right would take more experienced coders and more training time than were available.
I’m going with #2.
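Intercoder reliability of the kind this study reports is commonly quantified with simple percent agreement and a chance-corrected statistic such as Cohen's kappa. As a rough illustration only—the coder data below is invented, not from the Eisenmann and O'Neal study—here is a minimal sketch in Python:

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Share of items on which two coders assigned the same code."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(coder_a)
    observed = percent_agreement(coder_a, coder_b)
    freq_a = Counter(coder_a)
    freq_b = Counter(coder_b)
    # Chance agreement: probability both coders pick the same code at
    # random, given each coder's marginal code frequencies.
    expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                   for c in set(coder_a) | set(coder_b))
    return (observed - expected) / (1 - expected)

# Hypothetical sentiment codes from two coders on the same ten items.
a = ["pos", "pos", "neg", "neu", "neg", "pos", "neu", "neg", "pos", "neu"]
b = ["pos", "neu", "neg", "neu", "pos", "pos", "neu", "neg", "neg", "neu"]

print(percent_agreement(a, b))          # 0.7
print(round(cohens_kappa(a, b), 3))     # 0.552
```

Note that the kappa value is well below the raw agreement, which is exactly the gap such studies are probing: agreement that could arise by chance doesn't count. Full-scale content-analysis studies with more than two coders typically report Krippendorff's alpha instead.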
Williams: Social Media Influence
Sean Williams, like many of us, has long been disturbed by the various claims out there regarding influence in social media. His paper, "Is that all there is? A literature review and potential approach to measuring influence in social media," addressed the following aspects of influence:
He conducted a massive literature search on the subject, referencing everyone from Elihu Katz and Paul Lazarsfeld, the communications theorists whose landmark work appeared in 1955, to Valdis Krebs, who has led the charge in defining online influence as it relates to social network analysis.
His research dovetails nicely with the current efforts of the Conclave and WOMMA on the topic. What the standards effort has revealed is just how challenging it can be to achieve consensus on such a multi-faceted topic. One can argue that there are 7 billion influencers on the planet, or that there are just a handful—it depends on the topic or product. Williams’ paper outlines a framework for a far more exhaustive research study that could be the first step towards more definitive consensus around this subject.
Learn more at this summary of his paper on the IPR website, where you will find a link to the actual paper.
###
This webinar from Carma International presents Dr. Jim Macnamara, Professor and Deputy Dean at University of Technology, Sydney. He identifies and explores several key concepts of public relations measurement, including engagement, relevant influence, and impact/value. It's really a broad review of what public relations does and how it can be evaluated. It also includes an overview of current standards efforts, and a model of how PR efforts link to business outcomes. --WTP
The Word of Mouth Marketing Association (WOMMA) has submitted proposed standards for the definition of "influence" and related terms. In part, they read:
WOMMA defines Influence as:
The ability to cause or contribute to a change in opinion or behavior.
Where the initial actor is a Key Influencer who is:
A person or group of people who possess greater than average potential to influence due to attributes such as frequency of communication, personal persuasiveness or size of and centrality to a social network, among others.
Key Influencers interact with others and those they influence are Influencees:
A person or group of people who change their opinion or behavior as the result of exposure to new information.
See the entire post at the #SMMStandards site, and leave your comments and suggestions there.
###
The Digital Analytics Association, in conjunction with SMMStandards, has proposed definitions for the terms "item," "mention," "reach," and "impressions." You can view and post your comments on these definitions at the #SMMStandards site. Here's a quick summary:
-- ITEM: An ITEM of content is a post, micro-post, article, or other instance appearing for the first time in a digital medium.
Comments: This definition of ITEM replaces “clip,” “post,” and other unclear terminology. ITEMs of content refer to the content vehicle in its entirety, which means that a single ITEM can contain multiple MENTIONs and derivatives. Derivatives of ITEMs, such as comments, likes, etc., should not be counted as additional ITEMs.
-- MENTION: A MENTION refers to a brand, organization, campaign, or entity that is being measured.
Comments: MENTIONs are typically defined in Social Media using Boolean search queries. These queries may include AND as well as OR statements to capture specific brand, campaign, or subject matter topics, as they pertain to the goals of the search objective. Further, MENTION queries may also include NOT statements to filter off-topic MENTIONs from the data set.
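To make the Boolean-query idea concrete, here is a minimal sketch—the brand terms, items, and function name are all hypothetical, not part of the proposed standard—of how OR, AND, and NOT clauses might filter ITEMs down to on-topic MENTIONs:

```python
def matches_query(text, any_of, all_of=(), none_of=()):
    """Return True if the text satisfies a simple Boolean MENTION query:
    at least one OR term, every AND term, and no NOT term (case-insensitive)."""
    t = text.lower()
    return (any(term.lower() in t for term in any_of)
            and all(term.lower() in t for term in all_of)
            and not any(term.lower() in t for term in none_of))

# Hypothetical query: brand name or common spelling variant,
# excluding stock-ticker coverage.
items = [
    "Walmart opens a new distribution center in Ohio.",
    "Shares of WMT rose 2% in early trading.",
    "Wal-Mart and Target both reported earnings.",
]
query = dict(any_of=["Walmart", "Wal-Mart"], none_of=["WMT"])
mentions = [i for i in items if matches_query(i, **query)]
print(len(mentions))  # 2
```

A real monitoring tool's query language is richer than this (phrase matching, proximity operators, language filters), but the logical structure—OR terms to capture name variants, NOT terms to strip off-topic hits—is the same.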
-- REACH: REACH represents the total number of unique people who had an opportunity to see an ITEM or a valid reproduction of that ITEM across any digital media.
Comments: REACH is typically quantified using social media monitoring tools, social platforms, and/or panel-based measurement solutions. Each tool, platform, and solution may have a unique method of calculating REACH. For this reason it’s critical to use the Transparency and Methods table to identify data collection sources.
-- IMPRESSIONS: IMPRESSIONS represent the number of times an ITEM was displayed.
Comments: IMPRESSIONS represent the gross number of items that could have been seen by all people, including repeats. The term “displayed” applies across channels, browsers, devices, and other methods by which an individual might see an item.
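The REACH/IMPRESSIONS distinction reduces to unique people versus total displays. A toy sketch, assuming a hypothetical display log of (user, item) rows, one row per time an ITEM was shown:

```python
# Hypothetical display log. Repeat views by the same person count toward
# IMPRESSIONS but not toward REACH.
display_log = [
    ("u1", "post-1"), ("u1", "post-1"),  # u1 saw post-1 twice
    ("u2", "post-1"),
    ("u2", "post-2"),
    ("u3", "post-2"),
]

impressions = len(display_log)                  # every display counts
reach = len({user for user, _ in display_log})  # unique people only

print(impressions)  # 5
print(reach)        # 3
```

In practice no tool sees a clean log like this; each platform estimates unique people differently, which is exactly why the definitions above insist on disclosing the data source.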
###
(This post is an excerpt from an earlier Measurement Standard post "The State of Measurement Standards January 2013: It’s a Bridge, it’s a Bridge!" It is reproduced here to give readers a quick way to find standards on how to calculate tone or sentiment. See the earlier post for more background, detail, and standards.)
The Coalition has released two standards-setting papers for the PR industry. The first, “Proposed Interim Standards for Metrics in Traditional Media Analysis," by Marianne Eisenmann, offers recommendations for how to calculate some of the most commonly debated data points in traditional media analysis.
The paper proposes standard definitions for assessing the quality of media coverage, including visuals, placement, prominence, message penetration, and spokesperson effectiveness. And it reiterates that AVEs should not be used as a measure of the value of media coverage.
How to Calculate Tone or Sentiment
Define for what or whom you want to determine sentiment. You may be looking to understand tone regarding an industry or sector, or sentiment around a specific product or service, individual, or organization. A single article could mention all of these, so it is necessary to define specifically which element(s) you are targeting for sentiment.
Define from whose perspective you are judging the sentiment. It could be the point of view of the general public, or of a specific stakeholder group such as investors, physicians, teachers, parents, etc.
Whatever process is defined and applied, it must be used consistently throughout any analysis.
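One way to enforce that consistency is to make the target and the perspective explicit fields of every sentiment code, so neither can drift partway through an analysis. A sketch using hypothetical names and values:

```python
from dataclasses import dataclass

@dataclass
class SentimentCode:
    """One sentiment judgment, tied to an explicit target and perspective
    so that codes for different entities in the same item stay separate."""
    item_id: str
    target: str       # what the sentiment is about (entity, product, sector)
    perspective: str  # whose point of view the judgment is made from
    sentiment: str    # e.g. "positive", "neutral", "negative"

# Hypothetical: one article coded twice, once per target entity, because a
# single item can be positive for one target and negative for another.
codes = [
    SentimentCode("item-42", "Wal-Mart", "investors", "positive"),
    SentimentCode("item-42", "retail sector", "general public", "negative"),
]
print(len({c.target for c in codes}))  # 2
```

Structuring codes this way also makes the consistency rule auditable: any record whose perspective differs from the one defined for the analysis can be flagged mechanically.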
Sentiment coding options:
###
Please note: We encourage you to comment on these developing standards. However, we suggest you go to the IPR site and read the entire paper first, then post your comment there.
(This post is an excerpt from an earlier Measurement Standard post "The State of Measurement Standards January 2013: It’s a Bridge, it’s a Bridge!" It is reproduced here to give readers a quick way to find standards on what items should be included in analyses. See the earlier post for more background, detail, and standards.)
The Coalition has released two standards-setting papers for the PR industry. The first, “Proposed Interim Standards for Metrics in Traditional Media Analysis," by Marianne Eisenmann, offers recommendations for how to calculate some of the most commonly debated data points in traditional media analysis.
The paper proposes standard definitions for assessing the quality of media coverage, including visuals, placement, prominence, message penetration, and spokesperson effectiveness. And it reiterates that AVEs should not be used as a measure of the value of media coverage.
Items for Analysis: What Counts as a Media "Hit"?
A story counts only if it has passed through some form of “editorial filter,” i.e., a person has made a decision to run or not run the story. An item is:
What does not count?
###
Please note: We encourage you to comment on these developing standards. However, we suggest you go to the IPR site and read the entire paper first, then post your comment there.