As usual, the International Public Relations Research Conference (IPRRC) in Miami last week left my head spinning with new ideas, new data and new friends. I'll be writing more about some of the specific papers, but the highlight for me was the paper by Marianne Eisenmann (Chandler Chicco Companies), Julie O'Neil (Texas Christian University) and David Geddes (Geddes Analytics, LLC). They presented the results of two years' worth of testing on the proposed standards for media analysis. Perhaps the most critical requirement for getting standards adopted is validation, and this research has provided it.
Their study showed that with trained readers and a good code book, you can get reliable results using the standards for measuring PR that were proposed by the Institute for PR and endorsed by four major corporations last fall.
At last year's IPRRC, they presented results of a multi-reader test to see whether you can get accurate coding by following the proposed standards. The short answer was, in fact, no, you can't. However, the test was done with untrained student coders, and the results as measured by Krippendorff's alpha (a standard test for intercoder reliability) showed that there was little agreement between the coders based on the standards as written. I was one of several in the audience who raised an objection to that conclusion, because I've always believed training and experience were key to good coding.
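For readers curious about what the reliability statistic actually measures, here is a minimal sketch of Krippendorff's alpha for nominal (categorical) codes, such as tone ratings assigned to articles. This is the textbook formula, not the authors' own analysis code; the function name and data layout are my own illustration. Alpha is 1.0 for perfect agreement, near 0 when agreement is no better than chance.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal data.

    units: a list of lists; each inner list holds the codes assigned
    to one unit (e.g. one article) by however many coders read it.
    """
    # Build the coincidence matrix: every ordered pair of codes
    # within a unit contributes 1/(m - 1), where m is the number
    # of codings that unit received.
    o = Counter()
    for codes in units:
        m = len(codes)
        if m < 2:
            continue  # a unit coded only once carries no reliability info
        for c, k in permutations(codes, 2):
            o[(c, k)] += 1 / (m - 1)

    # Marginal totals per code value.
    n_c = Counter()
    for (c, _k), v in o.items():
        n_c[c] += v
    n = sum(n_c.values())
    if n <= 1:
        return 1.0

    # Observed disagreement: off-diagonal mass of the coincidence matrix.
    d_o = sum(v for (c, k), v in o.items() if c != k)
    # Expected disagreement under chance, given the marginals.
    d_e = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k) / (n - 1)
    if d_e == 0:
        return 1.0  # coders used only one code value; no variation to disagree on
    return 1 - d_o / d_e
```

For example, three units on which every coder agrees yield an alpha of 1.0, while mixed agreement and disagreement pulls the value down toward zero, which is roughly what the untrained-coder test showed.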
The authors took our suggestions and ran the study again, but this time they used experienced coders and a much more detailed code book (download the codebook here). The Krippendorff alphas vastly improved, indicating a much higher level of agreement between the three coders. You can read all the details here, but suffice it to say that anyone who is even thinking about measuring or tracking their media should read this. More importantly, if you are considering a vendor, make sure that the vendor conforms to these new standards.