As we stressed in our article on how critical words are to measurement, a reliable, accurate, and useful measurement program must start with well-defined objectives and clear definitions of all the vague terminology we tend to use. It also needs a realistic schedule. If you're trying to figure out how long it will take to get a measurement system up and running, this sample phase-by-phase timeline will help. It was developed for a media analysis program, but it will serve as a template or starting point for most any measurement project:
Phase | What's Involved | Estimated Completion Time |
---|---|---|
Vendor search/RFP process | Define basic requirements. Get agreement from everyone who will use the system on what the criteria are & how they should be weighted. Agree on channels & media outlets; a vendor’s ability to collect those outlets is a fundamental criterion for success. | 1 week |
RFP period | Write the RFP & distribute it to vendors who meet your criteria. | 2 weeks |
Vendor selection | Once the proposals are in, rank them based on the criteria. If there’s no clear winner, test the top two vendors to see which one meets your needs best. | 2 weeks after proposals are due |
Contract | The contract will need to make its way through legal. Most vendors won’t start until a formal contract is in place. | 1-2 weeks after vendor selection |
Preparation | While the contract is being prepared, nail down the research specifics. Define & approve search terms. Clearly define the programs/messages/initiatives/subjects to be studied. Provide this data to the vendor so they can set up your account. | 1-2 weeks |
Set up | After the contract is signed, the vendor will set up your account. This could be a matter of hours or weeks. | 1-4 weeks from signing |
Testing | Spend at least two weeks monitoring the data to ensure it is complete, accurate, & relevant. If you’re using automated sentiment analysis, validate it against trained human coders, or at least an in-house human. If the automated and human coding agree at least 85% of the time, you can trust the data; anything less, and you should work with the vendor to modify the criteria to make sure the system is interpreting positive & negative correctly. Once you’ve identified errors, the vendor will take at least a week to correct them. Test again to make sure the fixes worked. | 2-4 weeks |
Tweaking | You will have to make adjustments; clean data takes time and effort. | 1-2 weeks |
Rollout | As the testing & tweaking phases come to an end, figure out who needs access and what type of access they need. Provide their email addresses to the vendor. If it’s an enterprise-wide system, publish guidelines for usage and conduct a basic training course. | 4 weeks |
First results reporting | The first report never goes smoothly, so don’t promise anything on a tight deadline. Don’t show results unless you’ve validated them yourself. If you’re using Q1 results as a benchmark, your cut-off date for the data is March 31. Give yourself at least a week to check it & another week to fix mistakes. Then prepare the report. Bottom line: Don’t promise Q1 results until May. | 1-3 months depending on reporting deadlines |
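The 85% agreement bar in the testing phase is easy to check yourself. Here's a minimal sketch in Python, assuming you've pulled a sample of items scored both by the automated tool and by a human coder; the labels and data below are illustrative, and your vendor's export format will differ:

```python
# Compare automated sentiment labels against human coders' labels
# and report percent agreement. Hypothetical sample data for illustration.

def percent_agreement(automated, human):
    """Share of items where the automated label matches the human label."""
    if len(automated) != len(human):
        raise ValueError("label lists must be the same length")
    matches = sum(a == h for a, h in zip(automated, human))
    return matches / len(automated)

# The same ten clips, scored by the tool and by a human coder.
automated = ["pos", "neg", "neu", "pos", "neg", "pos", "neu", "pos", "pos", "neg"]
human     = ["pos", "neg", "neu", "pos", "pos", "pos", "neu", "pos", "neg", "neg"]

agreement = percent_agreement(automated, human)
print(f"Agreement: {agreement:.0%}")  # 8 of 10 match -> 80%, below the 85% bar
if agreement < 0.85:
    print("Below 85% -- work with the vendor to adjust the sentiment criteria.")
```

In practice you'd run this on a few hundred items, not ten, and re-run it after each round of vendor fixes to confirm the agreement actually improved.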
Good luck with your measurement project; we hope this sample timeline helps. ∞