Definitions can have a profound effect on measurement, and the devil is always in the details. Clearly defined analytics measures can save you a great many headaches later, but it can take a bit of detective work to get to the bottom of what goes into those definitions.
Here's a real-world example of how uncertainty in definitions can cause complications. Working in digital analytics for a group of TV networks, in a world where TV Everywhere was rapidly becoming the focus, videostarts became an important measure for our businesses. It seems simple enough: it's the start of a video, right? Well, kinda sorta. After looking at our event counts in Google Analytics and comparing total events to unique events, there seemed to be a disconnect between the counts - the unique events were disproportionately lower than the totals. After considerable discussion with the developers and content managers, the question arose: just how were we defining videostarts in GA? After a few minutes of discussion, it hit us - we were counting every incidence of a videostart occurring on our video streams. What did this mean? Every start was being counted, whether the user started the video, restarted it, trick-played it (rewound, fast-forwarded, or paused), and so forth. Videostart needed to be clearly defined - and in alignment with what was important to the business.
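To make the total-versus-unique disconnect concrete, here is a minimal sketch of the distinction. The event names and fields are illustrative assumptions, not the actual GA schema or our production tracking code: it simply shows how counting every start event diverges from counting one start per stream view once restarts and trick play are in the mix.

```python
# Hypothetical event log: 'action' and 'stream_id' are illustrative
# field names, not Google Analytics' real schema.
def count_starts(events):
    """Return (total starts, unique streams started)."""
    total_starts = 0      # every videostart the player fires
    streams_started = set()  # at most one start per stream view
    for e in events:
        if e["action"] == "videostart":
            total_starts += 1
            streams_started.add(e["stream_id"])
    return total_starts, len(streams_started)

events = [
    {"stream_id": "s1", "action": "videostart"},  # user presses play
    {"stream_id": "s1", "action": "pause"},
    {"stream_id": "s1", "action": "videostart"},  # resume fires another start
    {"stream_id": "s2", "action": "videostart"},
]
total, unique = count_starts(events)
# total counts the resume as a second start; unique does not
```

In this toy log, `total` comes out higher than `unique` for exactly the reason we saw in our own data: the player re-fires the start event on restarts and trick play.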
But wait, there's a twist as well! While we had to decide how to define videostarts for our own business, we also had to consider how videostarts were defined by our partners, who were collecting data on the performance of our content in their own video players, external to us. In determining how to define videostarts for ourselves, we also had to understand how our partners defined the measure, so that we could aggregate all our videostart data consistently across all sources in our own internal reporting system.
Where to start? By talking to our internal stakeholders, we got an understanding of the needs behind this definition, and we found we needed to build a few iterations of videostarts. One version counted the start of the video player before the pre-roll ads, another counted the start of the content itself after the pre-roll completed, and there were a few additional versions as well. We then compared those variations against our partner data to see which looked most like the numbers from the external sources. By identifying the right version of our own videostart data, we were able to combine it with our partners' data into an aggregate total across our players and theirs.
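The aggregation step can be sketched as follows. All of the counts, source names, and the choice of matching variant here are made-up placeholders, not our real figures; the point is only the shape of the reconciliation: pick the internal variant whose definition lines up with the partners' definition, then sum across sources.

```python
# Hypothetical counts for each internal videostart variant.
internal_variants = {
    "player_start": 120_000,   # player starts, before pre-roll
    "content_start": 95_000,   # content starts, after pre-roll completes
}

# Hypothetical partner-reported videostarts from external players.
partner_counts = {"partner_a": 40_000, "partner_b": 25_000}

# Suppose the comparison showed partners count content starts,
# not player starts - so that's the variant we aggregate with.
chosen_variant = "content_start"

aggregate_videostarts = internal_variants[chosen_variant] + sum(partner_counts.values())
```

Mixing variants (say, our player starts with partners' content starts) would quietly inflate the total, which is exactly the kind of inconsistency the shared definitions were meant to prevent.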
So what's in your definitions? Work closely with your developers to understand how each measure is defined. In my example above, Research created a set of definitions that was shared across our network, so that anyone handling the data would be clear on what it meant and could interpret it properly. It may take a bit of detective work if this isn't already established in your organization, but a little effort up front to document what goes into your analytics definitions can save a great deal of confusion later.