The Dilemma with Data


Quit celebrating that 3% CTR (click-through rate). There are still 97% of people who don’t care. In my last blog, I talked about knowing who your audiences aren’t. I mentioned the joy of knowing not everyone is going to like your brand. That blog and this 3% vs. 97% are not the same. Sure, some of that 97% will never want to join your tribe, but I’d venture to guess plenty would. And that gets to the heart of this article — the dilemma with data.

I love data, but I don’t worship data. Big difference. 

Time and time again I see clients and colleagues hold on to the belief that they can crack the marketing/communication code with the right amount of data filtered through the perfect algorithm. What winds up happening more often than not are cases of analysis paralysis or simple inaction while they “wait to get more research.” In the end, there are months of radio silence or content reruns put into the market that don’t provide any new data, just more of it.

Deep down, I think all marketers have a disproportionate fondness for data for one simple reason: it’s comforting. Let’s face it; human buying behavior is downright messy. It’s often emotional or based on an incalculable number of individual experiences and associations. Even a little bit of data and research insight gives us the thinnest piece of armor to withstand the blowback if our plans fail. That tin-foil-thick armor brings relief.

The reality is that data can tell us what someone did, but data alone will never conclusively tell us why someone did something. 


Here are three things we consider when conducting research or digging through data. 

Compete Against Yourself: I get these types of questions a lot: “What is a good email open rate? How many likes should a post get? What’s more important: click-through rates or conversion rates?” My usual response goes something like, “Are the numbers going up?”

While there are some ways to dig into industry averages, there usually isn’t enough context to really give a good answer. For example, if your business has been operating and augmenting its email nurturing program for five years, your performance should be higher than someone just starting out. 

Unless you’ve got a dedicated team and are cranking out droves of content weekly, don’t home in on every single data point. Choose the metrics you think are the most important and work to bolster those numbers. Compare them to your sales figures. If there tends to be a positive correlation between those metrics climbing and revenue climbing, you probably chose the correct metrics. If not, reevaluate and try something new.

Intentionally Tinker: Back to a point I made earlier, the problem with analysis paralysis is that it prevents companies from producing and testing content. Come up with a plan, develop a calendar and then make stuff. Measure the results and adjust. 

Here’s an added tip: figure out what you want to test before making any content. These changes don’t need to be major either. Test two different headlines. Try two different image types. Publish at two different times of day. (Obviously don’t test headlines, images and deliveries at the same time or you won’t have clear results.) Based on those findings you can repeat what you thought worked and tinker with something else.
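The one-variable-at-a-time tip above can be sketched in a few lines. This is a hypothetical two-headline test with made-up send and click counts; the quick z-test guards against declaring a "winner" on a difference that's really just noise:

```python
import math

# Hypothetical A/B results for two headlines -- invented numbers for illustration.
a = {"name": "Headline A", "sends": 5000, "clicks": 160}
b = {"name": "Headline B", "sends": 5000, "clicks": 215}

for v in (a, b):
    v["ctr"] = v["clicks"] / v["sends"]
    print(f"{v['name']}: {v['ctr']:.2%} CTR")

# Quick two-proportion z-test: is the difference big enough to trust?
p_pool = (a["clicks"] + b["clicks"]) / (a["sends"] + b["sends"])
se = math.sqrt(p_pool * (1 - p_pool) * (1 / a["sends"] + 1 / b["sends"]))
z = (b["ctr"] - a["ctr"]) / se

winner = b if z > 1.96 else None  # 1.96 ~ 95% confidence, two-sided
print("Repeat what worked:", winner["name"] if winner else "no clear winner yet")
```

Because only the headline changed between the two versions, a clear winner actually tells you something. Test headlines and images in the same send and you won't know which one moved the needle.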

It sounds like the basic scientific method because it is. For creatives or those who believe marketing is purely an art form, it can take a while to add this bit of logic into your skillset. 

Take Reviews to Heart: I’m triggered. There has been more than one occasion where a client has said something to the effect of, “People only write reviews when they’re having a fantastic day or are just pissed off at the world.”

Okay, so ignore the one-star reviews. (And go back to read the five stars when you need a boost.) Consider the two-, three- and four-star comments. People took time out of their day to not only engage with your brand, but to give insight into what’s missing.

Look for patterns. I did some consulting for a restaurant where the most common complaint was slow service. The place was enormous but only had three server stations, two soda stations and the kitchen on the other side of most of the seating. Instead of listening, the owner revamped the menu and built a killer patio (equally far from the kitchen). Nothing really changed, and the reviews still say the same thing.
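Pattern-spotting like this scales past reading reviews one at a time. Here's a minimal sketch that tallies complaint keywords across review text; the review snippets and keyword list are invented for illustration:

```python
import re
from collections import Counter

# Hypothetical review snippets -- invented to illustrate pattern-spotting.
reviews = [
    "Great food but the service was painfully slow.",
    "Slow service again. Waited 20 minutes for drinks.",
    "Love the patio, though service is slow on weekends.",
    "Menu is fine but everything came out slow.",
]

# Tally complaint keywords instead of reacting to individual reviews.
keywords = ["slow", "service", "menu", "patio", "wait"]
counts = Counter()
for review in reviews:
    words = re.findall(r"[a-z]+", review.lower())
    for kw in keywords:
        # startswith catches variants like "waited" for "wait"
        counts[kw] += sum(1 for w in words if w.startswith(kw))

top_complaint, top_count = counts.most_common(1)[0]
print(f"Most common theme: '{top_complaint}' ({top_count} mentions)")
```

Even a crude tally like this would have surfaced "slow" as the theme for that restaurant long before a new patio made the walk from the kitchen any shorter.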

Sure, customers write reviews with emotions attached. But that’s good!

A two-star review means they’re severely disappointed but willing to give you a second chance under the right circumstances. (They told you what’s wrong and you have an opportunity to make it right. Win!)

A three-star review means you’re average and this reviewer is giving you their opinion on what is keeping you mediocre. 

A four-star review means you are a tweak or two away from becoming a favorite. Loyalty is within reach, and loyalty is the most inexpensive way to generate revenue.


Live with the Duality of Data

Data — whether online analytics or anecdotal reviews — provides patterns to learn from, not pillars to stand on. Analytics data may be pure, but once it gets into a person’s hands, it’s susceptible to bias. Control that bias by being up front about which metrics you think matter most.

Humans are emotionally driven pack animals, not robots. Use data to unearth insights and find ways to better connect with the pack animals we are.

Jason Jacobson