Filter Bubble? Not Really. Try Novelty Bubble


(This did not happen.) Image Credit: Didit Putra

Imagine a world where you were only told what you wanted to hear.

A world where your information diet was tailored by faceless entities, hiding information from you that might make you uncomfortable, make you question yourself, or force you to consider an alternative point of view.

Welcome to the filter bubble: a world where invisible algorithms personalize your search results, and your Facebook News Feed is automatically filtered to your tastes. A world that cuts you off from the outside, and forces you down a path of narrow-minded stubbornness.

A world that just might be a figment of Eli Pariser's imagination.

Don't misunderstand me. Our search results, our social feeds, and even our ads are personalized. And we definitely, definitely, live in a world where our personal biases color our perception. But those two things aren't as closely connected with each other as you might imagine.

We've always filtered our information, and allowed ourselves to succumb to the faults of confirmation bias. (See my post at Hubspot.) It's built into our biology, and it's a difficult problem to solve, but it has nothing to do with computer algorithms.

In fact, the filter bubble, as currently imagined, may not be the biggest threat to the way information is dispersed today. I believe we are facing a very different problem, in some ways the opposite problem, and one that I am intimately familiar with as a marketer: somebody whose livelihood depends on dispersing information.

I've decided to give this problem a name: The Novelty Bubble.

And today, I'm going to talk to you about what that problem is.

How The "Filter Bubble" Works In The Real World

I don't want to spend too much time dissecting the filter bubble, or the issues I have with this view on things. AJ Kohn has already done a good job with that, arguing that it should probably be called "the preference bubble" instead.

I do, however, want to touch briefly on what science has to say on the topic, and how it really influences people, as opposed to what our worst nightmares might have us believe.

A study conducted by researchers from The Wharton School at the University of Pennsylvania is one of very few to investigate the impact of a recommendation engine on user behavior.

In the study, iTunes users downloaded a recommendation engine that suggested music based on shared preferences with similar users. While the study wasn't technically an "experiment," the researchers were able to separate users into groups resembling a control group and a treatment group. They accomplished this by comparing users who downloaded the engine while the study was happening with users who downloaded it after the study was conducted.

While this isn't the perfect scientific design for such a study, the nature of the separation between these two groups is arbitrary enough to make it worth paying attention to.

The results completely deconstruct the idea of the filter bubble:

  • In aggregate, after downloading the app, users' music purchases became more similar to one another, not more fragmented.
  • The researchers weren't content with these aggregate results, since growing similarity within clusters of like-minded users could mask those clusters drifting farther away from one another. Looking at clusters of similar users, though, the story is the same: after downloading the app, the clusters became more dispersed rather than hardening into isolated camps. Users were using the engine to expand their tastes, not limit them. (A toy sketch of how this kind of similarity and diversity can be measured follows this list.)
  • There were two reasons for increased similarity between users:
    • The app caused them to purchase more music in general, which naturally exposed them to a wider variety of music
    • The mix of products they purchased after downloading the app was more diverse in general, even setting aside the larger volume of music
  • Overall diversity of music consumption increased after downloading the app. It wasn't just that the recommendation engine was suggesting everybody listen to the same songs.
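
To make these findings concrete, here's a minimal sketch of how similarity and diversity of purchases might be measured. To be clear, this is not the paper's actual methodology; the data and function names are invented for illustration. It simply shows that average similarity between users and overall diversity of consumption can both rise at the same time, which is the pattern the study reports.

```python
from itertools import combinations

def jaccard(a, b):
    """Overlap between two users' purchase sets (1.0 = identical, 0.0 = disjoint)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def avg_pairwise_similarity(purchases_by_user):
    """Mean Jaccard similarity over all user pairs; higher = users look more alike."""
    pairs = list(combinations(purchases_by_user.values(), 2))
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

def overall_diversity(purchases_by_user):
    """Number of distinct items purchased across all users; higher = more varied consumption overall."""
    return len(set().union(*purchases_by_user.values()))

# Hypothetical purchase histories before and after adopting the recommender.
before = {"u1": {"a", "b"}, "u2": {"c", "d"}, "u3": {"e"}}
after  = {"u1": {"a", "b", "c", "f"}, "u2": {"a", "c", "d", "g"}, "u3": {"a", "e", "h"}}

print(avg_pairwise_similarity(before), avg_pairwise_similarity(after))  # similarity goes up...
print(overall_diversity(before), overall_diversity(after))              # ...and so does overall diversity
```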

Far from turning everybody into an extremist, real-world recommendation engines seem to bring us closer together while making our tastes more diverse as a whole.

Why? Why should these supposed filter bubbles actually cause us to expand our tastes? Why don't they turn us into closed-off creatures of habit? If we're only being shown what these algorithms think we will like, why should they make us look more like well-rounded generalists?

I would speculate that part of the answer is that our brains are actually a far worse filter bubble than any algorithm. Remember confirmation bias? Then there's the fact that we only know what we know. We can't seek out the things we don't know we'll like. Algorithms, on the other hand, can show us things we would never have looked for.

But there's another reason recommendation engines don't fragment us, or force us down a path of habit and closed-mindedness. Are you ready? It's important. It's the crux of our real problem, and it's not going away. Here it is.

PEOPLE. GET. BORED.

The 7 C's of Internet Popularity

If you're like me, and you spend a lot of time thinking about how to reach larger audiences on the internet, you've probably noticed a trend. I like to call these the 7 C's of internet popularity:

  • Conspiracy theories
  • Celebrities
  • Comedy
  • Controversy
  • Communities
  • Counterintuitive facts
  • Cats

What these 7 C's have in common is a collection of traits I'm collectively referring to as "novelty." To clarify, I'm not just talking about things that are new. More accurately, I'm talking about things that aren't boring. Specifically:

  • Anger
  • Awe
  • Usefulness
  • Interest
  • Anxiety
  • Overall emotional intensity
  • Surprise
  • Overall emotional positivity (versus negativity)

New information is much more likely to elicit these emotions than old information.

Why these specific factors? Because of a study conducted by Jonah Berger, who analyzed 3 months of New York Times articles to see which were most emailed. I've organized the factors above by the impact they had on how shareable the articles were. Here are the specific impacts from the paper:

Percentage change in fitted probability of making the most-emailed list, for a one-standard-deviation increase above the mean in an article characteristic:

  • Anxiety: +21%
  • Anger: +34%
  • Sadness: −16%
  • Awe: +30%
  • Positivity: +13%
  • Emotionality: +18%
  • Interest: +25%
  • Surprise: +14%
  • Practical value: +30%
  • Time at top of home page: +20%

Notice that sad articles don't tend to get shared, either.
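
If the phrase "percentage change in fitted probability for a one-standard-deviation increase" is unfamiliar, it comes from fitting a regression-style model that predicts the probability an article makes the most-emailed list from its standardized characteristics, then bumping one characteristic by one standard deviation and comparing the two fitted probabilities. Here's a minimal sketch of that calculation using a logistic model with made-up coefficients; these numbers are illustrative only, not the paper's actual estimates.

```python
import math

def fitted_probability(intercept, coefs, features):
    """Logistic model: probability of making the most-emailed list given standardized article features."""
    z = intercept + sum(coefs[name] * features[name] for name in coefs)
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients on z-scored features -- not Berger's actual estimates.
coefs = {"anger": 0.30, "awe": 0.27, "practical_value": 0.27, "sadness": -0.15}
average_article = {name: 0.0 for name in coefs}        # sits at the mean of every characteristic
angrier_article = dict(average_article, anger=1.0)     # same article, one standard deviation angrier

p_base = fitted_probability(-2.0, coefs, average_article)
p_bump = fitted_probability(-2.0, coefs, angrier_article)
print(f"{(p_bump - p_base) / p_base:+.0%}")             # roughly +30% with these toy numbers
```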

Awe is an especially important factor to draw attention to. In the study, awe is defined as:

...the emotion of self-transcendence, a feeling of admiration and elevation in the face of something greater than the self. It involves the opening or broadening of the mind and an experience of wow that makes you stop and think.

This is the polar opposite of what we would expect from the filter bubble. People actually love ideas that change the way they think about the world, and share them almost compulsively.

Now, why is this a problem? Do we really want to see more boring things in our social feeds? Do we want fewer awe-inspiring blog posts in the world? Not really. What doesn't have an emotional impact on us is usually irrelevant to us. We've always filtered out the boring stuff. Sacrificing emotionally intense content for a more accurate picture of the world has its benefits, but there will always be some filtering, so long as we're human beings.

So let me explain what I think the actual problem is, and why it's different from what we've seen before.

The Novelty Bubble: How Instincts and Algorithms Work Together to Show Us Interesting Things, Whether or Not They're True

I've mentioned in previous posts that I spend a lot of time second-guessing widely held beliefs. I do this because it helps me arrive at interesting insights, but it's also proven to be a useful skill as a marketer. Controversy and counterintuitive facts capture attention because they elicit emotional responses, and give us new things to think about.

This is often a good thing. Counterintuitive facts often give us a new perspective on the world, and can make us more open-minded in general. When it comes to controversial topics, I typically try to find a unique perspective, one that doesn't fall within the clearly defined dividing lines most controversies thrust us into, and this has a similar effect.

For these very reasons, content like this also tends to do well on social networks. It gets shared. It gets linked to. It grabs attention in search results.

But there's a problem. None of this stuff actually has to be true, or even well researched, to have the same effect.

If you're like many citizens of the internet, you've probably heard this little piece of controversy: you lose the rights to everything you post to Facebook. Once it's published on their site, you no longer own it. It becomes theirs, and you surrender your copyright.

Well, actually, that's not at all true.

This is an idea that has all the right kinds of properties to spread through social networks. It makes people angry. It's controversial. It relates to your everyday life, so it has practical implications.

It also has the unique characteristic of sounding like it's true. After all, you never read the EULA when you joined the site. Facebook is entangled in all kinds of privacy controversies. It doesn't sound like it would be illegal, and patent and copyright trolls abound on the internet. Why wouldn't Facebook be doing something like this?

Well, they aren't. You keep the copyrights to everything you post on Facebook. All the EULA says is that you give Facebook the right to republish your content and use it for other purposes on the site. You know: things like allowing others to share your content, or parsing your content to personalize ads. But if you take your content off Facebook, all those rights are revoked. You still own the copyright, and any copies that were made using Facebook's native systems are removed.

The problem is that human beings are notoriously bad at sorting accurate information from inaccurate information. According to research by Walter Quattrociocchi at Northeastern University in Boston, people spend the same amount of time sharing and commenting on posts from mainstream news sites as they do on posts from alternative news sources or, more importantly, troll sites and deliberate hoaxes.

In fact, research from Cornell University shows that people are far more likely to share news from an alternative media outlet than from a respected scientific publication.

With the rise of social media, we've traded in our gatekeepers for the voice of the people. The benefits of that are obvious: more diversity of thought, less control over the distribution of information by a small minority, more widely available marketing for smaller businesses, and so on. But the trade-off is a decrease in accuracy.

Add to that algorithms that attempt to predict what is going to get shared before it even gets shared, and you have a recipe for misinformation.
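
To make the mechanism concrete, here's a deliberately crude sketch of what a share-prediction ranker could look like. This is not any real platform's algorithm, and the feature weights are invented. The point is structural: everything in the score rewards novelty-style signals, and nothing in it asks whether the post is true.

```python
# Toy engagement ranker with hypothetical feature weights -- not any real platform's code.
NOVELTY_WEIGHTS = {"anger": 0.34, "awe": 0.30, "practical_value": 0.30,
                   "surprise": 0.14, "controversy": 0.25}

def predicted_share_score(post):
    """Score a post by how shareable it looks; accuracy is never an input."""
    return sum(NOVELTY_WEIGHTS[f] * post.get(f, 0.0) for f in NOVELTY_WEIGHTS)

posts = [
    {"title": "Careful correction of a viral rumor", "surprise": 0.2, "practical_value": 0.4},
    {"title": "Outrageous (and false) claim about a big company",
     "anger": 0.9, "surprise": 0.8, "controversy": 0.9},
]
for post in sorted(posts, key=predicted_share_score, reverse=True):
    print(post["title"])  # the false-but-outrageous post comes out on top
```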

The problem is of such concern that the World Economic Forum listed digital misinformation as one of the biggest global risks in 2013.

Misinformation is now armed to move faster than ever before. The argument that rebuttals can arise just as fast isn't necessarily comforting. The mere-exposure effect makes people more likely to agree with the first thing they were told than the second, even if they otherwise have no preference.

We're now facing a fundamental quirk of the human brain in a way we never have before. Human beings, in general, care far more about novelty than truth. We're more likely to pay attention to things that are interesting than things that are boring. It's the opposite of Occam's razor; the simplest explanation may usually be the right one, but it's not the one people are interested in talking about.

Safeguards we once had in place to protect us from inaccuracies are falling away.

It's an ugly problem, and not a simple one to solve. As content marketers and social media personalities, it's important that we start acting as journalists as well. We have a social responsibility not just to get our clients in front of eyeballs, but to avoid spreading misinformation, which sacrifices long-term credibility for short-term attention.
