Opinions expressed by Entrepreneur contributors are their own.
Have you ever been innocently browsing the web, only to find that the ads shown to you line up a little too perfectly with the conversation you just finished before you picked up your phone? Maybe you’ve noticed that a title you’ve seen a dozen times in your recommendations on Netflix looks different all of a sudden, and the thumbnail entices you to give the trailer a watch when maybe it didn’t before.
That’s because Netflix, and most other companies today, use massive amounts of real-time data — like the shows and movies you click on — to decide what to display on your screen. This level of “personalization” is supposed to make life more convenient for us, but in a world where monetization comes first, these tactics are standing in the way of our free choice.
Now more than ever, it’s imperative that we ask questions about how our data is used to curate the content we’re shown and, ultimately, form our opinions. But how do you get around the so-called personalized, monetized, big-data-driven results everywhere you look? It starts with a better understanding of what’s going on behind the scenes.
How companies use our data to curate content
It’s widely known that companies use data about what we search, do and buy online to “curate” the content they think we’ll be most likely to click on. The problem is that this curation method is based entirely on the goal of monetization, which in turn silently limits your freedom of choice and the ability to seek out new information.
Take, for example, how ad networks decide what to show you. Advertisers pay per impression, but they pay even more when a user actually clicks, which is why ad networks favor content you're most likely to interact with. Using big data built around your browsing habits, most of the ads shown to you will feature brands and products you've viewed in the past. This reinforces your existing preferences without necessarily allowing you to explore new options.
Based on how you interact with the ads shown to you, they’ll be optimized for sales even further by presenting you with more of what you click on and less of what you don’t. All the while, you’re living in an advertising bubble that can impact product recommendations, local listings for restaurants, services and even the articles shown in your newsfeed.
In other words, by simply showing you more of the same, companies are maximizing their profits while actively standing in the way of your ability to uncover new information — and that’s a very harmful thing.
What we’re shown online shapes our opinions
Social media platforms offer one of the most powerful examples of how big data can prove harmful when it isn't properly monitored and controlled.
On these platforms, it becomes apparent that curated content all but forces us into silos. With products and services, that might be merely inconvenient, but with news and political topics, many consumers end up in a dangerous feedback loop without even realizing it.
Once a social media platform has pegged you to a specific demographic, you'll begin to see more content that supports the opinions you've seen before and aligns with the views you appear to hold. As a result, you can end up surrounded by information that seemingly confirms your beliefs and perpetuates stereotypes, even if it isn't the whole truth.
It’s becoming harder and harder to find information that hasn’t been “handpicked” in some way to match what the algorithms think you want to see. That’s precisely why leaders are beginning to recognize the dangers of the big data monopoly.
How do we safely monitor and control this monopoly of data?
Data sharing is not inherently bad, but it is crucial that we begin to think more carefully about how our data is used to shape the opinions and information we find online. Beyond that, we also need to make an effort to escape our information bubbles and purposefully seek out alternative points of view.
If you go back generations, people read newspapers and magazines and even picked up an encyclopedia every once in a while. They also tuned in to the local news and listened to the radio. At the end of the day, they had heard different points of view from different people, each with their own sources. And to some degree, there was more respect for those alternate points of view.
Today, we simply don’t check as many sources before we form opinions. Despite questionable curation practices, some of the burden still falls on us as individuals to stay inquisitive. That goes for news, political topics and any search where your data is monetized to control the results you see, whether for products, establishments, services or even charities.
It’s time to take back ownership of our preferences
You probably don’t have a shelf of encyclopedias lying around that can present mostly neutral, factual information on any given topic. However, you do have the opportunity to spend some time seeking out contrasting opinions and alternative recommendations so that you can begin to break free from the content curation bubble.
It’s not a matter of being against data sharing, but of recognizing that data sharing has its downsides. If you’ve come to rely solely on the recommendations and opinions that the algorithms generate for you, it’s time to start asking more questions and spend more time reflecting on why you’re seeing the brands, ads and content that come across your feed. It might just be time to branch out to something new.