
What the Internet Is Hiding from You

When you follow your friends on Facebook or run a search on Google, what information comes up, and what gets left out? That’s the subject of a new book by Eli Pariser called The Filter Bubble: What the Internet Is Hiding from You. According to Pariser, the internet is increasingly becoming an echo chamber in which websites tailor information according to the preferences they detect in each viewer. Yahoo! News tracks which articles we read. Zappos registers the type of shoes we prefer. And Netflix stores data on each movie we select.
The top 50 websites collect an average of 64 bits of personal information each time we visit and then custom-design their sites to conform to our perceived preferences. While these websites profit from tailoring their advertisements to specific visitors, users pay a big price for living in an information bubble outside of their control. Instead of gaining wide exposure to diverse information, we’re subjected to narrow online filters.
Eli Pariser talking:
I didn’t know that that was, you know, how it was working, until I stumbled across a little blog post on Google’s blog that said “personalized search for everyone.” And as it turns out, for the last several years, there has been no standard Google. There’s no sort of “this is the link that is the best link.” It’s the best link for you. And the definition of what the best link for you is, is the thing that you’re the most likely to click. So, it’s not necessarily what you need to know; it’s what you want to know, what you’re most likely to click.
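To make that concrete, here is a minimal, hypothetical sketch of what “the best link for you” means: results get reordered by a per-user predicted click probability instead of a single global relevance score. Every name, type, and number below is invented for illustration; Google’s actual ranking model is not public.

```python
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    global_relevance: float  # one score for everyone: the old "standard Google" idea

def predicted_click_probability(click_history: dict[str, int], result: Result) -> float:
    # Toy stand-in for a learned model: links you clicked before get boosted,
    # so "best" becomes "most likely to be clicked by you," not "most credible."
    past_clicks = click_history.get(result.url, 0)
    return result.global_relevance * (1 + past_clicks)

def personalize(click_history: dict[str, int], results: list[Result]) -> list[Result]:
    # Two users with different histories see the same results in different orders.
    return sorted(results,
                  key=lambda r: predicted_click_probability(click_history, r),
                  reverse=True)
```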
if you look at how they talked about the original Google algorithm, they actually talked about it in these explicitly democratic terms, that the web was kind of voting—each page was voting on each other page in how credible it was. And this is really a departure from that. This is moving more toward, you know, something where each person can get very different results based on what they click on.
And when I did this recently with Egypt—I had two friends google “Egypt”—one person gets search results that are full of information about the protests there, about what’s going on politically; the other person, literally nothing about the protests, only sort of travel to see the Pyramids websites.
the way that people use Google, most people use just those top three links. So, if Google isn’t showing you sort of the information that you need to know pretty quickly, you can really miss it. And this isn’t just happening at Google; as I found when I started looking into this, it’s happening all across the web. You know, it’s happening on most major websites, and increasingly on news websites. So, Yahoo! News does the exact same thing, tailoring what you see on Yahoo! News to which articles it thinks you might be interested in. And, you know, what’s concerning about this is that it’s really happening invisibly. You know, we don’t see this at work. You can’t tell how different the internet that you see is from the internet that anyone else sees, but it’s getting increasingly different.
they say, “We’re just giving people what they want.” And I say, “Well, what do you mean by ‘what they want’?” Because I think, actually, all of us want a lot of different things. And there’s a short-term sort of compulsive self that clicks on the celebrity gossip and the more trivial articles, and there’s a longer-term self that wants to be informed about the world and be a good citizen. And those things are in tension all the time. You know, we have those two forces inside us. And the best media helps us sort of—helps the long-term self get an edge a little bit. It gives us some sort of information vegetables and some information dessert, and you get a balanced information diet. This is like you’re just surrounded by empty calories, by information junk food.
this was actually the starting point for looking into this phenomenon. And basically, after 2008 and after I had transitioned out of being the executive director of MoveOn, I went on this little campaign to meet and befriend people who thought differently from me. I really wanted to hear what conservatives were thinking about, what they were talking about, you know, and learn a few things. And so, I had added these people as Facebook friends. And I logged on one morning and noticed that they weren’t there. They had disappeared. And it was very mysterious. You know, where did they go? And as it turned out, Facebook was tracking my behavior on the site. It was looking at every click. It was looking at every, you know, Facebook “like.” And it was saying, “Well, Eli, you say that you’re interested in these people, but actually, we can tell you’re clicking more on the progressive links than on the conservative links, so we’re going to edit it out, edit these folks out.” And they disappeared. And this gets to some of the danger of this stuff. Facebook edited out my conservative friends.
what’s at play here is this thing called confirmation bias, which is basically our tendency to feel good about information that confirms what we already believe. And, you know, you can actually see this in the brain. People get a little dopamine hit when they’re told that they’re right, essentially. And so, you know, if you were able to construct an algorithm that could show people whatever you wanted, and if the only purpose was actually to get people to click more and to view more pages, why would you ever show them something that makes them feel uncomfortable, makes them feel like they may not be right, makes them feel like there’s more to the world than our own little narrow ideas?
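A tiny simulation shows why a click-maximizing feed drifts toward comfort. The two content categories and their click rates below are pure assumptions, chosen only to illustrate the feedback loop Pariser describes:

```python
import random

def simulate_feed(rounds: int = 1000, seed: int = 0) -> dict[str, float]:
    random.seed(seed)
    # The feed starts with no preference between the two kinds of content.
    weights = {"agrees_with_user": 1.0, "challenges_user": 1.0}
    # Assumed confirmation bias: agreeable items get clicked more often.
    click_rate = {"agrees_with_user": 0.6, "challenges_user": 0.2}
    for _ in range(rounds):
        total = sum(weights.values())
        shown = random.choices(list(weights),
                               [w / total for w in weights.values()])[0]
        # Reward whatever gets clicked; discomfort is never rewarded.
        if random.random() < click_rate[shown]:
            weights[shown] += 1.0
    return weights

print(simulate_feed())  # "agrees_with_user" ends up dominating what is shown
```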
democracy really requires this idea of discourse, of people hearing different ideas and responding to them and thinking about them. And, you know, I come back to this famous Daniel Patrick Moynihan quote where he says, you know, “Everybody is entitled to their own opinions, but not their own facts.” It’s increasingly possible to live in an online world in which you do have your own facts. And you google “climate change,” and you get the climate change links for you, and you don’t actually get exposed necessarily—you don’t even know what the alternate arguments are.
when you’re just basically trying to get people to click things more and view more pages, there are a lot of things that just aren’t going to meet that threshold. So, you know, take news about the war in Afghanistan. When you talk to people who run news websites, they’ll tell you stories about the war in Afghanistan don’t perform very well. They don’t get a lot of clicks. People don’t flock to them. And yet, this is arguably one of the most important issues facing the country. We owe it to the people who are there, at the very least, to understand what’s going on. But it will never make it through these filters. And especially on Facebook this is a problem, because the way that information is transmitted on Facebook is with the “like” button. And the “like” button, it has a very particular valence. It’s easy to click “like” on, you know, “I just ran a marathon” or “I baked a really awesome cake.” It’s very hard to click “like” on, you know, “war in Afghanistan enters its sixth year”—or “10th year,” sorry. You know, so information that is likable gets transmitted; information that’s not likable falls out.
if you’re logged in to Google, then Google obviously has access to all of your email, all of your documents that you’ve uploaded, a lot of information. But even if you’re logged out, an engineer told me that there are 57 signals that Google tracks—”signals” is sort of their word for variables that they look at—everything from your computer’s IP address—that’s basically its address on the internet—what kind of laptop you’re using or computer you’re using, what kind of software you’re using, even things like the font size or how long you’re hovering over a particular link. And they use that to develop a profile of you, a sense of what kind of person is this. And then they use that to tailor the information that they show you.
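As a rough illustration of how signals like these could be folded into a profile, here is a hedged sketch. The real 57 signals are not public, and every field name and inference below is made up:

```python
from dataclasses import dataclass, field

@dataclass
class VisitorSignals:
    ip_address: str                     # the machine's address on the internet
    user_agent: str                     # what kind of computer and software
    font_size_px: int                   # even the font size is a signal
    hover_ms: dict[str, int] = field(default_factory=dict)  # link -> hover time

def build_profile(signals: VisitorSignals) -> dict[str, str]:
    # Coarse guesses about "what kind of person this is," even when logged out.
    profile = {"platform": "mac" if "Macintosh" in signals.user_agent else "other"}
    if signals.hover_ms:
        # The link you hovered over longest hints at what interests you.
        profile["likely_interest"] = max(signals.hover_ms, key=signals.hover_ms.get)
    return profile
```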
And this is happening in a whole bunch of places, you know, not just sort of the main Google search, but also on Google News. And the plan for Google News is that once they sort of perfect this personalization algorithm, that they’re going to offer it to other news websites, so that all of that data can be brought to bear for any given news website, that it can tailor itself to you. You know, there are really important things that are going to fall out if those algorithms aren’t really good.
And what this raises is a sort of larger problem with how we tend to think about the internet, which is that we tend to think about the internet as this sort of medium where anybody can connect to anyone, it’s this very democratic medium, it’s a free-for-all, and it’s so much better than that old society with the gatekeepers that were controlling the flows of information. Really, that’s not how it’s panning out. And what we’re seeing is that a couple big companies are really—you know, most of the information is flowing through a couple big companies that are acting as the new gatekeepers. These algorithms do the same thing that the human editors do. They just do it much less visibly and with much less accountability.
there aren’t perfect opt-out options, because even if you take a new laptop out of the box, already it says something about you, that you bought a Mac and not a PC. I mean, it’s very hard to get entirely out of this. There’s no way to turn it off entirely at Google. But certainly, you can open a private browsing window. That helps.
I think, in the long run, you know, there’s sort of two things that need to happen here. One is, we need, ourselves, to understand better what’s happening, because it’s very dangerous when you have these kinds of filters operating and you don’t know what they’re ruling out that you’re not even seeing. That’s sort of a—that’s where people make bad decisions, is, you know, what Donald Rumsfeld called the “unknown unknowns,” right? And this creates a lot of unknown unknowns. You don’t know how your experience of the world is being edited.
But it’s also a matter of pushing these companies to sort of—you know, these companies say that they want to be good. “Don’t be evil” is Google’s motto. They want to change the world. I think we have to push them to sort of live up to their best values as companies and incorporate into these algorithms more than just this very narrow idea of what is important.
I had a brief conversation with Larry Page, in which he said, “Well, I don’t think this is a very interesting problem.” And that was about that. But, you know, further down in Google, there are a bunch of people who are wrestling with this. I think the challenge is—I talked to one Facebook engineer who sort of summed it up quite well, and he said, “Look, what we love doing is sitting around and coming up with new clever ways of getting people to spend more minutes on Facebook, and we’re very good at that. And this is a much more complicated thing that you’re asking us to do, where you’re asking us to think about sort of our social responsibility and our civic responsibility, what kind of information is important. This is a much more complicated problem. We just want to do the easy stuff.” And, you know, I think that’s what’s sort of led us to this current place. I think there are also people who see the flipside of that and say this is one of the big, juicy problems in front of us, is how do we actually take the best of sort of 20th century editorial values and import them into these new systems that are deciding what people see and what people don’t see.
just this neutral term of “personalization” sounds attractive. But what it means is that everything you see has been geared and tailored for you.
in some ways, this is the driving struggle on the internet right now between all of these different companies, to accumulate the biggest amounts of data on each of us. And Facebook has its strategy, which is basically ask people to tell Facebook about themselves. Google has its strategy, which is to watch your clicks. Microsoft and Yahoo! have their strategies. And all of this feeds into a database, which can then be used to do three things. It can target ads better, so you get better targeted ads, which honestly, I think, you know, sometimes is fine, if you know that it’s happening. It can target content, which I think is much more problematic. You start to get content that just reflects what it thinks you want to see. And then the third thing is, it can make decisions about you.
So, one of the sort of more surprising findings in the book was that banks are beginning to look at people’s Facebook friends and their credit ratings in order to decide to whom to offer credit. And this is based on this fact that, you know, if you look at the credit ratings of people, you can make predictions about the credit ratings of their friends. It’s very creepy, though, because really what you’re saying then is that it would be better not to be Facebook friends with people who have lower credit ratings. It’s not really the kind of society that we want to be building, particularly.
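The underlying inference is simple enough to fit in a few lines. The sketch below is purely illustrative and is not any lender’s actual method:

```python
def predicted_credit_score(friend_scores: list[int]) -> float:
    # Simplest possible friend-based predictor: the mean of friends' scores.
    # The creepy incentive follows directly: dropping low-scoring friends
    # raises your predicted score.
    if not friend_scores:
        raise ValueError("no friends to infer from")
    return sum(friend_scores) / len(friend_scores)

print(predicted_credit_score([720, 680, 750]))  # 716.66...
```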
once all of this information, personal information, is gathered, it saves the government a lot of work in surveilling its population, because basically the private companies can gather the information, and all the government has to do is issue the subpoena or make the call that “for national security, we need this information.” So, in essence, it doesn’t have to do the actual surveillance. It just has to be able to use it when it needs to.
There’s a funny Onion article that has the headline “CIA Rolls Out Very Successful New Facebook Program,” implying that the CIA started Facebook to gather data. And it’s funny, but there is sort of some truth there, which is that these companies do have these massive databases, and the protections that we have for our data that lives on these servers are, you know, far weaker than if it’s on your home computer. The FBI needs to do much less paperwork in order to ask Google for your data than it does to, you know, come into your home and look at your computer. And so this is sort of the downside of cloud computing: it allows more and more of our data and everything that we do to be available to the government and, you know, for their purposes.
it’s a natural byproduct of consolidating so much of what we do online in a few big companies that really don’t have a whole lot of accountability, you know, that aren’t being pushed very hard by governments to do this right or do it responsibly. It will naturally lead to abuses.
Google Wallet. It’s just another—I mean, the way that Google thinks is, how can we design products that people will use that allow us to accumulate even more data about them? So, obviously, once you start to have a sense of everything that people are buying flowing through Google’s servers, then you have way more data on which to target ads and target content and do this kind of personalization. You know exactly how to slice and dice people. And again, you know, in some contexts, that’s fine, actually. I don’t mind when I go on Amazon, and it recommends books. They’re obviously not very good recommendations sometimes, but it’s fine. But when it’s happening invisibly and when it’s shaping not just what you buy but what you know about the world, that, I think, is more of a problem. And if this is going to be sort of the way that the future of the internet looks, then we need to make sure that it’s much more transparent when this is happening, so that we know when things are being targeted to us. And we have to make sure that we have some control as consumers over this, that it’s not just in the hands of these big companies that have very different interests.
there’s sort of this dance here, because basically MoveOn takes on the issues that its members want to take up. So I’ve been very—you know, I don’t want to sort of impose by fiat that I wrote a book, and here’s—now we’re going to campaign about this. But, you know, there are campaigns that we’re starting to look at. One of them, I think, that’s very simple but actually would go a significant way is just to, you know, have a basic—have a way of signaling on Facebook that something is important, even if it’s not likable. Obviously this is sort of just one small piece, but actually, if you did have an “important” button, you would start having a lot of different information propagating across Facebook. You’d have people exposed to things that maybe aren’t as smile-inducing, but we really need to know. And Facebook is actually considering adding some new verbs. So, this could be a winnable thing. It’s not—it won’t solve the whole problem, but it would start to indicate—it would start to remind these companies that there are ways that they can start to build in, you know, some more kind of civic values into what they’re doing.
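Mechanically, the change Pariser proposes is small. In the invented scoring below, a like-only feed buries the unlikable story, while adding a hypothetical “important” signal (with an arbitrary weight) lets it surface:

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    likes: int
    importants: int = 0  # the hypothetical new verb

def propagation_score(story: Story, important_weight: float) -> float:
    # important_weight = 0.0 reproduces today's like-only feed.
    return story.likes + important_weight * story.importants

stories = [
    Story("I baked a really awesome cake", likes=120),
    Story("War in Afghanistan enters its 10th year", likes=8, importants=90),
]

for w in (0.0, 2.0):  # like-only feed vs. one that also counts "important"
    top = max(stories, key=lambda s: propagation_score(s, w))
    print(f"important_weight={w}: top story is {top.title!r}")
```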
Discussion with Eli Pariser.
Eli Pariser, author of the new book, ‘The Filter Bubble: What the Internet Is Hiding from You’. He is also the board president and former executive director of MoveOn.org, which at five million members is one of the largest citizens’ organizations in American politics.
– from democracynow.org
