You probably know that sites like Facebook use the information they have about you – your age, gender, and interests – to serve up ads that are most likely to appeal to you. That’s fairly harmless and perhaps even helpful. But what about the more subtle filtering going on that you may not be aware of?
Search engines use the information they have about you to show you news they think will most likely appeal to you, based on your previous search activity. The problem with that? You might find yourself living in a bubble – sheltered from ever hearing about things you might not agree with, but which might also open your mind a bit and make you what your parents always wanted you to be: “well-rounded.”
Psychological Theories Discussed in this Episode
- Social Norms: behaviors that society expects of everyone. Examples: being quiet in a library or bookstore, saying thank you when someone holds the door for you, holding the door for the next person, etc.
- Social Roles: behaviors expected of you when you occupy a predefined role. Example: “students” are expected to be respectful of teachers and to raise their hands when they have a question.
- Group Polarization: the tendency for a group, after a discussion, to hold a more extreme attitude than its individual members held before the discussion. This may happen because, during discussion, members of the group offer several reasons why the group should take a certain position, which convinces group members of the “rightness” of their attitudes.
Resources for this Episode
- Thank you to This Week in Technology for allowing me to use an excerpt from episode 339, Somewhere Between Murder And A Messy Room
- One of the co-hosts on this episode was Baratunde Thurston
- The other co-host was Brian Brushwood
- The TED Talk I excerpted is by Eli Pariser: Beware online “filter bubbles”
As web companies strive to tailor their services (including news and search results) to our personal tastes, there’s a dangerous unintended consequence: We get trapped in a “filter bubble” and don’t get exposed to information that could challenge or broaden our worldview. Eli Pariser argues powerfully that this will ultimately prove to be bad for us and bad for democracy.