The third person effect is a cognitive bias in which people believe that media messages influence other people more than they influence themselves. First described by communication researcher W. Phillips Davison in 1983, the idea captures something most of us have felt intuitively: when you see an advertisement, a piece of propaganda, or a misleading headline, you tend to assume you can see through it while “other people” probably can’t. Davison framed it as the prediction that individuals exposed to a persuasive communication “will expect the communication to have a greater effect on others than on themselves.” Decades of research have confirmed that this perception gap is real, consistent, and consequential.
The Two Parts of the Effect
The third person effect is actually a two-part phenomenon. The first part is perceptual: you genuinely believe others are more susceptible to media influence than you are. The second part is behavioral: you then act on that belief. You might support restricting certain content, share warnings about misinformation you think will fool others, or change your own behavior based on what you assume the media is doing to everyone else. Both parts have been confirmed across a wide range of studies since Davison’s original paper.
This distinction matters because the effect isn’t just about a private mental error. It shapes real-world decisions. The perception that “other people will fall for this” becomes a motivation to do something about it, whether that means calling for regulation, avoiding a product, or dismissing an entire media outlet.
Why Your Brain Does This
Several psychological mechanisms drive the third person effect, but the best-supported explanations center on ego protection and ego enhancement. People are motivated to feel good about themselves, and one way to do that is to see yourself as more discerning and critical than the average person. You think of yourself as too smart or too media-savvy to be manipulated, while assuming that “other people” lack those defenses.
Closely related is a tendency called optimistic bias (sometimes labeled unrealistic optimism). In social comparisons, people generally believe two things: that they are more likely than others to experience good outcomes, and that bad outcomes are more likely to happen to someone else. Applied to media, this means you tend to assume positive messages (like a public health campaign) will resonate with you, while negative or manipulative messages will only work on others.
There’s also an asymmetry in how you explain behavior. When you resist a persuasive message, you credit your own intelligence or values. When you imagine someone else encountering the same message, you’re more likely to assume they’ll go along with it, because you don’t extend that same charitable reasoning to strangers. You know your own mental defenses from the inside; you can only guess at everyone else’s.
The Social Distance Factor
One of the most consistent findings in this area is that the effect grows stronger as the “other person” feels more distant from you. If you’re asked whether a misleading political ad would influence your close friends, you might estimate a small effect. Ask about people in your city and the estimate goes up. Ask about “the general public” or people in another country and the gap between your perceived vulnerability and theirs widens further.
This social distance pattern makes intuitive sense. You know your friends, so you extend them some of the same credit you give yourself. Strangers are easier to imagine as gullible or unsophisticated. Research has also found that this distance effect interacts with how relevant a topic feels to different groups. If you believe an environmental news story matters more to people in a specific region, for example, you might adjust your estimates in more complex ways than simple “close versus far” would predict.
When the Effect Reverses
The third person effect is strongest for messages people consider undesirable or harmful: political propaganda, violent media, misleading advertising, pornography. For these kinds of content, the gap between “me” and “others” is large, with meta-analyses finding a substantial effect size of 0.83 for undesirable messages.
But something interesting happens with messages that are socially desirable. Think of a public service announcement encouraging people to spend more time with their children, or a campaign promoting charitable giving. For these messages, it becomes flattering to admit you’re influenced. Saying “that ad really moved me” makes you look compassionate, not gullible. Under these conditions, some studies have found a reversal called the first person effect, where people claim the message influenced them more than it influenced others. A meta-analysis found this reversal for desirable advertising messages (effect size of -0.47) and a smaller but still significant reversal for public service announcements (effect size of -0.16).
The pattern reveals something important about what’s really going on. People aren’t making careful, objective assessments of how media affects them. They’re managing their self-image. When admitting influence makes you look bad, you deny it. When admitting influence makes you look good, you claim it.
How It Shapes Support for Censorship
One of the most studied consequences of the third person effect is its link to censorship. When people believe that harmful media content will influence others (but not themselves), they become more willing to support restricting that content. This has been documented across domains including election campaign messages, violent media, pornography, and tobacco advertising.
A nationwide survey during the 1996 U.S. presidential campaign, for instance, confirmed both parts of the third person effect: respondents perceived greater media influence on other people than on themselves, and that perception predicted their support for restrictions on campaign messages they considered unfair. The logic, from the individual’s perspective, feels reasonable. If you believe you’re immune but others are vulnerable, restricting the content seems like it protects them without costing you anything.
This dynamic has implications well beyond formal censorship. It helps explain why people support content moderation policies on social media platforms, why parents favor restricting media their children’s peers consume, and why public pressure campaigns target advertisers. In each case, the underlying assumption is the same: I can handle this, but others can’t.
The Third Person Effect and Fake News
The third person effect has found renewed relevance in the age of social media misinformation. Studies conducted during the COVID-19 pandemic found that social media users consistently believed fake news about the virus would influence other users more than themselves. The same ego-enhancement and optimistic bias mechanisms that Davison identified decades ago apply directly to how people think about misinformation online.
This creates a paradox. Nearly everyone believes they can spot fake news while others cannot, which means nearly everyone overestimates their own resistance. If you assume you’re already good at detecting misinformation, you’re less likely to slow down and fact-check, less likely to consider that your own views might have been shaped by unreliable sources, and more likely to focus your concern on what “other people” are falling for. The third person effect, in other words, can make people more vulnerable to the very influence they believe they’re immune to.
On the other hand, the behavioral component can be productive. People who believe misinformation is harming others are more motivated to share corrections, support platform-level interventions, and advocate for media literacy education. The perception may be biased, but it can still produce useful action.
Why It Matters in Everyday Life
You don’t need to be studying communication theory for the third person effect to show up in your thinking. It appears whenever you roll your eyes at a commercial and assume it must work on “someone” or it wouldn’t be on the air. It appears when you dismiss a political ad as obvious manipulation while worrying that undecided voters will be swayed. It shows up when you read a sensational headline, recognize it as clickbait, and then worry about the people who won’t.
The core insight is simple but easy to forget: the confidence that you’re too savvy to be influenced is itself a predictable psychological pattern, not evidence that you’re actually immune. Advertising, political messaging, and misinformation work not because people are stupid, but because influence often operates below the level of conscious awareness. The person most convinced they can’t be fooled is, in some ways, the easiest to fool, because they’ve stopped paying attention to how messages might be shaping their thinking.