Philosophy for Kids

What Does It Really Mean to Believe Something?

You’re walking home from school when you spot your friend Maya ahead of you. She’s wearing a bright yellow backpack—the one she got for her birthday. You’re about to call out to her when you realize: you believe that’s Maya. But what does that mean, exactly? What is happening inside your head when you believe something?

That’s a surprisingly tricky question. Philosophers have been arguing about it for decades, and they still haven’t agreed on an answer. Here’s what the debate looks like.

The Basic Puzzle

Let’s start with something simple. You read that garden snails are hermaphrodites—they have both male and female organs. Now you believe that. But what changed inside you?

One natural way to think about it: you now have a new thing in your mind, a kind of mental file or picture that represents the fact about snails. When someone mentions snails, that file gets pulled up, and you might say “Did you know garden snails are hermaphrodites?” That belief also affects how you act. If someone says snails can’t have babies by themselves, you might correct them.

So far, so good. But philosophers disagree about what that “mental file” really is. Some think it’s like a sentence stored in your brain. Others think it’s more like a map. And some think the whole “file” picture is wrong—that believing something is really about how you’re disposed to act, not about what’s stored inside you.

The “Sentence in Your Head” View

Imagine a robot that works by manipulating sentences in its internal machine language. When you type numbers into a spreadsheet, the computer creates a sentence that says something like “value 4 in cell A1,” then follows rules to display “4” on the screen. A thinking robot might work the same way, storing sentences like “the chemical formula for water is H₂O” and using them when needed.

Philosopher Jerry Fodor thought human minds work like that. He argued we have an innate “language of thought”—not English or Mandarin, but a basic mental language that all humans share. When you believe that snails are hermaphrodites, there’s a sentence in that language sitting in your “belief box” (not a literal box, but whatever part of your mind stores beliefs). That sentence is ready to be used in reasoning and decision-making.

This view has real appeal. It explains why we can think so many different things. Fodor pointed out that thought is “productive”: you can potentially believe an unlimited number of things (that elephants hate bowling, that 245 + 382 = 627, that riverbottoms aren’t usually made of blue beads). If beliefs are stored as sentences built from parts that can be recombined, that makes perfect sense. It also explains why thought is “systematic”—if you can think “Mengzi criticized Gaozi,” you can also think “Gaozi criticized Mengzi.” You’re just rearranging the pieces.
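You can get a feel for this idea with a toy model. The sketch below is purely illustrative (it is not Fodor’s actual theory, and the example beliefs are invented): beliefs sit in a “belief box” as simple three-part sentences, and the same parts can be rearranged to represent new thoughts.

```python
# A toy "belief box": beliefs stored as sentences built from parts.
# Purely an illustration of the "sentence in your head" idea.

belief_box = set()

def believe(subject, relation, obj):
    """Store a belief as a simple three-part sentence."""
    belief_box.add((subject, relation, obj))

def believes(subject, relation, obj):
    """Check whether that sentence is sitting in the belief box."""
    return (subject, relation, obj) in belief_box

believe("Mengzi", "criticized", "Gaozi")

# Systematicity: the same parts rearrange into a new thought.
print(believes("Mengzi", "criticized", "Gaozi"))  # True
print(believes("Gaozi", "criticized", "Mengzi"))  # False -- representable, but not believed
```

Because every belief is built from reusable pieces, the box can in principle hold an unlimited number of different sentences, which is exactly the “productivity” Fodor pointed to.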

The “Map in Your Head” View

But other philosophers think the “sentence” picture gets something wrong. Consider what happens when you change one belief. If you believe that the mountain peak is 15 kilometers north of the river, and then you learn it’s actually 20 kilometers north, on the sentence view you just swap out that one sentence. To figure out the consequences—like how far the mountain now is from the oasis—you’d have to do a bunch of extra reasoning, step by step.

But that’s not how it feels. When you learn something new, lots of other things seem to shift automatically, without you having to think about them. This is more like how a map works. If you move a mountain farther north on a physical map, everything else adjusts immediately: the distance to the coast, the direction you’d need to hike, everything. No separate reasoning required.

So some philosophers suggest that our beliefs aren’t stored as sentences but as something more map-like. This view also explains productivity and systematicity—maps can represent unlimited arrangements of things, and if you can represent the river as north of the mountain, you can also represent the mountain as north of the river.
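One way to see the contrast is to store the landscape as positions rather than as a list of separate distance facts. In this rough sketch (the places and coordinates are invented for illustration), distances are read off the positions, so moving the mountain updates every distance “for free”:

```python
import math

# Map-like storage: each place is a position on the map.
places = {
    "river":    (0.0, 0.0),
    "mountain": (0.0, 15.0),   # 15 km north of the river
    "oasis":    (10.0, 0.0),
}

def distance(a, b):
    """Distances aren't stored one by one; they're read off the map."""
    (x1, y1), (x2, y2) = places[a], places[b]
    return math.hypot(x2 - x1, y2 - y1)

before = distance("mountain", "oasis")

# Learn something new: the mountain is actually 20 km north.
places["mountain"] = (0.0, 20.0)

after = distance("mountain", "oasis")
print(round(before, 1), round(after, 1))  # 18.0 22.4 -- every distance shifts automatically
```

A sentence-style store would instead keep a separate fact like “the mountain is 18 km from the oasis,” which would sit there, now wrong, until some extra reasoning updated it.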

Which view is right? It’s not settled. The sentence view might better explain why people sometimes hold inconsistent beliefs—you can have two contradictory sentences sitting in your belief box without noticing. But the map view captures how beliefs seem to hang together in a web, changing together without effort. Both have strengths. Both have weaknesses.

The “It’s About What You Do” View

Now here’s a completely different approach. Maybe what’s going on inside your head isn’t the important thing at all. What matters is how you act.

Imagine an alien named Rudolfo lands on Earth. He’s made of completely unknown stuff; he may not even have a brain as we understand it. But he looks and acts perfectly human. He becomes a football fan and a lawyer. He says “the 1040 is due April 15” and shows up to file his taxes. He argues about whether field goals are worth 3 points.

Wouldn’t we say Rudolfo believes these things? Even if what’s inside him is totally different from what’s inside us? Philosophers who emphasize behavior say yes. Believing something just is being disposed to act as though it’s true—to say so when asked, to be surprised if it turns out false, to rely on it when making plans.

This view is called dispositionalism. It has a problem, though. The same belief can lead to very different behavior depending on what else you believe or want. If you believe it’s raining, you’ll take an umbrella—but only if you also believe umbrellas work and you don’t want to get wet. If you like being soaked, you might dance in the rain instead. So you can’t just say “believing it’s raining means you’ll take an umbrella.” You’d have to add all sorts of conditions about other mental states, which makes the simple behavioral picture messy.

The “It’s About How We Interpret Each Other” View

A related approach, called interpretationism, focuses on how we make sense of other people’s behavior. When you see someone boarding up windows, stocking provisions, and making worried phone calls, you naturally say “She believes a hurricane might be coming.” That’s the best way to predict what she’ll do next.

Philosopher Daniel Dennett called this the “intentional stance.” You can predict a falling rock’s behavior using physics (the “physical stance”). You can predict a bird’s behavior by knowing its biological functions (the “design stance”). But for humans, the most useful stance is to treat them as rational beings with beliefs and desires. When that works—when it helps you predict behavior simply and accurately—then it’s real enough.

On this view, beliefs are like the equator. The equator isn’t a real line painted on the ground. But saying a country is on the equator tells you something true about its position and climate. Beliefs aren’t real in the way rocks are real, but attributing beliefs to people captures real patterns in their behavior. Whether there’s literally a “sentence” or “map” inside someone’s head is almost beside the point.

Do Beliefs Even Exist?

Some philosophers take the radical position that beliefs don’t exist at all. This is called eliminativism. The idea is that our everyday “folk psychology”—talking about beliefs, desires, hopes, fears—is like an old, mistaken scientific theory. Just as astronomers stopped believing that the sun literally travels around the Earth (even though we still casually say it “goes down”), maybe someday we’ll stop talking about beliefs once neuroscience gives us a better understanding of the brain.

Most philosophers find this too extreme. But it raises a genuine question: Are beliefs just useful fictions?

Degrees of Belief

Here’s something you’ve probably noticed: you don’t believe everything with the same strength. You might be nearly certain that 2+2=4, pretty confident that your best friend will be at school tomorrow, and only somewhat sure that your teacher actually likes dogs as much as she claims.

Philosophers sometimes measure this on a scale from 0 to 1, where 0 means “absolutely certain it’s false” and 1 means “absolutely certain it’s true.” A credence of .5 means you think it’s equally likely to be true or false. This is called your credence or degree of belief.

But is having a high credence the same as believing? Not obviously. Imagine you hold a ticket in a fair lottery with a million tickets. Your credence that you’ll lose is .999999—very high. But do you actually believe you’ll lose? Many people say no. You know it’s possible you’ll win, even though extremely unlikely. So you might have a higher credence in losing the lottery than in some things you do believe, like that your friend is at school. This suggests that believing isn’t just a matter of having high confidence.

What About Beliefs You Don’t Think About?

Most of your beliefs aren’t active right now. You believe that Paris is the capital of France, but you probably weren’t thinking about it until just now. Philosophers distinguish between occurrent beliefs (the ones you’re actively considering) and dispositional beliefs (the ones stored away, ready to use).

This seems straightforward enough. But consider: do you believe that the number of planets is less than 9? Less than 10? Less than 11? That’s already several beliefs. What about less than 1,000,000? Do you have millions of separate beliefs stored somewhere? That seems unlikely. So maybe you only explicitly believe that there are 8 planets, and the rest are implicit—you could figure them out quickly if asked, but they’re not stored anywhere.
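The point about implicit beliefs can be made concrete in code. In this illustrative sketch (not a theory of how minds actually work), only one fact is explicitly stored, and every “less than N?” question is answered by computing on demand rather than by looking up a stored sentence:

```python
# One explicitly stored belief...
explicit_beliefs = {"number of planets": 8}

def planets_less_than(n):
    """Answer derived on demand -- nothing is stored for each n."""
    return explicit_beliefs["number of planets"] < n

# ...yields millions of "implicit beliefs" without millions of sentences.
print(planets_less_than(9))        # True
print(planets_less_than(435_982))  # True
print(planets_less_than(5))        # False
```

Nothing in memory corresponds to “the number of planets is less than 435,982,” yet the answer comes out instantly, which is exactly why such beliefs seem implicit rather than stored.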

This is a real challenge for the “sentence in your head” view. If belief requires a stored mental sentence, then you’d need millions of sentences about the number of planets alone. But if you say implicit beliefs count too, then where do you draw the line? Is there a fact of the matter about whether you believe the number of planets is less than 435,982? Or is it just fuzzy?

What Makes a Belief True or False?

Here’s something philosophers generally agree on: beliefs have a special relationship with truth. When you believe something and it’s false, you’ve made a mistake. That’s not true of desires. If you desire something that doesn’t exist—like pizza-flavored ice cream—you haven’t made a mistake in the same way. Your desire just hasn’t been fulfilled.

This is called the “norm of truth” for belief. Beliefs are supposed to fit the world. Desires are the opposite: the world is supposed to fit them. This is why we say people “should” believe what’s true and revise their beliefs when they find evidence they’re wrong.

But this raises another puzzle. Can you choose what to believe? Can you decide to believe that you’ll ace the test, even if you know you haven’t studied? Most people say no—you can’t just decide to believe something, the way you can decide what to eat for lunch. Belief seems to be something that happens to you, based on evidence, not something you choose. But if that’s true, in what sense are you responsible for having correct beliefs? Philosophers still argue about this.

Where Things Stand

So what is it to believe something? Is it having a mental sentence? A mental map? A pattern of behavior? A useful fiction? Something that admits of degrees? Something you can have without ever thinking about?

The honest answer is that nobody really knows. Philosophers have proposed many different accounts, each with strengths and weaknesses. The debate continues because belief is central to so much of our lives—how we explain behavior, how we decide what’s true, how we understand other people and even ourselves. Whatever beliefs turn out to be, they’re at the heart of what it means to be a thinking creature.

And that’s worth thinking about.


Key Terms

  • Representationalism — The view that believing involves having something like a mental picture, sentence, or map stored in your mind
  • Dispositionalism — The view that believing is mainly about how you’re prone to act, not about what’s inside your head
  • Interpretationism — The view that having beliefs is whatever makes it useful for others to interpret your behavior as if you have beliefs
  • Eliminativism — The radical view that beliefs don’t really exist; they’re just part of a mistaken theory we use to talk about minds
  • Occurrent belief — A belief you’re actively thinking about right now
  • Dispositional belief — A belief you have stored away, not currently thinking about
  • Implicit belief — A belief you don’t have explicitly stored but could figure out quickly from what you do believe
  • Credence — A measure of how confident you are in something, often on a 0-to-1 scale

Key People

  • Jerry Fodor — A philosopher who argued that thinking happens in an innate “language of thought,” with beliefs stored as mental sentences in something like a “belief box”
  • Daniel Dennett — A philosopher who argued that having beliefs is about being interpretable from the “intentional stance”—treating something as a rational agent with beliefs and desires
  • Donald Davidson — A philosopher who tied belief closely to language, arguing that creatures without language (like dogs and babies) can’t really have beliefs

Things to Think About

  1. Can you think of a belief you have that you’ve never expressed or acted on? If so, what makes it a belief? Is it something stored in your mind, or something else?

  2. If someone says “I believe I’ll win the lottery” while also buying a ticket, do they really believe it? How would you test whether someone genuinely believes something?

  3. When you change your mind about something, does it feel like editing a sentence, adjusting a map, or something else entirely? Does the way it feels tell you anything about what belief really is?

  4. Could a chess computer have beliefs about the game? If you had to predict what move it would make, would you find it useful to say “it believes it can trap my queen”? If that prediction works, does that mean the computer really believes it?

Where This Shows Up

  • Everyday arguments — When you say “You don’t really believe that!” you’re making a philosophical claim about what counts as genuine belief
  • Law and courts — Courts have to decide whether someone “knowingly” or “intentionally” did something, which depends on what they believed
  • Artificial intelligence — Engineers debate whether AI systems like language models actually “believe” anything or just simulate believing
  • Psychology — Scientists study “implicit bias,” where people’s behavior suggests beliefs they consciously deny having—raising the question of which beliefs are “really” theirs
  • Politics and misinformation — People argue about whether others “truly believe” conspiracy theories or just pretend to, and whether spreading false information is the same as lying