Animism in psychology refers to a child’s belief that inanimate objects are alive and have feelings, intentions, or consciousness. The concept comes from Jean Piaget’s theory of cognitive development, where he identified it as a hallmark of how children between ages 2 and 7 think about the world. A child who scolds a chair for “tripping” them or insists the sun is following their car is displaying animistic thinking.
Where Animism Fits in Child Development
Piaget placed animism within what he called the preoperational stage of cognitive development, spanning roughly ages 2 to 7. During this stage, children can use symbols and language but haven’t yet developed the logical reasoning tools that come later. They tend to perceive things from their own perspective and struggle to separate their inner experience from the outside world. Piaget called this egocentrism, and animism is one of its direct consequences: because a child feels alive and has thoughts, they naturally assume everything else does too.
This leads to some very recognizable behaviors. A child playing with blocks who gets hurt may want to punish the blocks, believing the objects acted on purpose. A tree swaying in the wind must be alive because it’s moving. The sun and moon have feelings and follow you around. These aren’t random fantasies. They reflect a consistent logic: if I’m alive and I move and I have feelings, then other things that move probably do too.
As children enter what Piaget called the concrete operational stage, around age 7 to 11, their egocentrism fades. They begin understanding other people’s perspectives and developing more systematic ways of categorizing the world. Animistic thinking gradually gives way to a clearer distinction between living and nonliving things.
How Researchers Study Animistic Thinking
Piaget’s original method involved interviewing children directly, asking questions like “Is the sun alive?” and “Does the wind know it’s blowing?” Their answers revealed consistent patterns of attributing life and awareness to objects that moved, made noise, or seemed to act on their own.
Later researchers refined this approach. In one well-known study, preschool-aged children watched films of both animate and inanimate objects moving in different ways. Researchers recorded whether the children labeled each object as alive or not alive, what properties they attributed to it, and how they justified their choices. This kind of design helps tease apart whether children are confused about the word “alive” or genuinely perceive objects differently.
Piaget himself documented children as old as 12 still showing a tendency to deny that plants are alive while attributing life to nonliving objects that appear to move on their own, like clouds or rivers. This suggests the concept of “alive” is genuinely difficult for children to master, and movement is the feature they rely on most heavily as a shortcut.
Animism vs. Anthropomorphism
These two terms get mixed up constantly, even in academic research, but they describe different things. Animism is perceiving an object as alive. Anthropomorphism is perceiving an object as human. The distinction matters because being alive doesn’t imply humanness. Plants and animals are alive, but they aren’t human. When a child says the wind is alive, that’s animism. When someone says their car is “angry” at them or gives their Roomba a name and personality, that’s closer to anthropomorphism.
Animism is cognitively simpler and easier to trigger. It extends to a much broader range of things: plants, animals, weather, even microorganisms. Anthropomorphism is narrower, requiring someone to map specifically human traits like facial expressions, speech, or intentional deception onto a nonhuman entity. In practice, the two often overlap, but they operate as separate cognitive processes.
Does Animistic Thinking Disappear in Adults?
Piaget argued it should. Once children develop logical reasoning, they stop treating rocks and rivers as living things. But research tells a more complicated story. A study of 75 participants, ranging from teenagers to adults in their seventies, found that every age group made animism errors, judging at least some nonliving items as alive. Animism errors actually increased with age, directly contradicting Piaget’s prediction that the tendency fades permanently.
The researchers attributed this partly to declining fluid intelligence (the type of reasoning used to solve novel problems) and partly to the fact that for most people, the biological distinction between living and nonliving things is peripheral knowledge they rarely need to use precisely. If you’re not a biologist, the exact criteria for “alive” aren’t something you think about on a Tuesday afternoon, so your intuitive judgments stay loose.
There’s also a broader evolutionary argument. Some psychologists, notably Stewart Guthrie, have proposed that animistic thinking is the result of a survival strategy baked into human cognition. Our brains are wired to detect agents, things that might act on us, even when none are present. Hearing a rustling in the bushes and assuming “something alive is there” is a safer mistake than assuming nothing and getting attacked. Animism, in this view, is the occasional misfire of a generally useful detection system. Adults still have that system running, which is why we instinctively flinch at a shadow or feel like our computer is “deliberately” crashing at the worst moment.
The Psychological vs. Anthropological Meaning
If you search “animism” outside of psychology, you’ll find a very different definition. In anthropology and religious studies, animism refers to a worldview common to many indigenous cultures in which elements of the natural environment, such as trees, rivers, mountains, and animals, are understood as persons with whom humans can have social relationships. The Victorian anthropologist E. B. Tylor defined it as a belief in the “animation of all nature.”
The psychological definition treats animism as a cognitive error that children grow out of. The anthropological definition describes a way of relating to the world that entire cultures practice as adults, not because they lack logical reasoning, but because they experience the natural environment as fundamentally relational and communicative. Newer anthropological work pushes back against framing this as a “mistake,” arguing that the animist who says whales are persons isn’t making a factual claim to be tested. They’re describing a way of engaging with whales, knowing how to relate to them rather than knowing facts about them.
This tension matters because Piaget’s framework implicitly positions animism as primitive, something to outgrow. Contemporary researchers in early childhood education have challenged this, proposing that children’s animistic tendencies aren’t just cognitive immaturity but can reflect curiosity, care, and a kind of imaginative attentiveness to the nonhuman world. One recent research project involving children ages 2 to 8 described their playful, speculative engagement with objects as “enchanted animism,” framing it as a space for wonder and immersion rather than a developmental limitation.
Why Children Grow Out of It (Mostly)
The shift away from animistic thinking happens gradually and is tied to broader cognitive changes. Around age 7 or 8, children develop what Piaget called conservation, the understanding that properties of objects stay the same even when their appearance changes. They also become less egocentric, gaining the ability to mentally step outside their own viewpoint. These changes make it possible to reason about objects more objectively, recognizing that a toy doesn’t have feelings just because you do.
But “growing out of it” is a matter of degree. Children learn to correctly classify most things as living or nonliving, but the underlying cognitive tendency to detect life and agency in ambiguous situations never fully switches off. It’s why horror movies work, why people talk to their plants, and why you’ve probably apologized to a piece of furniture after bumping into it.