“Once you learn to read, you will be forever free.” ~ Frederick Douglass
Like so many other things our brains do with seemingly little effort, we tend to take the remarkable act of reading for granted. Think about what happens while you read. Your brain has to recognize arbitrary squiggles on a screen or page as letters, regardless of the quirks of cursive or printed handwriting, letter case, font, or size. It must then bind those letters into words and link them with their stored meanings so they can be understood. And it does so even if thelettersallruntogether with no spaces or punctuation, as they did in early writing. Plus, all of this must occur instantaneously and smoothly, letter after letter and word after word, for fluent reading and comprehension. Impressed?
Another fascinating thing is that, unlike the ability to speak, reading is not innate. Babies are born wired to process language, and an infant exposed regularly to spoken language will learn to speak it. Learning to read, however, relies on instruction. This is thought to be because, unlike speech, reading is a relatively recent phenomenon. Writing, and hence reading, originated only about 5,000 years ago. The first books appeared around the year 23 B.C. Books remained rare and expensive until the invention of the printing press in the 15th century, and it was only after the Industrial Revolution in the 1800s that large populations around the world began to read.
Because reading is such a new skill, human brains do not come with dedicated reading regions or circuits. Fortunately, brains possess the incredible power of neuroplasticity – the ability to reorganize connections and circuits for new uses in response to learning or experience. In order to read, the human brain repurposes circuitry that developed over thousands of years for other needs. As Dehaene (2009) describes it, “The brain circuitry inherited from our primate evolution is co-opted to the task of recognizing printed words – the brain’s existing neural networks are ‘recycled’ for reading.”
So, reading ends up using parts of the brain involved in visual processing and language comprehension. You see letters via the primary visual cortex in the occipital lobe. That information is then relayed onward, and several brain regions work together to let you read: the temporal lobe (including Wernicke’s area), which recognizes letters, decodes sounds, links letters and words to their spoken equivalents, and processes language for comprehension; parts of the parietal lobe, which allow letter shapes to be assembled into words; Broca’s area in the inferior frontal lobe, which governs production of word sounds and speech; and several important white-matter pathways that connect these regions.
Studies suggest that proficient readers use a streamlined process: they rapidly recognize familiar words by their shape, like pictures, and by their sound, as if spoken aloud. As people learn to read, an area of the left temporal lobe called the Visual Word Form Area, which would otherwise be involved in recognizing faces, becomes specialized for word recognition and builds up a visual dictionary of words. Then, once a word image is recognized, Broca’s area activates as if generating the sound of the word, so the reader ‘hears’ it internally.
Interestingly, fMRI data show that the same brain areas and processes activate no matter what language people read, despite differences such as whether the writing system is alphabetic or not (e.g., Chinese). In addition, people with dyslexia show differences in the same brain areas anywhere in the world. It is yet another reminder that, despite our many apparent differences, we humans are much more alike than we think.
Post by: Nadia Fike
Read more:
1. Stanislas Dehaene (2009). Reading in the Brain: The New Science of How We Read. New York: Penguin.
2. Steven Roger Fischer (2003). A History of Reading. London: Reaktion Books.
3. Laurie S. Glezer et al. (2015). Adding Words to the Brain’s Visual Dictionary: Novel Word Learning Selectively Sharpens Orthographic Representations in the VWFA. J. Neurosci. 35(12): 4965–4972.
4. Lorenzo Magrassi et al. (2015). Sound representation in higher language areas during language generation. Proc Natl Acad Sci USA 112(6): 1868–1873.