Scientific papers

This section looks at papers from prestigious journals, with take-home messages for teachers.

We invite you first to listen to this talk, ‘Embracing Dyslexia’ (2014) on YouTube by Dr Ken Pugh, a renowned cognitive neuroscientist.

Literacy and visual processing

A good reader of alphabetic scripts must be able to traverse a string of letters and discriminate them quickly and accurately. She has to suppress mirror invariance, e.g. recognize that ‘b’ is different from ‘d’, and also to detect repetition when the target is correctly identified, the latter being a measure of how accurately the stimulus is perceived.

Heading up an international team, Felipe Pegado* assessed whether competence in reading was correlated with a broad enhancement of early visual processing of stimuli including letter strings, faces, tools, houses and abstract patterns.

The group used both functional magnetic resonance imaging and electroencephalographic techniques, which record event-related potentials, on the same subjects from Portugal and Brazil, whose abilities ranged across early-schooled good readers, ex-illiterates and illiterates. The first technique is used to find where specific brain activity is located and the second to show the precise timing of events.

They found that in good readers, automatic mirror discrimination is present for letters and words in a site called the visual word form area (which Dehaene has dubbed the ‘brain’s letter-box’) on the left side of the brain, in what is called the occipitotemporal region (towards the back and lower part of the brain, near the visual cortex at the very back). This region is robustly activated when orthographic stimuli (i.e. letters) are presented to literate subjects, regardless of the writing system. Reading practice enhances this activation even in children with dyslexia. In contrast, activation of the right occipitotemporal region was heightened when viewing faces. This suggests that as reading develops there is a sort of competition for processing space in the ‘higher’ regions of cognition, and that as the left area develops for reading, the discrimination of faces is displaced to the homologous region on the right.

An interesting finding was that reading fluency – i.e. accurate and fast responses to letter strings – also increased sensitivity to other visual stimuli such as pictures of houses and tools.

As noted, the electroencephalographic technique can measure the precise points in time at which specific events in the brain occur. It was found that the elevated occipitotemporal activity took place early in the visual response sequence, when automatic processing predominates. This explains why it is important that reading should be practised until it is automatized.

There is an encouraging final note: even when literacy is acquired by adults the new skill has an impact on the precision of visual processing of varied stimuli.

Take-home message for teachers:

This paper supports advice that one learns to read by reading, just as the best way to play good tennis is to play tennis.

Fast, accurate and automatic processing of letter strings profoundly changes the brain. Even older illiterate people can acquire the skill.

Teachers and designers of books should do everything they can, in their teaching and in the display of texts for reading, to encourage the specific motor, perceptual and processing skills involved in reading. Features of good display include uncluttered layouts and a graded scheme of letter sizing and spacing as the child progresses.

Commentary reviewed by Dr Jenny Thomson, University of Sheffield, UK

*‘Timing the impact of literacy on visual processing’, Proceedings of the National Academy of Sciences, 9 December 2014

Pegado, F., Comerlato, E., Ventura, F., Jobert, A., Nakamura, K., Buiatti, M., Ventura, P., Dehaene-Lambertz, G., Kolinsky, R., Morais, J., Braga, L.W., Cohen, L. and Dehaene, S.


Music and dyslexia

– a brief review by Dr Vincent Goetry, Teacher-training Consultant to Dyslexia International

A growing body of studies shows that learning a musical instrument, with instruction specifically tailored to the needs of learners with dyslexia, significantly improves their abilities in reading and writing.

Musical perception (general and melodic) predicts reading ability amongst dyslexic learners aged between 8 and 10 years (Cogo-Moreira et al., 2013a). Learning to play music significantly enhances reading performance in the same group (Cogo-Moreira et al., 2013b).

Several other studies suggest that music therapy, combined with an appropriate support programme for literacy development, improves performance in reading and spelling (see Habib & Besson, 2008). According to Habib, learning to play music develops the connections between the visual and the auditory areas of the brain – that is, seeing the instrument and hearing the music. Exactly the same connections are used in the acquisition of literacy. When reading, we pass from the visual modality, seeing the word, to the auditory modality, hearing the word aloud or silently. When we write, we pass from the auditory modality, listening to the word, to the visual modality, seeing the words as we spell them.

What is more, reading, like playing music, activates a motor area in the frontal lobe of the left hemisphere, Broca’s area, whose connections to the posterior (sensory) areas are dysfunctional in people with dyslexia. The motor component of learning to play a musical instrument seems to be a critical component of the therapeutic effect of music for people with this deficit (Habib, personal communication).

Take-home message for teachers:

Apart from the intrinsic value of music education there is a lot of evidence that it will also help children struggling with reading and writing.



Beat synchronization and reading readiness in preschoolers

There is evidence that children who struggle to synchronize a beat also struggle to read and write.

A group of researchers led by Kali Woodruff Carr*, under the supervision of Nina Kraus, who is at Northwestern University, USA, and is well known for her work in music remediation, has published a paper in the Proceedings of the National Academy of Sciences.

They asked 35 preschoolers between the ages of three and four to listen to drum beats paced at the average rate of phoneme production in speech. (Phonemes are the smallest units of speech that make a difference to meaning; for example ‘t’ and ‘p’ are phonemes in ‘mat’ and ‘map’.) The children were then asked to reproduce the beat by drumming themselves.

The team concludes that ‘by establishing links between beat keeping, neural precision, and reading readiness, our results provide an integrated framework that offers insights into the preparative biology of reading’.

Take-home message for teachers:

Verbal games, the reciting of nursery rhymes and rhythmic clapping are common activities within a nursery school or in a home which offers a stimulating environment. This paper shows why, and also hints at a practical intervention for very young children who may be showing risk factors for later difficulties in reading and writing.

* Woodruff Carr, K., White-Schwoch, T., Tierney, A.T., Strait, D.L. and Kraus, N. (2014), ‘Beat synchronization predicts neural speech encoding and reading readiness in preschoolers’, PNAS, 111 (40), 14559–14564


Phonetic perception in infants

It has been shown that when adults listen to speech they also activate areas of the brain that generate the motor patterns acquired to produce speech.

Patricia Kuhl and her colleagues* used a non-invasive imaging technique to look at both auditory and motor brain activation in babies at 7 months and again at about one year, ages which straddle the period when the child switches from receptivity to any language to a finer receptivity to the ambient native language. The infants heard sounds in both their native language and a nonnative one.

They found that the younger infants activated auditory areas (in the superior temporal lobe) as well as motor brain areas (Broca’s area in the frontal lobe, and in the cerebellum) in response to speech, equivalently for native and nonnative syllables. However, in 11- and 12-month-old infants, native speech activated auditory brain areas to a greater degree than nonnative, whereas nonnative speech activated motor brain areas to a greater degree than native speech.

The extraordinary ability of one-year-olds to produce vocal patterns of the native language that can be deciphered by their parents has not yet been matched by computers, although these machines improve when trained on ‘motherese’. Scientists are thus challenged to identify the mechanisms underlying this early phonetic learning.

Behavioural studies have yielded two approaches. One is ‘computational’: the phonetic perception of infants is altered by the distributional (statistical) frequency of the speech sounds they hear as they respond to native speech. The other is social: infants exposed to a new language at 9 months in play sessions with a live tutor learn to discriminate foreign-language sounds at levels equivalent to infants exposed to that language from birth; however, no learning occurs if the same material on the same schedule is presented via video.

Turning back to motor theories of speech perception, these have a very long and somewhat controversial history, but some current ideas are gaining favour. ‘Analysis by Synthesis’ proposes that speech perception involves a dual ‘hypothesize and test’ process. Bottom-up analysis and top-down synthesis jointly and actively constrain perceptual interpretation. Listeners generate an internal model of the motor commands needed to produce the signal: in essence, a ‘guess’ or prediction about the input. This hypothesis, based on listeners’ experience of producing speech, is tested against the sensory signals received back. (This accords well with learning other types of motor skill, hitting a tennis ball, say: you work out what you hope to do, do it, and then assess the result in order to improve subsequent efforts.)

When adults listen to speech they activate several areas of the brain but the pattern of activation is different when they are listening to native and nonnative speech. For example, a single speech contrast (/r-l/) was tested in adults for whom the contrast was native (English) versus nonnative (Japanese). Greater activity in auditory areas was reported when the signal was a native contrast, and greater activity in motor brain areas when the same signal was a nonnative phonetic contrast. This implies that, when listening to speech, adults generate internal motor models of speech based on their experience producing it.

One of the experiments used a phonetic unit common to English and Spanish as the standard sound and two deviant sounds, one exclusive to English and the other to Spanish. At 7 months infants’ responses to native and nonnative contrasts were equivalent in both auditory areas and motor brain areas whereas at 11 months there was greater activation in motor brain areas for nonnative speech, matching the pattern for adults.

The impact of ‘motherese’ on language acquisition

Kuhl’s laboratory had previously emphasized the role of ‘motherese’ – ‘the acoustically exaggerated, clear speech produced by adults when they address young infants’ – in enhancing internal motor models of speech. Her group has described a ‘Social Gating’ hypothesis, having found dramatic differences in learning: when exposed to a novel language, infants showed robust learning in live, face-to-face situations, but learning was ‘non-existent via television’.

Take-home message for teachers:

This study does not address dyslexia directly but it reinforces the importance of very early, live, child-parent interaction well before the child attends nursery school. It also bolsters the case for oral-kinaesthetic discovery shown in our films to help children struggling with literacy.

Commentary reviewed by Dr Jenny Thomson, University of Sheffield, UK

*Open access paper, ‘Infants’ brain responses to speech suggest Analysis by Synthesis’

Patricia K. Kuhl, Rey R. Ramírez, Alexis Bosseler, Jo-Fu Lotus Lin, and Toshiaki Imada, Proceedings of the National Academy of Sciences, August 5, 2014, vol. 111, no. 31, 11238

http://www.pnas.org/content/111/31/11238.full.pdf


Faulty brain connections?

In order to read fast and accurately, the reader must be able rapidly to access the ‘representations’ she has built in the brain of units of sound called phonemes, for example the separate phonemes heard in ‘ba’ and ‘da’. These representations must be robust and separable from surrounding sounds and/or sounds that are nearly the same. At the same time the reader must ignore all the variations of loudness, accent, tone, speaker’s sex, etc. that are most commonly encountered when listening.

The question arises: does a poor reader not encode these phonemes adequately in the first place or does she have trouble in accessing them fast so that they can be linked to the output programmes of the brain in order to produce the right sounds when speaking?

A team at the Catholic University of Leuven, Belgium, led by Dr Bart Boets claims to have the answer1. They collected brain scans from 23 adults diagnosed as dyslexic and 22 matched normal readers. The subjects were asked to detect the differences between pairs like those above, randomized in order and sequence. The results showed no significant difference in the accuracy of detection between the two groups, and the team concluded that phonetic representations are intact in adult dyslexics.

The team then assessed the functional connectivity between different parts of the brain, expressed as time to identification. They found a difference, but only in specific paths. One part involved is located in the frontal lobe of the brain, on the lower left side. It is engaged in sensory-motor integration, that is, producing the muscular output of speech, as well as in processing the stream of phonological data. But this part, known as Broca’s area, does not hold the phonetic representations per se; these lie in the primary and secondary auditory cortices on both sides of the brain, in a region called Wernicke’s area.

In a surface view of the cortex, these two areas can be seen to be separated by a large slanting fissure.

The connection on the left side is effected by a bundle of nerve fibres, or fasciculus, deeper in the brain.

Using a different scanning technique, the team showed that in dyslexics the integrity of the white matter of the fasciculus was weaker, and that this explained the weaker intercortical connection.

The team stresses that these findings do not challenge the prevailing hypothesis of a phonological deficit in people with dyslexia but point to an explanation of the underlying cause.

It should be said that not all scientists agree. One is not convinced that the ‘hot spots’, so to speak – the very precise areas for individual phonemes – were in fact measured. Another objected to the ‘coarse’ nature of the chosen phoneme pairings2.

Nevertheless, these results are compatible with the hypothesis of Ramus and Szenkovits (2008)3 that phonological representations in dyslexics are intact but less accessible than in non-dyslexics. They do not support the view that phonological representations are degraded in dyslexics.

Take-home message for teachers:

This work provides an explanation of why fast processing can be so difficult for people with dyslexia, and why accommodations of extra time should be given in classroom and exam situations. Dyslexics should have at least one-third additional time compared to non-dyslexics.

1. Boets, B. et al. (2013), ‘Intact But Less Accessible Phonetic Representations in Adults with Dyslexia’, Science 342, 1251
2. Underwood, E., ‘Faulty Brain Connections in Dyslexia?’, ibid., p. 1158
3. Ramus, F. and Szenkovits, G. (2008), ‘What phonological deficit?’, Quarterly Journal of Experimental Psychology 61, 129–141

Both images from Wikipedia, accessed 22-01-2014


Handedness and dyslexia

It has long been known that a child’s inability to settle on one hand is a potential ‘risk factor’ for dyslexia, and that one of the markers for a person with dyslexia is difficulty in telling left from right or navigating in space, a deficit sometimes allied with poorer motor skills.

Left/right asymmetry is laid down early in embryonic development and is patterned by molecules called morphogens. One of these regulates the sequence of gene transcription for the cilia (fine hair-like protrusions from the cell), which literally sweep further developmental transcription factors to one side so that the body develops asymmetrically. Contrary to superficial appearance, the body is not symmetric: the heart is usually on the left side and the two lungs have different sizes. The brain is also asymmetric in its fine structure.

This could be important in developmental dyslexia because early studies by Shaywitz, Galaburda and others clearly demonstrated that the typical brain in people with dyslexia is more symmetric. But Dehaene and Stein postulate that such symmetrical development – which so impedes the dyslexic in the conventional classroom – actually contributes to a better ability in visualizing objects in three dimensions, the possession of a more holistic approach to cognition, and creativity.

A paper published in the Public Library of Science (PLOS Genetics)* hints at a possible link between some genes and abnormalities of asymmetric development. (The EU research programme ‘Neurodys’ had contributed to the screening of the preliminary studies.)

A large team led by the University of St Andrews, UK, carried out a genome-wide association study to look for variations in the DNA ‘code’ that would correlate with phenotypic variation in individuals. This approach has been successful in mapping many genes involved in diseases. The phenotypic, or behavioural, part of the study measured skill in manipulating pegs on a board, comparing those with and without reading difficulties.

The researchers say that the precise relationship between handedness, cerebral asymmetry and neurodevelopmental disorders like dyslexia currently remains to be determined. However, their results ‘implicated some of the same genes involved in body asymmetry in handedness, and therefore the development of cerebral asymmetry in both individuals with reading difficulties and the general population’.

They also ‘propose that handedness is under the control of many variants, some of which are in genes that also contribute to the determination of body left/right asymmetry’.

One of them, PCSK6, regulates the morphogen NODAL in the development of left/right asymmetry in mice. PCSK6 and others determine the development of left/right asymmetry across a huge group of animals, from snails to vertebrates, which means that they have been evolutionarily conserved for millions of years. When PCSK6 is disrupted in mice, an abnormal distribution of the visceral organs is observed.

It is now known that several genes are consistently associated with dyslexia. Interestingly, they seem to be involved in regulating cilia development and function. ‘Ciliopathies’, a class of diseases caused by disruption of cilia function, are often associated with structural asymmetry defects like situs inversus. These latest results suggest that a link between handedness and dyslexia might exist, but it is likely to be very complex and could be mediated by cilia function.

Commentary reviewed by Dr Silvia Paracchini of the School of Medicine, University of St Andrews, UK

Take-home message for teachers:

The authors of this paper are circumspect but have certainly found some interesting correlations between dyslexia and body asymmetry. Meanwhile Dyslexia International’s advice would be to worry less about handedness and instead to continue using proven multisensory techniques and developing cursive handwriting on lined paper (see Section 3 of the online course).

* ‘Common Variants in Left/Right Asymmetry Genes and Pathways Are Associated with Relative Hand Skill’, 2013, PLOS Genetics: doi.org/nsb

William M. Brandler, Andrew P. Morris, David M. Evans, Thomas S. Scerri, John P. Kemp, Nicholas J. Timpson, Beate St Pourcain, George Davey Smith, Susan M. Ring, John Stein, Anthony P. Monaco, Joel B. Talcott, Simon E. Fisher, Silvia Paracchini

Professor John Stein is a member of Dyslexia International’s Scientific Advisory Panel.


Reading in Western scripts and in Chinese … and the importance of hand movements

It has long been debated whether the neural circuits for reading in, say, French, which is an alphabetic language, and in Chinese, which is a logographic (or ideographic, more ‘picture’-based) language, are radically different.

Previous studies had suggested that there was a radical difference, but Stanislas Dehaene and colleagues have published a paper in the journal Proceedings of the National Academy of Sciences*, US, indicating otherwise, provided that cross-cultural confounds are controlled for. One such confound is the great memory load needed for Chinese characters. More precisely, they used dynamic handwritten stimuli in a cursive style rather than letter strings in a static roman typography such as the one you are reading now. It is suggested that a fast neural pathway automatically recovers the intended gestures from visually perceived handwritten traces in both languages.

At the microscopic level, as would be expected, restricted neural networks are found that are better tuned to the specific graphic shapes and sounds of the different languages, but at the macroscopic level the findings suggest universal mechanisms in pre-existing innate networks. In fact there are two networks: one for the analysis of shape and another which decodes motor gestures. A motor memory for writing in Chinese is not specific to that culture, because it also plays a general role in the acquisition of literacy in French; however, it is more activated in Chinese than in French, presumably because Chinese readers have to memorize a huge number of characters. The environment, that is teaching, then fine-tunes and balances these networks specifically for each language as literacy is taught. Culture does matter, but the brain is plastic and adaptable, significantly more so in children.

In both French and Chinese a character may be presented whole or as a compound of dynamic movements. For instance, ‘a’ is in fact written with a forward movement first to form the arc and then a backward movement to complete the upright. Chinese characters are also formed with forward and backward movements, but the researchers were attempting to discover what happens when this effect is shown dynamically rather than when the complete character is presented statically.

The subjects were in France, near Paris, and in Taiwan, where a modified form of Mandarin Chinese is spoken. Complex combinations of tests were run in two types of experiment. The first, a behavioural test, used a research paradigm known as ‘priming’. A stimulus is flashed briefly on a screen, followed by a mask, then a target character or word is presented statically or dynamically and reaction times are recorded. A quicker reaction time shows that the target has been more swiftly detected. The second type deployed functional magnetic resonance imaging (fMRI) to capture the areas of the brain activated by specific tasks. This method is used to determine not the speed of the reaction but where it is taking place.

The conclusions of the studies are as follows.

There was support for the view that the brain uses motor patterns of hand-writing gestures in order to perceive the shapes of letters.

There is a distributed network of regions in the brain which is fast and automatic, and which is activated in fluent reading. It does not differ across cultures. Interestingly, elements of this network are also found in the perception of hand movements in non-human primates, indicating a long lineage for this ability.

In summary, when the expert reading network has been well trained it actually uses two distinct pathways in all cultures, with fine-grained modifications depending on the orthography. There is a visual word-form area, ‘reading by eye’, and another specific area that decodes gestures, ‘reading by hand’. The authors write:

‘… recent developmental data show that reading acquisition is facilitated when young children are taught to write or finger-trace the letter shapes compared with classical grapho-phonemic teaching without a haptic [touch] component … Conversely, fMRI of normal and dyslexic children also suggests that reading difficulties lead to a greater reliance on [a specific area in the left hemisphere], suggesting partial compensation through the gesture system …’

Take-home message for teachers:

These experiments lend strong support for multisensory methods of teaching, especially for children with dyslexia.

* Proceedings of the National Academy of Sciences, December 12, 2012, Volume 109, pages 20762–20767
