Lecture 13

Language

Goals

  • To reconnect with concepts of localization of function, modularity, and connectionism in the context of language, and to introduce the topic of hemispheric specialization.
  • To introduce the language system from a neuropsychological perspective.
  • To discuss the influence of brain injuries upon language processing.
  • To discuss the plasticity in the neural basis of language acquisition.

Topic slide

Paul Broca (1824-1880) was a French neurologist. His 1861 study of two patients whose strokes led to an inability to speak fluently despite preserved comprehension of spoken language had a profound effect upon the concept of localization of function in the brain. His patients had lesions in the inferior left frontal lobe. Patients with this expressive, or non-fluent, aphasia are sometimes said to have a "Broca's aphasia".

Carl Wernicke (1848-1905) was a German physician who described an aphasia complementary to Broca's. In Wernicke's aphasia, speech production is intact, but auditory comprehension is severely disturbed. Wernicke wrote his book "The Symptom Complex of Aphasia" in 1874, at age 26.

Reading

  • Reading: PN6 Chapter 33

Brief history

Brain lesions

The first writings on the relationship between brain and language can be found in the Edwin Smith papyrus, a 1700 BC copy of a 3000 BC manuscript written by Egyptian battle surgeons. This document describes cases of paralysis and loss of language following brain injuries sustained in battle. These functional deficits are described quite matter-of-factly. For example, the notes for one patient with a brain injury contain a reference to aphasia: "He speaketh not to you".

Despite these historical data, and for reasons we discussed previously, the relationship between localized brain injuries and specific behavioral deficits was not pursued scientifically for many centuries. In the realm of language, this work began with Marc Dax and, later and more famously, Paul Broca.

Paul Broca's patient 'Tan' suffered (probably from a stroke) a lesion to the left lateral frontal lobe that left him unable to speak (1861).

Carl Wernicke first described an aphasia complementary to Broca's, in which speech production was intact but auditory comprehension was severely disturbed (1874).

We will return below in more detail to these different forms of impairment in language production and comprehension.

Disconnection

The lesion results have led some to conceptualize language as a modular process instantiated in different brain regions. Some scientists have emphasized the connections between modular brain regions in their models of language function. American neurologist Norman Geschwind explained certain symptoms that occurred in patients with stroke-related language comprehension and speech deficits as disconnexions (yes, he used the 'x' in his spelling) between different language modules in the brain.

Brain stimulation

Wilder Penfield (1891-1976) electrically stimulated the exposed cortex of human patients undergoing neurosurgery, and mapped the sensory and motor regions of the brain (the homunculus) as well as other brain regions involved in language and memory. His methods are still used today to protect these regions from inadvertent damage during neurosurgical procedures.

I presented results from two patients to illustrate the use of stimulation.

  • Broca's area was defined by stimulation that caused 'speech arrest' despite the fact that the patient could still stick out the tongue and move the mouth.
  • Wernicke's area was harder to define. Stimulation in and about the posterior temporal lobe caused deficits in confrontational naming (inability to name an object) or paraphasias ('cup' for 'cat'), as well as comprehension deficits (e.g., inability to complete a sentence spoken by the investigator, inability to carry out simple commands).

Functional neuroimaging

The ability to observe activation of discrete brain regions in neurologically intact humans engaged in language tasks has had a large impact on the neuroscience of language. I provided in lecture several examples of functional MRI studies of language. Functional MRI images the changes in blood oxygen levels that occur in discrete regions of the brain as a consequence of a sensory, motor, or cognitive task. You can read more about functional MRI in the notes for Lecture 03.02.

Language

Visual language processing

Visual processing of language – reading – is a relatively new human capacity that is dependent upon learning. The first evidence for written language is from ~3200 BC, and there is evidence for the use of token symbols from ~10,000 BC. There are many languages in the world that are spoken and not written, and there are many individuals with normal brain function who cannot read. However, there do appear to be regions of the visual brain that play a particular role in reading. Because I discussed visual deficits following brain damage in humans, I used the example of alexia to introduce language processing and to bridge the vision and language lectures.

Pure alexia (or alexia without agraphia)

I briefly described pure alexia in a prior lecture on agnosias, as Farah considered pure alexia to be a form of ventral simultanagnosia.

Pure alexia is characterized as follows:

  • Letter-by-letter reading and finger tracing.
  • Patients can write, but then they can't read what they just wrote.
  • Pure alexia usually involves lesions to left occipitotemporal cortex.
  • Pure alexia has been characterized as a disconnexion syndrome, whereby the percept of letters is disconnected from higher level regions (presumed to be located in the left angular gyrus) that connect letters to meaning.

Here is a video example of a patient with alexia without agraphia. The video is of low quality, so you will need to watch and listen carefully. Note, in particular, how the patient draws individual letters with her finger to help her read.

Alexia with agraphia

Pure alexia is relatively rare, but alexia often co-occurs with an inability to write (agraphia). Alexia with agraphia is usually associated with left angular gyrus lesions.

Auditory language processing

We have previously discussed the auditory system, beginning with the transduction of sound waves into a neural code in the cochlea and ascending through the medial geniculate thalamus to auditory cortex. Auditory cortex is located in the temporal lobe. Primary auditory cortex (A1) is located on Heschl's gyrus and is organized tonotopically, just like the basilar membrane of the cochlea. A1 is surrounded by auditory 'belt' cortex, i.e., higher-level auditory cortex.

Spoken language deeply involves the auditory system. However, language is not only auditory. Language involves the following:

  • Speech reception (auditory)
  • Speech production (motor)
  • Prosody
  • Reading (visual, ~5000-6000 years of reading/writing)
  • Writing (motor)
  • Semantics and meaning (memory)

We are taking a functional neuroanatomical approach to language, and thus are NOT considering linguistics, nor will we go into detail about the expressly cognitive aspects of language.

Levels of analysis in language

Spoken language can be conceptualized as occurring in stages of analysis:

  • Acoustic envelope
  • Phonemes
  • Syllables
  • Acoustic word form (phonology)

Written language can also be conceptualized as occurring in stages of analysis:

  • Visual analysis
  • Orthographic input code
  • Visual word form (orthography)

From this point forward, the conceptual system for word processing is considered the same regardless of whether the word enters the system from acoustic signals (hearing) or visual signals (reading).

  • Mental lexicon (is this a word?)
  • Semantic information: what does the word mean?
  • Syntactic information: how to use the word with other words in sentences.

The various levels of processing specified for language processing are the content of courses in cognition and in linguistics. I will focus on the neurological basis of language processing using the methods we have discussed in prior lectures.

Acoustic properties of language

The acoustic envelope of a phrase does not break at word boundaries. Some words correspond to clearly distinct acoustic envelopes, while others do not.

The basic building blocks of auditory speech are phonemes; there are about 200 phonemes across the world's languages, and English uses about 42.

Infants appear to be able to differentiate all phonemes, but over time there is a perceptual narrowing of phoneme reception and older infants lose the ability to discriminate phonemes that are not part of their native language (e.g., 'l' and 'r' for Japanese speakers). This was demonstrated in lecture using ERP evidence.

Hemisphere differences

Acoustic stimuli

There are hemispheric differences in auditory processing at the level of primary and belt auditory cortex. Investigators have also taken advantage of the fact that the right ear projects primarily (but not exclusively) to the left hemisphere, while the left ear projects primarily (but not exclusively) to the right hemisphere. These studies have revealed:

  • Right ear advantage for words
  • Left ear advantage for melodies

In my auditory system lecture, I presented a case study of a patient with acquired amusia (an inability to perceive music as music; e.g., singers seem as if they are talking).

In today's lecture, I presented fMRI evidence from an experiment from the McCarthy lab of subjects listening to sentence segments spoken in English (native) and Turkish (not understood) by the same speaker. The contrast of English vs. Turkish showed consistent differences in lateral anterior frontal regions (Broca's area) and posterior lateral temporal lobe (Wernicke's area) in the left hemisphere.

I also presented an fMRI study from the McCarthy lab showing left occipitotemporal activation by visual word form stimuli.

Written language

Investigators have taken advantage of the fact that the left visual field projects to the right occipital cortex, and the right visual field projects to the left occipital cortex. Studies with brief presentations of words to the left and right visual fields have demonstrated a right visual field – left hemisphere (RVF–LH) advantage for words.

I will have more to say about visual field studies of other stimulus types (e.g., faces) in my next lecture focused more generally upon hemispheric specialization.

Language disorders

I presented a number of language deficits that occurred following strokes or other brain damage:

  • Broca’s, or nonfluent, aphasia
  • Pure word deafness
  • Wernicke’s, or fluent, aphasia
  • Anomia
  • Conduction aphasia
  • Alexia with agraphia
  • Alexia without agraphia (pure alexia)

Broca's, or nonfluent, aphasia

  • Non-fluent, expressive aphasia: difficulty speaking; speech is slow and deliberate
  • Agrammatical speech
    • Often telegraphic – just content words (e.g., 'son college')
  • Difficulty with complex syntax (e.g., 'the boy was hit by the girl')
  • Similar to Broca's patient 'Tan'
  • Can result from a left-sided middle cerebral artery stroke

Here is a video in which Norman Geschwind examines a patient with Broca's aphasia and demonstrates the patient's difficulty with syntax.

Here is a more recent video in which a patient with Broca’s aphasia speaks with a therapist.

Wernicke's, or fluent, aphasia

  • The patient is fluent; i.e., unlike the Broca's aphasic, words can easily be produced with appropriate prosody, but the speech can be nonsensical and devoid of meaning. Much of the speech can consist of paraphasias.
  • Patients can have severe impairments in understanding speech, i.e., auditory comprehension deficits.
  • Patients can be unaware of their deficits.
    • This lack of awareness of a deficit is called anosognosia.
  • I showed a picture of posterior stroke damage to emphasize how large the lesions can be.
  • I showed a very wordy slide that describes the different types of paraphasias that can be found in patients with Wernicke's, or fluent, aphasia.

Here is a particularly good example of a patient with Wernicke's, or fluent, aphasia.

Anomia

  • The patient does not use the names of objects and often substitutes 'that' or 'this'.
  • Circumlocution
  • I presented the 'Cookie jar' transcript to demonstrate anomic speech.

Here is a short video demonstrating anomic aphasia.

I showed a video example of an anomic patient trying to find the words 'saw' and 'hammer'. Here is a similar video with better production quality. In this video, a daughter is trying to help her anomic mother practice strategies for word finding.

Conduction aphasia

Conduction aphasia was described by Norman Geschwind as an example of a disconnexion syndrome. Unlike in the other aphasias, spontaneous speech is much better preserved, and auditory comprehension is typically good.

However, the hallmark of the disorder is difficulty repeating something the experimenter says: the patient responds with paraphasias.

Geschwind thought conduction aphasia was caused by disruption of the fibers connecting Wernicke's and Broca's areas (which course through the angular gyrus).

Here is a video clip of Nobel Prize winner Eric Kandel discussing a recording of a woman with conduction aphasia. One of the very interesting aspects of this video is that the woman makes many paraphasias in trying to follow instructions.

Strokes, hemispherectomy and language development

What happens to language development in those individuals who endure perinatal strokes affecting the left hemisphere?

I discussed that in rare circumstances – for example, children or infants with Rasmussen’s Encephalitis – an entire hemisphere is removed.

In many (but not all) young children in whom this surgery removes the left hemisphere, language appears to develop close to normally in the right hemisphere. Some patients also demonstrate residual motor function in the limbs contralateral to the removed hemisphere (they can walk and use the hand, although there are motor deficits).

We discussed the 'crowding' hypothesis, which posits that there should be some loss of 'right hemisphere' spatial skills when language is crowded into the right hemisphere. There appears to be no good evidence for this hypothesis.

Videos

Prerecorded lectures for fall 2020

The video embedded below was recorded live in fall 2018.