[Pd] OT- FFT and human auditory cortex

Charles Henry czhenry at gmail.com
Wed May 24 18:25:53 CEST 2006

Okay, just had this material in a class this semester...and I've done
some research...
There has been a long-standing debate over functional specialization
of the two hemispheres.  One hypothesis is that the left hemisphere
deals with high-frequency information and the right with low-frequency
information.  This does not mean that auditory information is actually
divided disjointly between the hemispheres, but as a general theme in
lateralization, the left hemisphere resolves higher-frequency
information.
On this hypothesis, the left hemisphere can represent fine timing
information by having a fuller set of frequencies, while the right
hemisphere resolves pitch contour by attending more to slow changes in
frequency.
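
The tradeoff this hypothesis evokes is the familiar time/frequency
tradeoff of block-based analysis: a long FFT block resolves closely
spaced frequencies but smears timing, and a short block does the
opposite.  A minimal NumPy sketch (my own illustration of the DSP
tradeoff, not a model of cortex; all names and numbers here are made
up for the demo):

```python
import numpy as np

def peak_count(x, fs, fmin=900.0, fmax=1100.0, rel_thresh=0.2):
    """Count local maxima above rel_thresh * max in the band [fmin, fmax]."""
    mag = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    m = mag[(freqs >= fmin) & (freqs <= fmax)]
    return sum(1 for i in range(1, len(m) - 1)
               if m[i] > m[i - 1] and m[i] > m[i + 1]
               and m[i] > rel_thresh * m.max())

fs = 8000.0
def two_tones(n):
    # two sines only 8 Hz apart
    t = np.arange(n) / fs
    return np.sin(2 * np.pi * 1000 * t) + np.sin(2 * np.pi * 1008 * t)

long_peaks = peak_count(two_tones(8192), fs)   # long block: ~1 Hz bins
short_peaks = peak_count(two_tones(512), fs)   # short block: ~16 Hz bins
```

With the long block the two tones show up as two distinct spectral
peaks; with the short block they merge into a single hump, at the
benefit of much better time localization.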

But the neurological side doesn't really line up with this
hypothesis.  It's not something that can be boiled down to a single
principle.  What's really going on is that subcortical structures in
the auditory pathway project differentially to the right and left
hemispheres.
In an article by Liégeois-Chauvel et al. (2001, Annals of the New York
Academy of Sciences), intracerebrally recorded evoked potentials
(IEPs) were taken during a simple pitch experiment.  The IEPs showed
that the right hemisphere encodes pitch information tonotopically:
there were positionally distinct "signatures", like event-related
potentials, that varied with frequency.  In the left hemisphere there
was no tonotopic organization; the areas under study responded equally
across a wide band of frequencies.

While the coding in the auditory cortex is "like" an FFT, it is NOT an
FFT; there are many fine differences, and it is not clear-cut exactly
where functions along the auditory pathway are localized.  At
different stages in the pathway, frequencies can be tonotopically or
rate encoded.  The cochlea is actually a dynamic organ in and of
itself: not a passive system of resonators and transducers, but more
like a bank of critically tuned resonators at unstable equilibrium
(like a hair trigger).  The cochlea emits a train of phase-locked
pulses, transmitted by the auditory nerve, which carries both
tonotopically encoded and temporally encoded (fine timing)
information.  The cochlea can represent frequencies by temporal coding
(phase locking) up to around 5 kHz (the number seemed a little high to
me when I first read it; I would have guessed around 1 kHz).  As we
proceed up the auditory pathway, the ability to encode timing
decreases, so that at the cortical level timing encoding only pertains
to low frequencies, below roughly 100-200 Hz.  Hence, the primary
auditory cortex works mainly by rate encoding (i.e. the strength of a
tone is represented by the rate of neural action potentials, not by
the timing between them).
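
The "bank of critically tuned resonators" picture can be sketched as a
toy filterbank: a few two-pole resonators, each ringing hardest when
the input lands near its center frequency.  This is only a caricature
of the cochlea (no active gain, no unstable equilibrium), and the
function names, channel frequencies, and r value are my own choices:

```python
import numpy as np

def resonate(x, f0, fs, r=0.99):
    """Drive a simple two-pole resonator tuned to f0 with input x."""
    w = 2.0 * np.pi * f0 / fs
    a1, a2 = 2.0 * r * np.cos(w), -r * r
    y = np.zeros(len(x))
    for n in range(2, len(x)):           # start at 2 so y[n-2] is defined
        y[n] = x[n] + a1 * y[n - 1] + a2 * y[n - 2]
    return y

fs = 16000.0
t = np.arange(4096) / fs
tone = np.sin(2 * np.pi * 1000 * t)       # 1 kHz probe tone

centers = [250.0, 500.0, 1000.0, 2000.0, 4000.0]
energies = [np.sqrt(np.mean(resonate(tone, f, fs) ** 2)) for f in centers]
best = centers[int(np.argmax(energies))]  # channel that rings hardest
```

The channel tuned to 1 kHz carries far more energy than its neighbors,
which is a crude stand-in for place (tonotopic) coding: which channel
fires tells you the frequency.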

(I can't remember these too well at the moment; I just graduated this
spring and my brain's a little fried.  There may be more stages than I
list here.)
Okay, it goes cochlea -> cochlear nucleus (of the brainstem) ->
superior olivary complex (of the pons) -> lateral lemniscus ->
inferior colliculus (of the midbrain) -> medial geniculate nucleus (of
the thalamus) -> primary auditory cortex.  (The named areas occur on
both sides of the brain.  There is also, of course, a decussation,
with fibers crossing to the opposite side around the level of the
superior olivary complex.)

It's not entirely clear what each of these structures does or how it
contributes to specific functions of the auditory system.  There are
some differences in encoding between areas, such as in the inferior
colliculus.  The IC is organized into isofrequency layers, each layer
encoding a different band of frequencies (this is more "like" a
wavelet transform), and fine timing information there serves sound
localization (which also draws on eye-position information from the
superior colliculus).
There may be other information encoded at the level of the IC (I lean
strongly toward pitch in the IC).  Different kinds of information are
projected from the subcortical structures to the cortical level; it
just depends on what kind of processing is taking place and how it's
organized.
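
What makes the isofrequency layers "like" a wavelet transform is
constant-Q analysis: bandwidth grows in proportion to center
frequency, unlike the FFT's uniform bins.  A naive sketch (my own,
with octave-spaced bands standing in loosely for the layers; the Q
value and names are arbitrary):

```python
import numpy as np

def constant_q_mags(x, fs, centers, Q=17.0):
    """Naive constant-Q analysis: each band's window length is ~Q*fs/f,
    so its bandwidth scales with its center frequency."""
    out = []
    for f in centers:
        n = min(int(round(Q * fs / f)), len(x))   # shorter window as f rises
        t = np.arange(n) / fs
        kernel = np.hanning(n) * np.exp(-2j * np.pi * f * t)
        out.append(abs(np.dot(x[:n], kernel)) / n)  # normalized band magnitude
    return out

fs = 8000.0
t = np.arange(8000) / fs
tone = np.sin(2 * np.pi * 440 * t)               # A440 test tone

centers = [110.0, 220.0, 440.0, 880.0, 1760.0]   # octave-spaced bands
mags = constant_q_mags(tone, fs, centers)
best = centers[int(np.argmax(mags))]
```

Each band gets its own analysis length, so low bands integrate long
(fine frequency, coarse time) and high bands integrate short (coarse
frequency, fine time) — the wavelet-style tradeoff, as opposed to the
FFT's single block size for everything.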

But the point (there IS a point) is that there are many different
representations of frequency information along the auditory pathway.
Each structure named does some kind of processing and passes the
result along to the next structure.  Ultimately, sound is encoded in
several different ways up to and including the primary auditory
cortex.  It is "like" an FFT, but it is also "like" a wavelet
transform, and it is also "like" a bank of hair-trigger resonators.


On 5/24/06, Chuckk Hubbard <badmuthahubbard at gmail.com> wrote:
> http://cercor.oxfordjournals.org/cgi/content/full/11/10/946
> I came across this fact researching my final paper for Perception
> class: the left auditory cortex is known to resolve temporal changes
> in sound better, while the right auditory cortex resolves tonal and
> harmonic information more finely.  As soon as I read this I thought of
> FFT.  Could it be the difference between the brain hemispheres is
> related to the auditory cortices having different block sizes?
> -Chuckk
> --
> "Far and away the best prize that life has to offer is the chance to
> work hard at work worth doing."
> -Theodore Roosevelt
