You’re driving down a long stretch of highway in the middle of the night. It’s hard to stay awake.

You turn on the radio and tune into your favorite station: 103.3 FM. “Changes” by David Bowie is playing. You stick with the same station and hear another song: “Don’t Go Breaking My Heart,” a duet by Elton John and Kiki Dee. Classic.

Then you want something a little different. You switch to 93.7 FM. Pachelbel’s “Canon” is playing. A massive shift from 103.3 FM, but you like it. You stick with this station and you keep on cruisin’.

But wait, is it actually a massive shift? 


What makes music similar? What makes “Canon” so different from “Don’t Go Breaking My Heart”? In what ways do “Canon” and “Changes” share an unexpected resemblance?

What is the psychological basis for the categories people conceive of while listening to music? These are questions that researchers in the UC Santa Barbara Music Cognition Lab have been trying to answer experimentally.

Janet Bourne, an assistant professor in UCSB’s Music Department, held a seminar on Oct. 23 where she discussed the lab’s recent work. 

“Music theorists assume that listeners categorize musical themes using what they call structural features. Music theorists say that listeners categorize music based on pitch, based on the melody notes, based on rhythm or based on harmony — all of these parameters and features that music theorists think are very important and prioritize,” Bourne said. 

“However, previous empirical work in both similarity and categorization find that listeners use, predominantly, what’s been called surface-level features.” 

Surface-level features include dynamics (how loudly or softly a piece is played), texture (how the musical lines of a piece are layered and embellished), articulation (how individual notes are sounded) and timbre (which musical instruments are used), among other things.

Bourne illustrated this with two tracks. Both were renditions of Richard Wagner’s “Ride of the Valkyries”; however, one was the original and the other carried the stylings of a tango.

In cognitive science, distinctions exist in the ways that people categorize what they encounter in daily life. Bourne and her collaborators have attempted to map characteristics in music to two types of categories: entity and relational.


Entity categories are categories in which members share “intrinsic features and feature correlations,” according to Bourne. To illustrate this, she described the unifying features of cows: udders; two horns; white, brown or black fur; and a tail. People decide what counts as a cow based on these shared traits.

In contrast, relational categories are categories whose members are grouped by the relationships they take part in, rather than by properties they literally have in common. Relational categories find similarity through analogy.

“We have classic examples of barriers, a wall, a gate, a barrier reef … [But] cookies can be a barrier, if you’re on a diet. Exhaustion can be a barrier, if you’re trying to study,” Bourne said. 

“It [seems] from experiments that have been run that people tend to be drawn to categorizing based on entity categorizing, and it’s a lot more difficult to persuade people to perceive and categorize things according to relational categories.” 

However, relational category experiments have shown that people are more likely to group visual stimuli relationally if they have been exposed to multiple “exemplars” of a category alongside one another. This spurred a question for the researchers.

“What my lab and I asked was: ‘How would listeners categorize music if we used tasks from relational category experiments?’” 

Bourne and her collaborators mapped elements in music to either entity or relational categories. Register, dynamics, articulation, texture and timbre were classified as entity category features, while pitch, rhythm, meter, harmony and contour were categorized as relational category features. 

In the control condition, there were only three excerpts. Participants listened to a target excerpt and then had to choose which of two other excerpts, one a relational match and the other an entity match, was most similar to the target.

In the experimental condition, however, a fourth excerpt was added: a relational match meant to establish a basis of comparison. Participants listened to the target and this additional excerpt and rated how similar the two were. They were then asked the same question as those in the control condition.

The experiment was run three times, each time focusing on a different musical pattern: a popular music harmonic schema, an 18th-century thematic schema and an 18th-century contrapuntal schema.

The results showed that participants chose relational matches significantly more often after making comparisons, and that comparison influenced which parameters participants considered when grouping excerpts, regardless of their musical training.

Following this, Bourne and her collaborators carried out another experiment. 

“What our experiment two was trying to tackle … [was] whether it is comparing the exemplars or just hearing multiple exemplars that sways participants to categorize using relational metrics.” 

However, the results of this second experiment were mixed; Bourne and her collaborators’ hypothesis was not borne out.

“It’s possible that this is a flawed adaptation of [relational category experiments] for auditory stimuli,” Bourne said. “There is a huge difference in visual stimuli versus auditory stimuli.” 

Looking forward, the researchers are considering how to address potential issues with their experiment, possibly by changing the tempo of excerpts or creating more controlled stimuli that isolate certain musical parameters.

In the meantime, take a closer listen to “Canon” and “Changes.”


Sean Crommelin
Sean Crommelin is the Science and Tech Editor for the Daily Nexus. He can be reached at science@dailynexus.com