RESEARCH INTERESTS
Information structure in language comprehension and production
Not all information is created equal: in a sentence, some words are more important or meaningful than others. How do people focus their attention on these meaningful words?
Speakers can mark important words prosodically - for example, through a strong accent (YOU forgot the keys, not ME!). Or, they can use particular syntax (It was you who forgot the keys, not me!). How do people rely on these different types of cues?
I studied this and related questions in my doctoral dissertation by tracking people's eye movements as they read sentences (see my HSP poster), and I recently published a successful replication of a study by Cutler and Fodor (1979) on this topic.
I also investigated how speakers use prosody to focus important words when they talk to voice-activated, artificially intelligent devices like Amazon's Alexa (see the media advisory about this ongoing project).
Temporal prediction in language
The brain is always predicting what will happen next. But it is also always predicting when things will happen, a concept called temporal prediction.
Spoken language unfolds over time, so it's important for us to predict not only which words we're likely to hear, but also when we are likely to hear them. How does our brain do that?
I started thinking about this in my 2018 review on the temporal prediction of prosodic stress in speech, where I learned about cortical tracking, whereby neural oscillatory signals measured with EEG appear to track the speech we're hearing. In a collaborative 2021 paper, we argue that to better understand the role of cortical tracking in language comprehension (and its potential role in predicting the timing of upcoming words), there needs to be more interdisciplinary collaboration across the fields of signal and sentence processing.
I address this complex topic in my postdoctoral research, funded by the National Institutes of Health through an NRSA F32 grant titled "The role of temporal prediction in guiding attention through time during language comprehension".
Cross-cultural perspectives in music cognition
What do you feel when you hear a song from a faraway culture? You might not be familiar with the style - maybe the scale is different, maybe it uses unusual rhythms. Yet you might still be moved by it.
What allows people to enjoy music from other cultures? Does music around the world share common features in the way it communicates emotion?
I asked this question in my paper on chills - strong emotional responses felt as a shiver down the spine - in response to music across cultures. More broadly, I am interested in cross-cultural perspectives in the study of music cognition.
Learn more about these projects and find a full list of publications, conference talks, and posters on my CV page!