Gesture Research

You can find my web & graphic design portfolio here: adainspired.com. Also check out my creations here: instagram.com/ada_inspired/

What I do in Gesture Research

Overview

I investigate multimodal alignment among speech, gesture, prosody, and discourse in collaboration with Dr. Stefanie Shattuck-Hufnagel of the Speech Communication Group, RLE, MIT. We are building a modular gesture labeling system that can serve multiple gesture research needs, with the potential to expand into automated labeling. Doing this requires teasing apart the kinematic dimensions of gestures from their meaning in coding schemas. One of our goals is to identify these dimensions and make sure they are mutually exclusive; a second is to add quantitative measures to the qualitative attributes of speech-accompanying gestures.
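As a rough illustration only (the actual dimensions and their values are defined in our coding manual, not here), a dimension-based gesture label can be thought of as a small record in which each kinematic dimension, each meaning-related dimension, and each quantitative measure is its own field. Here is a sketch in Processing/Java with placeholder names:

    // Hypothetical sketch of a dimension-based gesture label (Processing/Java).
    // The dimension names and example values are placeholders, not the manual's categories.
    class GestureLabel {
      String articulator;      // kinematic dimension, e.g. "hand", "head", "torso"
      String handedness;       // kinematic dimension, e.g. "left", "right", "both"
      String referentiality;   // meaning-related dimension, e.g. "referential", "non-referential"
      float strokeDurationMs;  // quantitative measure attached to the qualitative label
      float apexTimeMs;        // quantitative measure: time of the gesture apex
    }

Because each label takes exactly one value per dimension, the dimensions stay mutually exclusive and kinematic form stays separate from meaning.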

Exploratory Gesture Visualizations

I design and code exploratory visualizations using HTML, CSS, Perl, and Processing (see processing.org), combined with hand illustration, graphic design, and video manipulation. Check them out on the Exploratory Visualizations page.
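To give a flavor of these, here is a minimal Processing sketch using invented timing values (not data from one of the actual visualizations): it plots hypothetical gesture apex times above pitch-accent times on a shared timeline so their alignment can be eyeballed.

    // Minimal Processing sketch with made-up example data:
    // gesture apexes (upper row) and pitch accents (lower row) on one timeline.
    float[] gestureApexes = {0.42, 1.10, 1.85, 2.60, 3.30};  // seconds (invented values)
    float[] pitchAccents  = {0.40, 1.15, 1.90, 2.55, 3.35};  // seconds (invented values)
    float totalDuration = 4.0;                               // length of the utterance shown

    void setup() {
      size(800, 200);
    }

    void draw() {
      background(255);
      stroke(0);
      line(40, height/2, width - 40, height/2);              // the shared timeline

      for (float t : gestureApexes) {
        float x = map(t, 0, totalDuration, 40, width - 40);
        fill(200, 60, 60);
        ellipse(x, height/2 - 30, 10, 10);                   // gesture apex marker (above)
      }
      for (float t : pitchAccents) {
        float x = map(t, 0, totalDuration, 40, width - 40);
        fill(60, 60, 200);
        ellipse(x, height/2 + 30, 10, 10);                   // pitch accent marker (below)
      }
    }

This is only a toy example; the visualizations on that page are richer and also draw on hand illustration and video.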

Coding Manual

If you are interested in coding gestures by mutually exclusive dimensions, here’s our coding manual.


Publications

  • Shattuck-Hufnagel, S., Ren, A., Mathew, M., Yuen, I., Demuth, K. (2016) Non-referential gestures in adult and child speech: Are they prosodic? Proc. Speech Prosody 2016, 836-839. (link)
  • Shattuck-Hufnagel, S., Ren, A. (2012) Preliminaries to a Kinematics of Gestural Accents. Presented at International Society for Gesture Studies 6, San Diego.
  • Shattuck-Hufnagel, S., Ren, A., Tauscher, E. (2010) Are torso movements during speech timed with intonational phrases? Presented at Speech Prosody 2010.