Gesture Research

[fusion_builder_container hundred_percent="no" equal_height_columns="no" menu_anchor="" hide_on_mobile="small-visibility,medium-visibility,large-visibility" class="" id="" background_color="" background_image="" background_position="center center" background_repeat="no-repeat" fade="no" background_parallax="none" parallax_speed="0.3" video_mp4="" video_webm="" video_ogv="" video_url="" video_aspect_ratio="16:9" video_loop="yes" video_mute="yes" overlay_color="" video_preview_image="" border_size="" border_color="" border_style="solid" padding_top="" padding_bottom="" padding_left="" padding_right=""][fusion_builder_row][fusion_builder_column type="1_1" layout="1_1" background_position="left top" background_color="" border_size="" border_color="" border_style="solid" border_position="all" spacing="yes" background_image="" background_repeat="no-repeat" padding_top="" padding_right="" padding_bottom="" padding_left="" margin_top="0px" margin_bottom="0px" class="" id="" animation_type="" animation_speed="0.3" animation_direction="left" hide_on_mobile="small-visibility,medium-visibility,large-visibility" center_content="no" last="no" min_height="" hover_type="none" link=""][fusion_tagline_box backgroundcolor="" shadow="no" shadowopacity="0.7" border="0" bordercolor="" highlightposition="left" content_alignment="left" link="" button="" linktarget="_self" modal="" button_size="" button_type="" button_shape="" buttoncolor="default" title="" description="" margin_top="" margin_bottom="" hide_on_mobile="small-visibility,medium-visibility,large-visibility" class="" id="" animation_type="" animation_direction="left" animation_speed="0.3" animation_offset=""]

Investigating multimodal alignment among gesture, prosody, and discourse, in collaboration with Dr. Stefanie Shattuck-Hufnagel, Speech Communication Group, RLE, MIT.

Building a modular gesture labeling system to fit multiple gesture research needs, with the potential to expand into automated labeling. This requires teasing apart the kinematic dimensions of gestures from their meaning in coding schemas. One of our goals is to identify these dimensions and ensure they are mutually exclusive. A second goal is to add quantitative measures to the qualitative attributes of speech-accompanying gestures. Overall, this work addresses the broad span of multimodal alignment across speech, prosody, and discourse.
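One way to picture such a schema (a minimal sketch; the dimension names and value sets here are illustrative, not the project's actual coding inventory): each kinematic dimension becomes its own field with a closed set of mutually exclusive values, quantitative measures such as onset and offset times are stored as numbers, and semantic annotation is kept in a separate field so kinematics and meaning never mix.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical kinematic dimensions -- the real coding manual
# defines its own inventory of dimensions and values.
class HandShape(Enum):
    OPEN_PALM = "open_palm"
    FIST = "fist"
    POINT = "point"

class Trajectory(Enum):
    STRAIGHT = "straight"
    ARC = "arc"
    CIRCLE = "circle"

@dataclass
class GestureLabel:
    start_ms: int            # quantitative measure: gesture onset in the recording
    end_ms: int              # quantitative measure: gesture offset
    hand_shape: HandShape    # kinematic dimension: exactly one value per label
    trajectory: Trajectory   # kinematic dimension, independent of hand_shape
    meaning: str = ""        # semantic annotation, kept separate from kinematics

label = GestureLabel(start_ms=1200, end_ms=1850,
                     hand_shape=HandShape.POINT,
                     trajectory=Trajectory.ARC,
                     meaning="deictic reference to listener")
```

Because each dimension is an enum, a label cannot carry two conflicting values on the same dimension, which is one way to enforce the mutual-exclusivity goal mechanically.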

[/fusion_tagline_box][fusion_content_boxes layout="clean-horizontal" columns="2" title_size="" title_color="" body_color="" backgroundcolor="#f6f6f6" icon="" iconflip="" iconrotate="" iconspin="no" iconcolor="" icon_circle="" icon_circle_radius="" circlecolor="" circlebordersize="" circlebordercolor="" outercirclebordersize="" outercirclebordercolor="" icon_size="" icon_hover_type="" hover_accent_color="" image="" image_width="" image_height="" link_type="button" link_area="" link_target="" icon_align="left" animation_type="" animation_delay="" animation_offset="" animation_direction="left" animation_speed="0.3" margin_top="" margin_bottom="" hide_on_mobile="small-visibility,medium-visibility,large-visibility" class="" id=""][fusion_content_box title="Exploratory Gesture Visualizations" backgroundcolor="" icon="" iconflip="" iconrotate="" iconspin="no" iconcolor="" circlecolor="" circlebordersize="" circlebordercolor="" outercirclebordersize="" outercirclebordercolor="" image="" image_width="35" image_height="35" link="http://adainspired.mit.edu/gesture-research/exploratory-gesture-visualizations/" linktext="Be wowed!" link_target="" animation_type="" animation_direction="left" animation_speed="0.3" animation_offset=""]

Gestures like you’ve never seen them before!

[/fusion_content_box][fusion_content_box title="Coding Manual" backgroundcolor="" icon="" iconflip="" iconrotate="" iconspin="no" iconcolor="" circlecolor="" circlebordersize="" circlebordercolor="" outercirclebordersize="" outercirclebordercolor="" image="" image_width="35" image_height="35" link="http://adainspired.mit.edu/gesture-research/gesture-coding-manual/" linktext="Full version" link_target="" animation_type="" animation_direction="left" animation_speed="0.3" animation_offset=""]

Get started coding gestures. We also have a short version: Dive in here.

[/fusion_content_box][/fusion_content_boxes][fusion_title margin_top="" margin_bottom="" hide_on_mobile="small-visibility,medium-visibility,large-visibility" class="" id="" size="2" content_align="left" style_type="default" sep_color=""]

Publications

[/fusion_title][fusion_text]Shattuck-Hufnagel, S., Ren, A., Mathew, M., Yuen, I., Demuth, K. (2016) Non-referential gestures in adult and child speech: Are they prosodic? Proc. Speech Prosody 2016, 836-839. (link)

Shattuck-Hufnagel, S., Ren, A. (2012) Preliminaries to a Kinematics of Gestural Accents. Presented at International Society for Gesture Studies 6, San Diego.

Shattuck-Hufnagel, S., Ren, A., Tauscher, E. (2010) Are torso movements during speech timed with intonational phrases? Presented at Speech Prosody 2010.[/fusion_text][/fusion_builder_column][/fusion_builder_row][/fusion_builder_container][fusion_builder_container hundred_percent="no" hundred_percent_height="no" hundred_percent_height_scroll="no" hundred_percent_height_center_content="yes" equal_height_columns="no" menu_anchor="" hide_on_mobile="small-visibility,medium-visibility,large-visibility" class="" id="" background_color="" background_image="" background_position="center center" background_repeat="no-repeat" fade="no" background_parallax="none" enable_mobile="no" parallax_speed="0.3" video_mp4="" video_webm="" video_ogv="" video_url="" video_aspect_ratio="16:9" video_loop="yes" video_mute="yes" video_preview_image="" border_size="" border_color="" border_style="solid" margin_top="" margin_bottom="" padding_top="" padding_right="" padding_bottom="" padding_left=""][fusion_builder_row][fusion_builder_column type="1_1" layout="1_1" spacing="" center_content="no" link="" target="_self" min_height="" hide_on_mobile="small-visibility,medium-visibility,large-visibility" class="" id="" background_color="" background_image="" background_position="left top" background_repeat="no-repeat" hover_type="none" border_size="0" border_color="" border_style="solid" border_position="all" padding_top="" padding_right="" padding_bottom="" padding_left="" dimension_margin="" animation_type="" animation_direction="left" animation_speed="0.3" animation_offset="" last="no"][fusion_title margin_top="" margin_bottom="" hide_on_mobile="small-visibility,medium-visibility,large-visibility" class="" id="" size="3" content_align="left" style_type="default" sep_color=""]

Tools for Researchers

[/fusion_title][fusion_text]

I’m putting together a set of handy tools for researchers that I use on a daily basis. Right now there’s a simple t-test/chi-square test, but soon there will be more: data visualizations and research poster templates!
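As a rough sketch of the kind of computation behind a t-test tool (the sample data here are invented, and the site's tool may use a different variant of the test), Welch's two-sample t statistic compares the means of two groups without assuming equal variances:

```python
import math

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic (unequal variances assumed)."""
    na, nb = len(sample_a), len(sample_b)
    mean_a = sum(sample_a) / na
    mean_b = sum(sample_b) / nb
    # Sample variances with Bessel's correction (n - 1 denominator).
    var_a = sum((x - mean_a) ** 2 for x in sample_a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in sample_b) / (nb - 1)
    # Difference of means divided by its standard error.
    return (mean_a - mean_b) / math.sqrt(var_a / na + var_b / nb)

# Hypothetical example: gesture durations (ms) under two prosodic conditions.
t = welch_t([310, 295, 330, 305], [270, 260, 285, 275])
```

The p-value then comes from the t distribution with Welch-adjusted degrees of freedom; in practice a library routine such as `scipy.stats.ttest_ind(a, b, equal_var=False)` handles both steps at once.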

[/fusion_text][fusion_content_boxes layout="clean-horizontal" columns="3" title_size="" title_color="" body_color="" backgroundcolor="" icon="" iconflip="" iconrotate="" iconspin="no" iconcolor="" icon_circle="" icon_circle_radius="" circlecolor="" circlebordersize="" circlebordercolor="" outercirclebordersize="" outercirclebordercolor="" icon_size="" icon_hover_type="" hover_accent_color="" image="" image_width="" image_height="" link_type="" link_area="" link_target="" icon_align="left" animation_type="" animation_delay="" animation_offset="" animation_direction="left" animation_speed="0.3" margin_top="" margin_bottom="" hide_on_mobile="small-visibility,medium-visibility,large-visibility" class="" id=""][fusion_content_box title="Statistical Tests" backgroundcolor="" icon="" iconflip="" iconrotate="" iconspin="no" iconcolor="" circlecolor="" circlebordersize="" circlebordercolor="" outercirclebordersize="" outercirclebordercolor="" image="" image_width="35" image_height="35" link="http://adainspired.mit.edu/gesture-research/statistic-tests/" linktext="Verify your data" link_target="" animation_type="" animation_direction="left" animation_speed="0.3" animation_offset=""]

Get your p-values here!

[/fusion_content_box][/fusion_content_boxes][/fusion_builder_column][/fusion_builder_row][/fusion_builder_container]