Most gesture research articles show gestures as video snapshots, tracings of video snapshots, or time-aligned blocks as seen in multimedia annotation software like ELAN or ANVIL. Not much has changed since the beginnings of gesture research, when researchers worked from microfilm.
Custom exploratory data visualizations let us view our data in different ways, giving insight into the data and helping us communicate the patterns we see that support our hypotheses.
I designed a growing set of visualizations to show trajectory movement, to find patterns in gesture phasing, and to investigate gestural alignment with speech production. These draw on a variety of tools and skills I have refined over the years: HTML, CSS, Perl, Processing (see processing.org), hand illustration, graphic design, and video manipulation.
About Gesture Phases
Our stroke-defined gestures have a central gestural stroke surrounded by optional gesture phases: preparation, holds, relaxation (also known as recovery), and a static relaxed phase.
Gestures and Intonational Phase Alignment (Aug 2019)
These show the syllables (accented ones in yellow), the gesture phases (strokes in yellow), and the corresponding trajectory paths drawn inside the phases. A gesture phase with a rounded top-left corner is a preparation phase, and one with a rounded top-right corner is a relaxation/recovery phase. The illustrated trajectory shape shows the path of the entire grouping in white, in the direction of the arrow, and the trajectory of the individual phase in purple. This is coded in Processing, with the trajectory paths added in Adobe Illustrator.
Single Gesture Sketches in Perceived Gesture Groups (April 2018)
From our Louvre speaker sample, showing similarity in trajectory path shape for gestures within a perceived group. Here are 19 sample groups, each showing the individual stroke-defined gestures within it.
This image shows the trajectory path shape of each whole group, revealing a consistency of form among the individual gesture strokes within the group and where they sit in space, whereas the image above separates them out.
Time-Aligned Gesture Phases (Nov 2015)
Here, the gesture phases are simplified to time-aligned polygons. The preparation phase is an upward-sloping triangle, the static hold and relaxed phases are long rectangles, and the relaxation or recovery phase is a downward-sloping triangle. The stroke is a large rectangle with a small circle in the center to draw the eye. The blue bars on top show preliminary grouped gestures, while the light orange background behind each set of phases shows the stroke-defined gesture. This is coded in Processing.
Discourse and Timing
This visualization shows the perceptual gesture groupings in the text. The text below takes the transcribed syllables (which is why most words appear misspelled) and checks whether each syllable falls within a grouping (colored alternately red and teal) or within a region of video where the speaker is occluded by a presentation slide (grey). Since this is an HTML file, there is some degree of interactivity: you can hover over each syllable to see where it sits in the video and which grouping it overlaps. I included a 99 millisecond margin of error, the equivalent of 3 frames in a 30fps video. The data is exported from ELAN as a tab-delimited text file, and a Perl script writes the HTML file.
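The core of that script is an interval-overlap check with the 99 ms margin, plus emitting a colored, hoverable span per syllable. Here is a minimal Python analogue of the Perl step; the tuple layout of the ELAN intervals and the inline-style HTML are assumptions, not the original code:

```python
# Minimal analogue of the Perl step: decide whether a syllable interval
# overlaps a grouping within a 99 ms margin (about 3 frames at 30 fps),
# then emit an HTML span colored by grouping with a hover title.
# Interval format (start_ms, end_ms) and the markup are assumptions.

MARGIN = 99  # ms, roughly 3 frames of 30fps video

def overlaps(syl, group, margin=MARGIN):
    """True if the syllable interval overlaps the grouping, within margin."""
    s0, s1 = syl
    g0, g1 = group
    return s0 < g1 + margin and s1 > g0 - margin

def syllable_span(text, syl, groups):
    """One colored span; alternating red/teal by grouping index, grey if none."""
    hits = [i for i, g in enumerate(groups) if overlaps(syl, g)]
    color = ["red", "teal"][hits[0] % 2] if hits else "grey"
    title = f"{syl[0]}-{syl[1]} ms; groups: {hits if hits else 'none'}"
    return f'<span style="color:{color}" title="{title}">{text}</span>'

groups = [(0, 500), (700, 1200)]
html = syllable_span("lou", (480, 620), groups)
```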
Hand Tracking Video
We explored the possibility of using the velocity of the hands to help label gesture features such as phase boundaries and the curvature of the motion, among other possibilities. Using ProAnalyst, we tracked the index finger on both hands, subtracted the body movement, and extracted the positioning data. The data is then transformed in Processing for the visualization, with a line of code added to save each 1/30-second frame as an image. Afterwards I used a time-lapse utility to assemble the frames into a video.
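The numerical part of that pipeline (subtracting body movement, then deriving per-frame speed) can be sketched as follows. This is a Python illustration under the assumption that the tracks are lists of (x, y) points per frame, not the actual ProAnalyst or Processing code:

```python
# Sketch of the velocity computation: remove gross body movement by
# subtracting a body reference point from each hand position, then take
# frame-to-frame displacement over the 1/30 s frame interval.
# The (x, y)-per-frame data layout is an assumption.

FPS = 30

def relative_positions(hand, body):
    """Hand track with the body reference point subtracted per frame."""
    return [(hx - bx, hy - by) for (hx, hy), (bx, by) in zip(hand, body)]

def speeds(points, fps=FPS):
    """Per-frame speed (units/second) from consecutive positions."""
    out = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        out.append(((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 * fps)
    return out
```

Peaks in the resulting speed series are one plausible cue for stroke boundaries; that mapping to phase labels is the part we were exploring.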