Story Dialogue with Gestures (SDG) Corpus

This page was migrated from https://nlds.soe.ucsc.edu/sdg. You can download a sample from this repository (sample_sdg_corpus.zip). Please contact me to download the full SDG Corpus Stories & Annotations (452MB) or the IVA Paper Videos & Results (314MB).

We are still working on finalizing the corpus; please refer to this sheet for the annotation status of all the stories.

If you use this data in your research, please refer to and cite: Zhichao Hu, Michelle Dick, Chung-Ning Chang, Michael Neff, Jean E. Fox Tree, and Marilyn Walker. "A Corpus of Gesture-Annotated Dialogues for Monologue-to-Dialogue Generation from Personal Narratives," In Language Resources and Evaluation Conference (LREC), Portorož, Slovenia, 2016.

Overview and Data:

The Story Dialogue with Gestures (SDG) Corpus contains:

  • 50 personal blog stories with IDs from the Spinn3r ICWSM corpus
  • Human-generated dialogues for those stories
  • Human-annotated gestures for those dialogues (with videos of all the gesture forms)
  • Audio generated using AT&T Text-to-Speech for each dialogue (voices Mike & Crystal)
  • Gesture annotations time-synced with the generated TTS audio (see the sketch below for one way to browse the sample archive)
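
As a rough starting point, the sketch below groups the files in the sample archive by their top-level directory, so you can see which dialogue, gesture-annotation, and audio files belong together. The archive name (sample_sdg_corpus.zip) comes from the download link above; the assumption that files are organized in per-story folders is ours, so adjust the grouping key once you have inspected the real layout.

```python
import zipfile
from collections import defaultdict
from pathlib import PurePosixPath

# Path to the sample archive from this repository.
SAMPLE_ZIP = "sample_sdg_corpus.zip"


def group_files_by_story(zip_path: str) -> dict[str, list[str]]:
    """Group archive members by their top-level directory.

    Assumes one folder per story; this layout is a guess, not documented
    by the corpus authors, so verify it against the real archive.
    """
    groups = defaultdict(list)
    with zipfile.ZipFile(zip_path) as zf:
        for name in zf.namelist():
            if name.endswith("/"):
                continue  # skip directory entries
            parts = PurePosixPath(name).parts
            key = parts[0] if len(parts) > 1 else "(top level)"
            groups[key].append(name)
    return dict(groups)


if __name__ == "__main__":
    for story, files in sorted(group_files_by_story(SAMPLE_ZIP).items()):
        print(f"{story}: {len(files)} file(s)")
        for f in sorted(files):
            print(f"  {f}")
```

The actual annotation formats (e.g., how gesture timings are stored) are not described on this page, so consult the LREC paper below and the annotation status sheet before writing a parser.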

Works that use this corpus:

The SDG Corpus also contains all experimental stimulus videos and Mechanical Turk results from: Zhichao Hu, Marilyn A. Walker, Michael Neff, and Jean E. Fox Tree. "Storytelling Agents with Personality and Adaptivity," In Intelligent Virtual Agents: 15th International Conference (IVA), Delft, Netherlands, 2015.
