This documentation collects all the resources used to conduct the RoboProfs VR experiment (the Unity project can be found here), the data collected during a small proof of concept, and an exemplary analysis of that data.
In the following, the structure of this Git repository is briefly explained.
The resources directory contains all the data files from the prior research and groundwork done in preparation for the VR experiment.
The subdirectory resources/social_cues contains the groundwork for the emotional aspects of the robot, which is further used in resources/stories as well as in the final Unity VR project. It holds the materials and the CSV file for the animation of the robot's face, the reference video (recorded with a professional actor) used to create the robot's body animations, and the SSML files used to generate the robot's voice (the corresponding audio files can be found in resources/stories/audio).
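For readers unfamiliar with SSML, a file of this kind might look roughly as follows. This is a hypothetical illustration of the format, not a fragment taken from the repository's files; the text and markup choices are assumptions.

```xml
<!-- Hypothetical SSML sketch: text with pauses and prosody hints
     that a text-to-speech engine turns into the robot's audio. -->
<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis"
       xml:lang="en-US">
  <p>
    Welcome to today's lesson.
    <break time="500ms"/>
    <prosody rate="medium" pitch="+2st">Let's get started!</prosody>
  </p>
</speak>
```

Elements such as break and prosody let the authors control pacing and intonation, which is how the positive and negative emotional conditions can be expressed in the robot's voice.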
The subdirectory resources/stories contains the texts the robot speaks in the different scenarios and lessons, the audio files for the robot's voice, the presentation videos displayed on the robot's main body, and the questionnaire that subjects fill out after finishing the VR experiment.
The blender directory contains the Blender file and textures of the robot model used in the VR experiment.
The analysis directory contains the sample data collected during the proof of concept together with the exemplary analysis of this data.
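To give an idea of what such an exemplary analysis could look like, the sketch below computes the mean rating per experimental condition. It is a hypothetical, minimal illustration: the data layout, the condition names, and the numbers are assumptions, not the repository's actual data or scripts.

```python
# Hypothetical sketch of a per-condition comparison, as one might do
# on the proof-of-concept questionnaire data. All values are made up.
from collections import defaultdict
from statistics import mean

def condition_means(rows):
    """Return the mean rating per condition from (condition, rating) pairs."""
    by_condition = defaultdict(list)
    for condition, rating in rows:
        by_condition[condition].append(rating)
    return {condition: mean(vals) for condition, vals in by_condition.items()}

# Illustrative stand-in for data loaded from a CSV file.
rows = [("positive", 4), ("positive", 5), ("negative", 2), ("negative", 3)]
print(condition_means(rows))  # {'positive': 4.5, 'negative': 2.5}
```

In practice the rows would be read from the collected CSV data rather than hard-coded, and the comparison would likely use a proper statistical test instead of raw means.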
Videos of example trials (both the positive and the negative condition) can be found in the recording directory.
documentation.pdf contains an extensive description of the process of creating and conducting this experiment as well as of analyzing the collected data.