In this assignment, students in Prof. Golan Levin's fall 2018 intromediate course, Electronic Media Studio: Interactivity and Computation (60-212), were asked to write software which creatively interprets (or responds to) the actions of the body, as observed by a camera. (More details about the assignment brief are here.) In support of the assignment, the professor gave presentations about creative augmentations of the body, full-body interactive art, and face tracking in new media art.
The majority of the students in this course are sophomores in the Carnegie Mellon School of Art. Their GIFs below link to pages with more information and, in many cases, live embedded sketches. The students were provided with template code for using various face trackers and body trackers in popular open-source software toolkits for the arts, including PoseNet with ml5.js, FaceOSC with Processing, BVH parsing in Processing and three.js, and clmTracker with p5.js. We express our gratitude to the developers of these tools and tracking systems.
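To give a sense of how little scaffolding these trackers require, here is a minimal sketch in the spirit of the provided templates (not the template code itself), assuming the ml5.js 0.x PoseNet wrapper (`ml5.poseNet` and its `'pose'` event) and p5.js, with both libraries loaded via script tags. It opens the webcam and marks each detected body keypoint:

```javascript
// Minimal p5.js + ml5.js PoseNet sketch (assumes ml5.js 0.x and p5.js
// loaded via <script> tags; a sketch in the spirit of the course templates).
let video;
let poses = [];

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);   // open the webcam
  video.size(width, height);
  video.hide();                   // we draw the frame ourselves in draw()

  // PoseNet runs on the video stream; results arrive via the 'pose' event.
  const poseNet = ml5.poseNet(video, () => console.log('PoseNet ready'));
  poseNet.on('pose', (results) => { poses = results; });
}

function draw() {
  image(video, 0, 0, width, height);
  // Mark every sufficiently confident keypoint (nose, wrists, knees, etc.).
  for (const { pose } of poses) {
    for (const k of pose.keypoints) {
      if (k.score > 0.2) {
        fill(255, 0, 0);
        noStroke();
        ellipse(k.position.x, k.position.y, 10, 10);
      }
    }
  }
}
```

Projects such as Casher's nose-drawing (below) follow the same pattern, reading a single named keypoint from the pose object rather than drawing all of them.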
Dinkolas puppeteered a 3D face, using motion capture from a Kinect and Python scripts in Blender.
Lass wrote a program to generate unique 3D characters for everyone in the class. The bodies and costumes (presented with three.js) are generated from unique hashes of our names.
Weirdie's software allows you to puppeteer a butterfly with your head movements and eye blinks. This project was developed with FaceOSC and Processing (Java).
Casher's program allows you to draw with your nose. The software was developed with p5.js (JavaScript), ml5.js, and PoseNet.
Chewie presents a corn-based facial mirror. The project was developed using p5.js and clmTracker.
Nixel created a program to visualize structural relationships in multi-body dance choreography. The project uses p5.js, ml5.js, and PoseNet.
Chromsan used PoseNet in p5.js to investigate "the space between" people, visualizing the gaze interactions between film characters engaged in tense conversations.
Sapeck used three.js to create a field of "3D bubblewrap", whose bubbles are popped by the motion-captured movements of a martial artist.
Harsh created a determined flock of beard hairs. This project was developed with FaceOSC and Processing.
Breep explored different visual interpretations of motion capture data. Developed in Processing.
Spoon developed a spaghetti-like interpretation of motion-capture data, using Processing and simulated physics from the ToxicLibs library.
Airsun used PoseNet and p5.js to develop "Ovia", a shy character who responds to one's body movements.
Nerual created a system to puppeteer clouds. Opening your mouth creates a "wind" which blows your cloud one way or the other.
Shuann used PoseNet to explore the use of head movements as a controller for an interactive game.
Nannon created an interactive game in which the player has to move selected parts of their face into indicated sectors of the screen. Developed with FaceOSC and Processing.
Yuvian developed "Flappy Face", a game in which the player navigates narrow passageways by smiling to various degrees. Developed with clmTracker in p5.js.
Rigatoni created a physics-based game controlled by the jerking movements of the player's face. Developed using FaceOSC and Processing.
Ocannoli created an outer-space-themed eating game. Developed using FaceOSC and Processing.
Paukparl explored the aesthetics of video-textured metaballs in three.js. A face tracker ensures that the viewer's face is always wrapped around the blobby forms.
Yalbert made a world which is revealed by the opening of the participant's mouth. Developed using FaceOSC and Processing.
Maya created an interactive visualization of the Japanese proverb, "Out of the mouth comes evil". Maya's project uses p5.js, clmTracker, and a Voronoi GLSL shader.
Chaine developed an expressive display driven by eye movements and other facial expressions. Developed using FaceOSC and Processing.