Description: In this interactable, I wanted to experiment with motion, specifically how one might make a convincing chicken using Unity's 2D physics. For me, it was fun to think about the action of pulling worms out of the ground and to play with the tools, rather than do anything too complicated. The result is a funny little experience in which you play as a giant-headed chicken.
My goal was to make a game, but this doesn't have an explicit objective, UI, or instructions; I would instead call it a prototype. The controls are the left and right arrow keys (or A and D) to move the chicken around, the mouse to reposition its head, and the left mouse button to bite down. I initially had plans for a larger story about a chicken whose eating gets too big for its surroundings, and the systems are in place to tell a story like that in the future.
I wanted to create a project that used some kind of gaze tracker after seeing a simple webcam-based experiment published on the Chrome Experiments page. I also loved the idea of a project that purposely avoids being seen!
My initial idea was to create a simple 2D platformer in which players' ability to look at their own character would be hindered by a massive physics repulsion from their own gaze, making jumping, running, etc. much more difficult. I created simple flying rectangles, planned as enemies/obstacles, that were also disrupted by gaze.
However, after some initial experimentation, I became fascinated by the behavior and visual nature of the simple system. I found that the small beads took on a life of their own and seemed to squirm away whenever they were put under the microscope. Rather than force them into the platformer, I gave them a small voice to portray their anger at being observed.
I was interested in using a fluid simulation (I used this one), but I couldn't think of ways to satisfactorily visualize and interact with it that hadn't been extensively explored before.
I decided to use boids to drive the interaction with the liquid, and to have the liquid then apply forces back to the boids, creating interesting emergent interaction. This made the system hard to balance, as the coupling is a positive feedback loop.
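The coupling itself is simple to express. Here is a minimal JavaScript sketch, where `fluid.velocityAt()` and `fluid.addVelocity()` are hypothetical stand-ins for whatever a fluid library actually exposes, and a damping factor keeps the feedback loop from blowing up:

```javascript
// Sketch of two-way boid/fluid coupling (illustrative; the fluid methods
// are hypothetical stand-ins, not a real library's API).
const DAMPING = 0.95;   // bleeds energy out of the positive feedback loop
const FLUID_PUSH = 0.3; // how strongly the fluid pushes boids
const BOID_STIR = 0.1;  // how strongly boids stir the fluid

function step(boids, fluid) {
  for (const b of boids) {
    // 1. Fluid -> boid: sample the local flow and nudge the boid.
    const flow = fluid.velocityAt(b.x, b.y);
    b.vx = (b.vx + FLUID_PUSH * flow.x) * DAMPING;
    b.vy = (b.vy + FLUID_PUSH * flow.y) * DAMPING;

    // 2. Boid -> fluid: inject the boid's motion back into the grid.
    fluid.addVelocity(b.x, b.y, BOID_STIR * b.vx, BOID_STIR * b.vy);

    b.x += b.vx;
    b.y += b.vy;
  }
  fluid.step();
}
```

Without the damping term, each pass around the loop adds energy and the system runs away, which is why balancing it was difficult.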
I experimented with a number of ways to visualise the boids and the liquid: the boids were drawn on top at first, then moved behind the liquid unless they were under a specific size (to give the impression of them "jumping out").
However, neither of these gave the liquid “depth” and “murkiness,” which were my emergent goals for this piece as it went on. Eventually, I realised I could just make the liquid black, and so I did.
I am satisfied by the result of exploring a fluid simulation system in a new (to me) way, with "live" creatures (well, boids). I am especially happy with the murky, inky effect that sometimes happens. Looking back, I wish I had added more damping to the boids. In addition, it would be interesting to explore how they interact with a GPU fluid system, which would allow simulating the fluid at a much finer resolution (here it is just 64×64).
You can see the sketch (and the source code) at https://editor.p5js.org/aman/full/HysZkJ6fV.
This gargle-activated sketch can be found online here.
I cannot say that this warmup had any particularly important content driving it; I was mainly interested in getting used to Box2D and merging it with other interactive qualities. I tried a couple of approaches first, using face tracking (as I wanted the water to be generated through a gargling gesture), mouseIsPressed, and mouseX/mouseY, before finally settling on voice activation. The size and quantity of the water droplets spurting out of the faucet are driven by the audio level crossing a threshold, generated by a gargling spectator (although there is no sound in the documentation, one could imagine me frantically gargling alone in my bedroom at 3am).
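The level-to-droplets mapping can be sketched roughly like this (a minimal p5.js illustration using the p5.sound microphone input, not the project's actual code; the real droplets are Box2D bodies):

```javascript
// Louder gargling -> more and bigger droplets. Requires the p5.sound library.
let mic;
const droplets = [];
const THRESHOLD = 0.05; // tune to the room's noise floor

function setup() {
  createCanvas(400, 400);
  mic = new p5.AudioIn();
  mic.start();
}

function draw() {
  background(230);
  const level = mic.getLevel(); // amplitude, 0..1
  if (level > THRESHOLD) {
    // Both droplet count and radius scale with loudness.
    const count = floor(map(level, THRESHOLD, 0.5, 1, 6, true));
    const r = map(level, THRESHOLD, 0.5, 3, 12, true);
    for (let i = 0; i < count; i++) {
      droplets.push({ x: width / 2, y: 40, vy: 0, r: r });
    }
  }
  for (const d of droplets) {
    d.vy += 0.2; // simple gravity stand-in for the Box2D simulation
    d.y += d.vy;
    circle(d.x, d.y, d.r * 2);
  }
}
```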
SinkVid
The creation of this assignment was heavily supported by Dan Shiffman's Box2D documentation.
Inspired by the Pendulum Waves effect, I made an iPhone app with Unity that simulates it, using the phone's accelerometer.
My goals were to recreate the visual effect of diverging/converging patterns of pendulums swaying at different frequencies, and to introduce interaction through the phone's accelerometer. The final product does not recreate the diverging/converging patterns, yet it presents a mesmerizing wave pattern with the threads of the pendulums and produces sound that corresponds to the pattern.
My first iteration:
My second iteration:
Although not originally planned, I connected the pendulums and the anchor with gradient threads, which created a very mesmerizing wave-like effect. For a more interesting interactive experience, I added sound, such that each pendulum makes a click sound when it returns to the center axis.
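The underlying math is the classic pendulum-wave frequency ladder; below is a JavaScript illustration (the project itself is in Unity/C#), including the zero-crossing test that could trigger each click:

```javascript
// Pendulum i completes (N + i) oscillations per cycle time GAMMA, so the
// row drifts out of phase and realigns every GAMMA seconds.
const GAMMA = 60;  // seconds for the full pattern to repeat
const N = 20;      // oscillations of the slowest pendulum per cycle
const COUNT = 15;  // number of pendulums
const AMP = 0.5;   // swing amplitude in radians

function angles(t) {
  const out = [];
  for (let i = 0; i < COUNT; i++) {
    const freq = (N + i) / GAMMA; // Hz
    out.push(AMP * Math.sin(2 * Math.PI * freq * t));
  }
  return out;
}

// Click when a pendulum crosses the center axis: detect a sign change
// between the previous frame's angle and the current one.
let prev = angles(0);
function update(t, playClick) {
  const cur = angles(t);
  for (let i = 0; i < COUNT; i++) {
    if (Math.sign(cur[i]) !== Math.sign(prev[i])) playClick(i);
  }
  prev = cur;
}
```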
The final version:
Jacqui Fashimpaur and Alexander Woskob helped me with Unity.
These alphabet blocks and balloons reveal when they’ve been arranged into a word. The user types on their keyboard to create blocks and balloons for each letter.
With this project, I wanted to develop a playful application that would be appealing to children and that had the potential to be broadened to encompass several languages at once. This 2D physics game, where blocks can be brought into the world with the keyboard and floated away on balloons, does that, to some extent. The list of English words that it pulls from could be broadened to include other languages that use Roman alphabet letters, allowing the discovery not only of words you didn't know were there, but of words you didn't know at all. Unfortunately, one of the major obstacles to adding these other dictionaries right now is optimization: my code searches the English dictionary very frequently, which slows down the app considerably.
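One way to relieve that bottleneck (a suggestion, not how the app currently works) is to load each word list into a Set once, so membership checks become O(1) instead of scanning the dictionary; multiple languages then cost little extra:

```javascript
// One Set per language; lookups no longer scan the whole word list.
const dictionaries = new Map(); // language -> Set of lowercase words

function loadDictionary(language, words) {
  dictionaries.set(language, new Set(words.map(w => w.toLowerCase())));
}

function isWord(candidate) {
  const lower = candidate.toLowerCase();
  for (const words of dictionaries.values()) {
    if (words.has(lower)) return true;
  }
  return false;
}

// Usage: check each run of adjacent blocks once per change, not per frame.
loadDictionary("en", ["cat", "dog", "balloon"]);
isWord("Cat"); // true
```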
Matter.js + Webcam Hand Tracking: Real Hand Puppeteering
This project allows users to puppeteer a rag doll in their browsers with their real hands, by gesturing in front of a webcam. (https://puppeteers.glitch.me)
^ an animated GIF showing a play session
I came up with the idea because it seemed to me that human body tracking and physics engines combine well. Maybe it's because we have always wanted to touch and feel objects.
I never learned to puppeteer in real life. I thought rag doll physics might simulate it well, so this was also a chance to try out puppeteering virtually.
Hand tracking has been around for a long time, but I wanted to make a new tracker in JS for the browser, so everyone can play without acquiring extra hardware or software.
I used matter.js for the physics engine, mainly because I wanted to try something new. In retrospect, maybe box2d.js would have worked better; however, matter.js seems to have an easier API compared to box2d.js's automatically ported C-style code.
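To illustrate that terseness, a complete falling-box world in matter.js is only a few lines (a minimal generic example, not code from this project):

```javascript
// A dynamic box falling onto static ground, with the built-in runner.
const { Engine, Bodies, Composite, Runner } = Matter;

const engine = Engine.create();
const box = Bodies.rectangle(400, 200, 80, 80);
const ground = Bodies.rectangle(400, 610, 810, 60, { isStatic: true });

Composite.add(engine.world, [box, ground]);
Runner.run(Runner.create(), engine); // steps the simulation automatically
```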
^ Screenshot showing the debug screen and camera view
^ Screenshot showing detection and tracking of multiple hands and fingers
OpenCV.js is used to write all the computer vision algorithms. I used HSV ranges, convex hulls, etc. to find fingers in the webcam image (see the sketch after the assumptions list below).
Assumptions:
The face and hands of the same person are at least somewhat similar in color
The background is not exactly the same color as skin
The person is relatively near the camera, and their face is visible
The person is not totally naked
The person is not wearing gloves and does not have anything covering their face
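Under these assumptions, the detection pipeline looks roughly like the following (my reconstruction in OpenCV.js; the actual HSV ranges and steps may differ):

```javascript
// Segment skin-colored pixels by HSV range, take the largest contour as a
// hand, and wrap it in a convex hull (fingertips sit on the hull).
function findHandHull(srcRgba) {
  const rgb = new cv.Mat();
  cv.cvtColor(srcRgba, rgb, cv.COLOR_RGBA2RGB);
  const hsv = new cv.Mat();
  cv.cvtColor(rgb, hsv, cv.COLOR_RGB2HSV);

  // Assumed skin-tone range; a real system would calibrate from the face.
  const low = new cv.Mat(hsv.rows, hsv.cols, hsv.type(), [0, 30, 60, 0]);
  const high = new cv.Mat(hsv.rows, hsv.cols, hsv.type(), [25, 180, 255, 255]);
  const mask = new cv.Mat();
  cv.inRange(hsv, low, high, mask);

  const contours = new cv.MatVector();
  const hierarchy = new cv.Mat();
  cv.findContours(mask, contours, hierarchy, cv.RETR_EXTERNAL, cv.CHAIN_APPROX_SIMPLE);

  let biggest = null, biggestArea = 0;
  for (let i = 0; i < contours.size(); i++) {
    const area = cv.contourArea(contours.get(i));
    if (area > biggestArea) { biggestArea = area; biggest = contours.get(i); }
  }
  const hull = new cv.Mat();
  if (biggest) cv.convexHull(biggest, hull);

  // OpenCV.js Mats are WASM-backed and must be freed manually.
  rgb.delete(); hsv.delete(); low.delete(); high.delete(); mask.delete();
  contours.delete(); hierarchy.delete();
  return hull; // caller must .delete() when done
}
```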
I’ve worked with OpenCV in C++ and python, but haven’t used the JavaScript port. So this time I gave it a try. I think the speed is actually OK, and API is almost the same as in C++.
^ Testing the algorithm against different backgrounds and lighting situations to make sure it is robust
Result
I think the result is quite fun, but I'm most bothered by matter.js's rag doll simulation (which I based on their official demo). Sometimes the rag dolls fly away for no apparent reason. One possibility is that there are parameters I'm missing that need tweaking.
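The first knobs worth checking would be the constraint stiffness and damping (an assumption on my part, not a confirmed fix); softening the joints often tames bodies that gain energy and fly off:

```javascript
// A single springy "shoulder" joint between two bodies.
const { Engine, Bodies, Constraint, Composite } = Matter;
const engine = Engine.create();

const torso = Bodies.rectangle(200, 200, 40, 80);
const arm = Bodies.rectangle(250, 180, 60, 15);

const shoulder = Constraint.create({
  bodyA: torso,
  pointA: { x: 20, y: -30 }, // attachment point relative to torso center
  bodyB: arm,
  pointB: { x: -25, y: 0 },  // attachment point relative to arm center
  stiffness: 0.4,            // < 1 makes the joint springy instead of rigid
  damping: 0.1               // bleeds off oscillation energy
});

Composite.add(engine.world, [torso, arm, shoulder]);
```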
Another problem is that hand tracking lowers the FPS a lot. With only physics or only hand tracking running, it's pretty smooth, but with both, things start to get slow.
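One common mitigation (a suggestion, not something this project implements) is to run the tracker only every few frames and reuse the last result in between, since hands move little frame-to-frame while the physics needs every step:

```javascript
// detectHands(), videoFrame(), and applyHandForces() are hypothetical
// stand-ins for this project's functions.
const engine = Matter.Engine.create();
const TRACK_EVERY = 3;
let frame = 0;
let lastHands = [];

function tick() {
  if (frame % TRACK_EVERY === 0) {
    lastHands = detectHands(videoFrame()); // expensive OpenCV.js work
  }
  applyHandForces(engine, lastHands);      // cheap: reuse cached positions
  Matter.Engine.update(engine, 1000 / 60); // physics still runs every frame
  frame++;
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);
```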
In terms of puppeteering, I was only able to make the puppet jerk its arm or move a leg. There are no complex movements such as walking, punching, etc. But I think it's still interesting to experiment with.
In the future I can also add other types of interactions to the system, for example shooting stuff from fingers, grabbing/pushing objects, etc.
For my 2D physics interaction, I made a squishy frog head that drops through platforms controlled by the mouse. A collision with each platform causes the frog to play a different note. It was created with Processing, Daniel Shiffman’s Box2D-for-Processing, and the SoundCipher library.
I struggled with coming up with an idea for this assignment because I wanted to make good use of the physics library, but wasn’t sure how to make something interesting and fun to interact with. At first the frog was just a rigidbody, but I took inspiration from this video to make a circle out of distance joints instead. I think the squishiness really improved the assignment, and I’m glad that it allowed me to play with the physics a bit more.
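The distance-joint circle can be sketched generically (a JavaScript illustration; the project itself uses Box2D-for-Processing): each rim particle is jointed to its neighbors and to a center body, so the shape squashes on impact and springs back.

```javascript
// Build a soft circle: a ring of particles with neighbor joints (the skin)
// and spoke joints to the center (the volume). In Box2D these would be
// distance joints between small circle bodies.
function makeSoftCircle(cx, cy, radius, segments = 16) {
  const center = { x: cx, y: cy };
  const ring = [];
  for (let i = 0; i < segments; i++) {
    const a = (i / segments) * 2 * Math.PI;
    ring.push({ x: cx + radius * Math.cos(a), y: cy + radius * Math.sin(a) });
  }
  const joints = [];
  for (let i = 0; i < segments; i++) {
    const next = (i + 1) % segments;
    joints.push({ a: ring[i], b: ring[next], rest: dist(ring[i], ring[next]) });
    joints.push({ a: ring[i], b: center, rest: radius });
  }
  return { center, ring, joints };
}

function dist(p, q) {
  return Math.hypot(p.x - q.x, p.y - q.y);
}
```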
I made a bread-eating race game with face tracking. I think I've mostly accomplished what I set out to do: the game works as desired, and the face tracking works better than I'd hoped. Some details could still be added, like having the baguettes swing while the dogs hang and gnaw on them, or having the baguettes rotate instead of just translating with the motion of the rope, but with the time constraints this is it. Ideally I would upload this to OpenProcessing or some other place so that people could play it, but for OpenProcessing at least, I'm not sure how to combine all my files or how to deal with my image files.
This project is a slow tag game that uses proximate greyscale colors.
Instructions: you move the big cube slowly around the environment, trying to locate and bump into the other big cube to shift the colors.
It aims to induce illusions and test the limits of patience. The physics library and the tag game work together to create a borderline organic impression. The minimal color difference, the slow frame rate, and the limit on how fast you can move your avatar encourage a slow pace of play and observation. The shifting color successfully makes the experience more disorienting and straining. The project is made with Dan Shiffman's Box2D for Processing library.