An experimental piece that uses a motorized mirror to explore the space of neglect through counterintuitive interactions. Imagine a mirror panel sticking out of the wall that can rotate around the x- and y-axis. The panel reacts to a person and rotates away from their face, refusing to face the user. For example, if the user moves to the right, the panel rotates to the left around the y-axis; if the user moves to the left, the panel rotates the other way. The same inverse movement occurs when the user tries to look at the panel from above or below. The motion is frantic and abrupt, making it very clear it does not want you to see your reflection.
Implementation
For this project, I decided to properly learn and use openFrameworks. The hardest part was getting everything to work together: I had trouble getting add-ons to play nicely and ran into plenty of random error messages. Eventually I worked my way around the add-on issues and ended up with something that uses ofxFaceTracker, ofxUI and Firmata to communicate with an Arduino. I could have used FaceOSC together with ofxOSC to get all the data, but I specifically wanted everything built inside openFrameworks without running any additional software.
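The glue between the add-ons ends up looking roughly like the sketch below. The serial port path is just a placeholder for whatever your Arduino shows up as, and main.cpp from the project generator is left out, but this is the general shape of it:

```cpp
// Sketch of the glue between ofxFaceTracker and ofArduino (Firmata), all inside
// one openFrameworks app. The serial port path is a placeholder for your own device.
#include "ofMain.h"
#include "ofxCv.h"
#include "ofxFaceTracker.h"

class ofApp : public ofBaseApp {
public:
    void setup() {
        grabber.setup(1280, 720);                     // external HD webcam
        tracker.setup();                              // loads the face tracking model

        // Firmata over serial, same approach as the firmataExample
        ard.connect("/dev/tty.usbserial-A100", 57600);
        ofAddListener(ard.EInitialized, this, &ofApp::onArduinoReady);
    }

    void onArduinoReady(const int& version) {
        ofRemoveListener(ard.EInitialized, this, &ofApp::onArduinoReady);
        arduinoReady = true;                          // safe to attach servos from here
    }

    void update() {
        ard.update();                                 // keep Firmata polling the serial port
        grabber.update();
        if (grabber.isFrameNew()) {
            tracker.update(ofxCv::toCv(grabber));     // run the face tracker on the new frame
        }
    }

    ofVideoGrabber grabber;
    ofxFaceTracker tracker;
    ofArduino ard;
    bool arduinoReady = false;
};
```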
I had a couple of options for a capture device: a regular webcam or the Kinect. I decided to go with a webcam, because the Kinect felt like overkill and the user would be facing the mirror from quite close. For the webcam I used an external HD camera, because the built-in webcam on my laptop was slow and the image quality was pretty bad.
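Getting openFrameworks to pick the external camera instead of the built-in one comes down to setting the device ID before setup; the ID itself depends on your machine:

```cpp
// Pick the external camera instead of the built-in webcam.
// The device ID (1 here) is a placeholder; listDevices() logs the available options.
ofVideoGrabber grabber;
grabber.listDevices();     // logs all attached cameras with their IDs
grabber.setDeviceID(1);    // placeholder ID for the external camera
grabber.setup(1280, 720);  // 720p to match the LifeCam Cinema
```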
For rotating the mirror around the x- and y-axis, I needed to drive two servos with an Arduino. I picked up a couple of Hitec servos together with mounts called ServoBlocks, which make it very easy to connect multiple servos (and they look badass as well).
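Driving the servos through Firmata then comes down to attaching them and writing angles, called once the Arduino reports it is initialized. A rough sketch, with placeholder pins and a placeholder neutral angle:

```cpp
// Attach the pan and tilt servos over Firmata and park them in a neutral stance.
// Pin numbers and the 90-degree neutral angle are placeholders, not my tuned values.
const int PAN_PIN  = 9;    // rotates the mirror around the y-axis
const int TILT_PIN = 10;   // rotates the mirror around the x-axis

void attachAndCenterServos(ofArduino& ard) {
    ard.sendServoAttach(PAN_PIN);
    ard.sendServoAttach(TILT_PIN);
    ard.sendServo(PAN_PIN, 90);    // neutral: mirror faces straight ahead
    ard.sendServo(TILT_PIN, 90);
}
```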
Tech
- ofxFaceTracker
- ofxUI
- firmataExample for serial communication
- Arduino Nano v3
- 2x Hitec HS-5485HB Servo
- 2x Standard Hitec ServoBlocks
- Microsoft LifeCam Cinema 720p
Code
- Only drives the servos if the Arduino is connected and a face is detected
- Checks whether the user is close enough (approximated with getScale()) before activating the servos
- Rotates the servos within a limited range around the x- and y-axis
- Tracks the user's head position and calculates its angle from the center
- Rotates the servos based on the head position and that angle, relative to each servo's current position
- Keeps a 4-second buffer in case face detection suddenly drops
- Resets the servos to a neutral stance on start, when no face is detected, and on exit
- A neat ofxUI interface to visualize everything on screen (see the sketch below)
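Roughly, the core update logic looks like the sketch below. All ranges and thresholds are placeholders (the real ones lived in the ofxUI sliders), and the real code works with the angle from the center rather than this simplified normalized offset, but the inverted mapping idea is the same:

```cpp
// Per-frame servo logic: invert the head position into a pan/tilt angle, gate on
// distance via getScale(), and fall back to neutral after the 4-second buffer.
// PAN_PIN / TILT_PIN as defined earlier; lastSeen is a float member of ofApp.
void ofApp::updateServos() {
    if (!arduinoReady) return;

    const float minScale   = 4.0f;            // "close enough" threshold via tracker.getScale()
    const float bufferTime = 4.0f;            // seconds to wait after losing the face
    const int   panMin = 45,  panMax = 135;   // limited y-axis rotation range
    const int   tiltMin = 60, tiltMax = 120;  // limited x-axis rotation range

    if (tracker.getFound() && tracker.getScale() > minScale) {
        lastSeen = ofGetElapsedTimef();

        // Head position relative to the camera image, normalized to -1..1
        ofVec2f pos = tracker.getPosition();
        float nx = ofMap(pos.x, 0, grabber.getWidth(),  -1, 1, true);
        float ny = ofMap(pos.y, 0, grabber.getHeight(), -1, 1, true);

        // Inverted mapping: face moves right -> mirror swings left (and vice versa),
        // same for up/down, so the panel always turns away from the viewer.
        int pan  = (int) ofMap(nx, -1, 1, panMax,  panMin);
        int tilt = (int) ofMap(ny, -1, 1, tiltMax, tiltMin);

        ard.sendServo(PAN_PIN,  pan);
        ard.sendServo(TILT_PIN, tilt);
    } else if (ofGetElapsedTimef() - lastSeen > bufferTime) {
        // Face gone for longer than the buffer: return to the neutral stance.
        ard.sendServo(PAN_PIN,  90);
        ard.sendServo(TILT_PIN, 90);
    }
}
```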
Construction
I laser cut* some parts specifically for the servo and webcam mount, and then bolted everything together on a stand. I made a couple of minor mistakes in the design, but everything worked well in the end. The mount for the mirror turned out to be a bit heavy, causing it to tip down when the servos aren't active. If I had more time, I would have created a counterweight to balance it properly. Rotating the entire mirror with the mount was no problem at all for the servos. It does create a bit of a swing when it jumps to a certain rotation, which is actually pretty cool!**
*Special thanks to Nathan for helping me laser cut some parts last minute before the deadline!!
**I broke the mirror during construction and had no time to get a replacement (derp)…
Future Implementation
- Redesign some of the mounting pieces for a better fit.
- Add a counterbalance for the mirror.
- Find a way to get the distance of the face from the camera without using getScale().
- Make the construction less ghetto, something worthy of displaying on a museum wall, for example.