Rachel-Looking Outwards-3

https://vimeo.com/26424619

The Hanging Garden, a collaboration between The Clorofilas and Aer Studio, uses an Arduino to monitor a plant’s need for water. Moisture sensors in the soil send information to the Arduino, causing LEDs to light when the soil becomes too dry. The organic use of the Arduino surprised me; that it can respond to a sensor as obscure as moisture was very interesting. Overall the work is very successful in how it relates plants and technology. The soft, small lights give the plants the voice one would expect a plant to have: the lights are subtle and emerge organically from the leaves, as if they were just extensions of the plant. The work is quiet and self-sustaining, and it generates a sense of peace.
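The sensing loop described above is simple enough to sketch. Here is a minimal, hypothetical version of the dryness-threshold logic (not the artists’ actual code; the sensor units and threshold value are invented for illustration), written as plain Java so the decision rule is easy to see:

```java
// Hypothetical moisture-threshold logic in the spirit of The Hanging Garden
// (not the artists' code). An analog moisture reading is compared against
// an assumed dryness threshold; below it, the LED for that plant lights up.
public class HangingGarden {
    static final int DRY_THRESHOLD = 300; // assumed sensor units, lower = drier

    // true when the soil is dry enough that the plant "asks" for water
    static boolean ledOn(int moistureReading) {
        return moistureReading < DRY_THRESHOLD;
    }

    public static void main(String[] args) {
        System.out.println(ledOn(250)); // dry soil -> true
        System.out.println(ledOn(600)); // moist soil -> false
    }
}
```

On an actual Arduino the same rule would sit inside `loop()`, reading an analog pin and driving a digital pin instead of returning a boolean.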

 

https://vimeo.com/25765171

Captured, by Nils Völker and Sven Völker, is an environmental installation consisting of space blankets inflated by CPU fans at controlled speeds. An Arduino controls all of the fans. This project combines intelligent material choice with intelligent application. As the blankets inflate and deflate, they generate soft colliding sounds that rise and fall with each undulation of the environment. The mirrored surfaces of the blankets reflect one another, deepening even subtle movements made by the airflow inside. The result is an oceanic effect that is very effective in submerging the viewer visually and spatially in the experience of the environment. The blankets move in response to a coordinated system of lights and music. In this way I see this work as an intense choreographed dance executed by an Arduino, with space blankets for limbs.

https://vimeo.com/35014340

Soundmachines, by The Product Studio, is an interactive turntable. The project uses an Arduino to collect and interpret signals of light and dark read off a spinning disk. Those signals are then sent to music software to generate sounds. There are three disks in all, each able to produce unique patterns of sound. The sleek design of this project attracted me, but for lasting impact I think the project needs more turntables with a wider range of sounds. At the moment, all of the music produced sounds computer-generated and formulaic (there are only so many combinations of patterns in the three disks). Adding organic instruments or even non-musical sounds could keep the project interactive longer. The project is meant to be usable by anyone interacting with it, but I would like to see its effects if multiple people were to engage the instrument and generate music together instead of just one individual playing.

 

 

Ticha-LookingOutwards-3

Pixelate by Sures Kumar

Well, I must say I have seen a fair share of games involving food and eating, but never have I seen one that involved eating food the way Pixelate by Sures Kumar does. Albeit simple in its concept, Pixelate can be a surprisingly thought-provoking piece of work; in our current culture, dominated by cheap fast food and deliciously unhealthy meals, we tend to be less mindful of eating a balanced diet. As the more wholesome foods tend to be costly and not as easily accessible as the non-wholesome ones, it is understandable that many individuals do not feel inclined to make healthier choices. Pixelate is a simple statement encouraging people to eat nutritious meals, but what I find most endearing about it is that, since it is a game, it causes the act of eating healthy foods to feel rewarding. This characteristic is very much like the Wii Fit games.

The way the game uses pressure to detect which foods are being eaten is a particularly useful feature. However, it may be prone to error or limit the variety of foods. Additionally, the gameplay can get tedious, so the developers will have to add more features to make the game more engaging.

 

Listen Carefully by Jonas Breme

Listen Carefully is an interesting project by Jonas Breme that makes us reflect on how we experience music today. It speaks to me on a personal level in particular, as I listen to music on a regular basis – but only as a secondary task. I cannot recall the last time I truly focused on listening to a song without simultaneously doing something else.

This project also touches upon the concept of multitasking, its prevalence in our daily routine, and how it prevents us from giving a single activity our full attention. What is poetic about Listen Carefully is that it forces the user to pause and unwind, commanding their full attention – just so that they may experience a moment of slow listening in a fast-paced life.

 

A Beat of Your Heart by Jason Mingay

Jason Mingay’s A Beat of Your Heart uses an Arduino and a pulse sensor to evoke certain emotional reactions from the viewer as they watch a sequence of video clips. What interests me about A Beat of Your Heart is that, while conventional art is made to passively induce emotional responses from the observer, this project almost forcefully injects certain reactions into the observer by provoking changes in heart rate. This project can be seen as a complement to Listen Carefully: the two share the objective of holding the user’s attention but elicit different emotional reactions.

To me, A Beat of Your Heart still feels more like a ‘detailed sketch’ than a final product. Regardless, I feel that this idea has a lot of potential, and I would like to see how it can be applied in different contexts.

Kristina – Looking Outwards – 3

Arduino Art:

The first piece I am looking at is “Floe”. It was pieced together by a team of four from the Royal College of Art in London. At first I wasn’t sure why I liked it so much; the plastic “ice forms” were somewhat scratched and dirty and placed randomly. However, two things make this piece worthy of merit. The first is that the plastic sheets really did emulate ice (in particular, the way the light caught on the edges and refracted through the middle was really beautiful). The second is that the piece shows visually how a sine wave can be used to emulate non-rotational motion in the physical world as well as in the virtual, which I think helped me understand the physical a little better. Cutting holes in different places in a circle was a very clever yet simple way of implementing this.

FLOE from Ozgun Kilic on Vimeo.

The fact that this piece is physical is by far its most important quality: it was the light which really meant the most to me visually, as the implementation of the motion is not necessarily what someone uninterested in the construction of such a piece would look at. For this reason, I believe this is successful Arduino art.

“One Hundred and Eight” by Nils Völker is my favorite of the three I have selected: one hundred and eight plastic bags are inflated and deflated in different patterns to recreate a living, breathing organism. This piece speaks to me on a conceptual level as well as on an artistic/aesthetic level. This could easily be a representation of the ecological crisis we face on a day-to-day basis. I saw a quote earlier this week about trees: “Imagine a world where trees emit WiFi. We would be planting them everywhere — you’d never be more than a few metres from a tree! What a shame they only emit the air we breathe.” This sculpture goes the opposite direction while expressing the same sentiment: instead of appealing to the hardware (WiFi) it appeals to the organic (humans/life). (If that explanation made any sense at all.)

One Hundred and Eight – Animated Patterns from Nils Völker on Vimeo.

The Stone Spray project is important, again, because it is so physical. This robot is clearly somewhat intelligent, shown by how it was able to create loops when spraying sand by moving from side to side. These sculptures are not very beautiful, and they are quite small by nature of how they were formed. However, I can envision extremely large sculptures protruding from beaches in some sort of “alien landscape” formation. Whether or not this is useful is not the question; it is simply an interesting application that amounts to a form of 3D printing, which, as we all know, is a very interesting section of today’s market and holds the attention of people looking to mass-produce new kinds of technical consumer equipment.

Stone Spray Project from Stone Spray on Vimeo.

Melanie-LookingOutwards-3

arca from tony olsson

It’s a game! Looks like a co-op. I’m already intrigued by the manipulation of physical space to affect virtual space. Are there more levels, more variations? How about its gaming potential? I greatly enjoy the use of sound to give each shape a different feel. The only disappointing part about it is that I can’t find other documentation–wait, look! There’s a blog! So it turns out it is a two-player “game” self-described as a “full body experience which is created by interaction, communication and play within an space where computer games meet abstract electronic art, video and sound installation.” It’s true! I’m in love with it. I want to play it with my roommate as we troll each other to death. More effective documentation would’ve been nice, though; I had to dig a lot to actually find out its purpose. Other than that, I want this to be a little bigger, and the physical objects to be less ordinary, more extraordinary. Anything other than a bunch of cylinders.

[in]security camera from Dennis Rosenfeld

There’s a security camera, but it’s actually not a very effective security camera. It shies away from what it’s supposed to be watching–humans–and finds the wall very interesting instead. It says some things about surveillance and almost subverts it in a non-cynical, adorable fashion. I like the message and the concept; I don’t really like the execution though, as I expected more variations in movement to give the camera more personality. Rather than insecure, it just seemed startled half the time. Take a cue from this cute little robot.

INSIDE-OUT, a sound art performance from Andre Borges

There’s something powerful about the body’s sound reverberating throughout a giant room. INSIDE-OUT is a sound art performance using a man’s body as a musical medium. As he breathes in and out, various sounds play according to the internal sounds of his body. Most prominent is his breathing; it’s a constant loop, soothing and overarching. I’d imagine that when you enter that room it feels like you’re really inside something: a soundscape of the body. I feel like it could have been better presented, as the staging doesn’t feel organic, but I do like the rawness of the performance.

DAVE-LOOKINGOUTWARDS-3

Printer Orchestra, made by isthis.gd, consists of old computer hardware synced to play MIDI arrangements. The music they chose is lively, and I instantly became happy after seeing how cute these old parts are as they click and clank to make music. This reminded me of the MIDI arrangements played on floppy drives I had seen in the past. The only thing I wish the makers would do is upload a making-of video, so we can see their “orchestra” play other musical arrangements and how they configured and set up such a whimsical piece.

 

Fire Hero, by oswaldonfire, is a version of Guitar Hero that uses flamethrowers, instead of the original computer graphics effects, to represent notes being hit. When I first saw this project, I was struck by how insane the creators were; fire is scary, and I would never choose to use it as a medium. But seeing the flames sprout out in beat with the music, I am deeply impressed. The closest thing I have seen resembling this is equalizers, but this takes that to extreme and dangerous territory. If only the video quality were more professional and the arrangement more dramatic, the edginess of this project would come out even more.

 

It turns out all the projects I found are somehow related to music. Playa, made by rubendhers, uses motors, cables and cloth to play a chillingly ambient arrangement on 14 guitars. This provides a striking contrast to the previous two projects, as it generates a mysterious atmosphere, while Printer Orchestra is whimsical and Fire Hero is hardcore extreme. The autonomous spinners, cables, and the brownish background gave me a steampunk impression, which contributes even further to the otherworldliness of the music and the mysterious atmosphere. This reminded me of my first visit to CMU: I walked into CFA and heard an ambient sound. Someone explained that it was Beethoven’s 9th played over the period of 24 hours. It created a feeling that one is constantly discovering and unraveling the secrets of the universe.

MichelleMa-LookingOutwards-3

The Tropism Well


The Tropism Well by Richard Harvey and Keivor John takes a spin on the organic bending mechanism of plants to give their “well” near-human traits. When the structure sees a person approaching, it “bows” in his/her direction through the movement of water (or some other drink) and fills that person’s glass. This well is incredibly well-crafted in my opinion because it does its job of inviting the spectators through its quirky personality and its functionality. Although I may not carry a cup around at all times, I think this would be a much more pleasurable and sanitary way to distribute water in public places. Some downsides I would worry about are its fragility, its uncovered opening at the top, and perhaps the slow wait for water (especially if there is a line of people viewing it as an art piece). But overall, I would definitely like to see a whole collection of Harvey and John’s solutions to public resource distribution. From their website (http://www.harveyandjohn.com/), I became immediately fascinated by their ventures into gravity-defying interactive art, which may be where they got their inspiration for upward-moving water, but I can’t see many other pieces that are meant for the outdoors.

Cubepix

Since we are almost on the subject of projection mapping, Cubepix by Xavi Trivo goes up a level to kinetic projection mapping. They appear to have an array of cardboard boxes that twist and turn with movements coordinated to the projection. In addition, the whole sculpture interacts with whoever is present in front of it. In fact, Daniel Rozin’s Wooden Mirrors, which I mentioned in my first Looking Outwards, was a source of inspiration for this project. I greatly admire the quick, sleek movements of the sculpture, which are almost seamlessly in tune with the projection. There seem to be a few boxes that got stuck in the video, but that is pretty minor compared to what they have produced. I also don’t know if I like the cardboard-brown color. While it shows the viewer how this project exemplifies “rags to riches”, I don’t think it effectively shows the contrast between analog and electronic media quite like the Wooden Mirrors do. Other than that, I think Xavi’s Lab did a great job creating a piece that viewers will want to play with for a long time. I don’t know if the sound in the video is incorporated in the actual piece, but I think it really adds to the playing experience.

The Singing Plant

In this project, Mads Hobye uses Arduinos in place of a theremin to make a plant “sing”. He uses a sensor to measure the capacitance of a plant, and if a person touches the plant, that interaction can be converted into sound. I’ve heard of the idea of communicating with animals, plants, and inanimate objects through electrical pulses before, but the fact that a person can build a simple version of such a device with an Arduino is really inspiring. The project here is a very basic sensor-output system, but the fact that it’s cute and applicable to any household plant is quite charming. It allows the viewers to fulfill a dream of “conversing” with the plants they’ve been raising. But on a much larger scale, the capacitance sensor can be used to characterize just about any body with which humans wish to communicate. For example, it can be used to create a collection of the “voices” of a forest or ecosystem over time so that people may try to guess what it is saying. I’m thinking of green messages mostly, but the applications are very far-reaching.
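The “basic sensor-output system” can be sketched in a few lines. The following is a hypothetical illustration (not Hobye’s code; the baseline value, the 0–100 reading range, and the 220–880 Hz pitch range are all invented): a touch raises the measured capacitance above its baseline, and that rise is remapped onto a pitch, much as one would with `map()` in Processing.

```java
// Hypothetical sketch of a Singing-Plant-style capacitance-to-pitch mapping
// (not Mads Hobye's actual code). All numeric ranges here are assumptions.
public class PlantVoice {
    // linear remap, in the spirit of Processing's map()
    static float map(float v, float inLo, float inHi, float outLo, float outHi) {
        return outLo + (v - inLo) * (outHi - outLo) / (inHi - inLo);
    }

    // how far the reading rose above the untouched baseline decides the pitch
    static float pitchFor(float capacitance, float baseline) {
        float delta = Math.max(0, capacitance - baseline);
        // map a 0..100-unit rise in capacitance onto 220..880 Hz
        return map(Math.min(delta, 100), 0, 100, 220, 880);
    }

    public static void main(String[] args) {
        System.out.println(pitchFor(250, 250)); // no touch: 220.0 (lowest pitch)
        System.out.println(pitchFor(300, 250)); // light touch: 550.0
    }
}
```

The real installation would feed the resulting pitch to a synthesizer each frame; the interesting part is only this remapping of a raw electrical quantity into a musical one.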

According to his website, Mads Hobye has several large scale installations, including a soundscape playground, as well as many other instructables that serve to disseminate the how-tos of transforming everyday life with digital media. Most of his work involves sound machines, and I can clearly see the theremin as inspiration for this piece.

FaceOSC

Defense Mechanisms

My initial idea was to create an onscreen character which takes a neutral/unfriendly expression and exaggerates it, literally reflecting the person’s “prickliness” or unapproachability. When a person smiles, then the character becomes rounded and happier.


In the end, while I worked on the sketch, I modified the concept. It became more of a creature and less of a puppet. The character becomes rounder and more visible when you smile, and pricklier and less visible the longer you frown. It also moves away toward the corner of the window the longer you frown. Altogether, the character reacts to your expressions (or reflects your emotions, depending on how you interpret it) by becoming more or less defensive (reflected in visibility, proximity, and prickliness).

If I had more time to spend on this sketch, I would have experimented with moving it back toward my original concept—adding back facial features and making this more like a puppet.

Code

Github Repo

pricklyFace (main)

import oscP5.*;
OscP5 oscP5;

// our FaceOSC tracked face data
Face face = new Face();
float faceScale = 1; // default - no resizing of face
ArrayList<PVector> faceOutline = new ArrayList<PVector>();
int numPoints = 100;
float initialPrickliness = 0.2;
float prickliness = initialPrickliness;
float maxPrickliness = 0.7;
float minPrickliness = 0;

float closeness = 0.3;
float maxCloseness = 0.7;
float minCloseness = 0.15;

void setup() {
  // default size is 640 by 480
  int defaultWidth = 640;
  int defaultHeight = 480;

  faceScale = 1; // no resizing of the face

  int realWidth = (int)(defaultWidth * faceScale);
  int realHeight = (int)(defaultHeight * faceScale);
  size(realWidth, realHeight, OPENGL);

  frameRate(10);

  oscP5 = new OscP5(this, 8338);
}

void draw() {  
  background(250);
  noStroke();

  updatePrickliness();

  if (face.found > 0) {
    
    // draw such that the center of the face is at 0,0
    translate(face.posePosition.x*faceScale*closeness, face.posePosition.y*faceScale*closeness);

    // scale things down to the size of the tracked face
    // then shrink again by half for convenience
    
    closeness = map(prickliness, minPrickliness, maxPrickliness, maxCloseness, minCloseness);
    scale(face.poseScale*closeness);

    // rotate the drawing based on the orientation of the face
    rotateY (0 - face.poseOrientation.y); 
    rotateX (0 - face.poseOrientation.x); 
    // rotateZ (    face.poseOrientation.z); 

    float fill = map(prickliness, minPrickliness, maxPrickliness, 100, 200);
    fill((int)fill);
    
    // drawEyes();
    // drawMouth();
    // print(face.toString());

    faceOutline = new ArrayList<PVector>();
    getFaceOutlinePoints();
    drawOutline();
    
    /*if (face.isBlinking()) {
      println("BLINKED");
    }

    face.lastEyeHeight = face.eyeLeft;
    face.lastEyebrowHeight = face.eyeRight;
    */
  }
}

// OSC CALLBACK FUNCTIONS

void oscEvent(OscMessage m) {
  face.parseOSC(m);
}

void drawOutline() {
  float x = 0;
  float y = 0;

  if (faceOutline.size() != (numPoints + 1)) {
    getFaceOutlinePoints();
    return;
  }
  else {
    beginShape();
    for (int i=0; i <= numPoints; i++) {
      x = faceOutline.get(i).x;
      y = faceOutline.get(i).y;
      vertex(x, y);
    }  
    endShape();
  }

}

void updatePrickliness() {
  float antiPrickliness = 0;
  int transitionTime = 30000;

  if (!face.isSmiling()) {
    prickliness = constrain(face.timeSinceSmile, 0, transitionTime);
    prickliness = map(prickliness, 0, transitionTime, minPrickliness, maxPrickliness);
  }
  
  antiPrickliness = constrain(face.smilingTime, 0, transitionTime);
  antiPrickliness = -1 * map(antiPrickliness, 0, transitionTime, minPrickliness, maxPrickliness);
  
  prickliness = prickliness + antiPrickliness;
  prickliness = constrain(prickliness, minPrickliness, maxPrickliness);
  if (prickliness < 0) {
    prickliness = 0;
  }
}

void getFaceOutlinePoints() {
  int xCenter = 0;
  int yCenter = 0;
  
  for (int i=0; i <= numPoints; i++) {
    float radius = 30;
  
    // iterate and draw points around circle
    float theta = 0;
    float x;
    float y; 
    float oldRadius = -1;
  
    theta = map(i, 0, numPoints, 0, 2*PI);
  
    if (i%2 == 0) {
      oldRadius = radius;
      radius = radius * random(1+prickliness, 1+(prickliness*2));
    }
  
    x = radius*cos(theta) + xCenter;
    y = radius*sin(theta) + yCenter;
  
    if (i == numPoints) {
      // close the shape by repeating the first point
      PVector firstPoint = faceOutline.get(0);
      PVector circlePoint = new PVector(firstPoint.x, firstPoint.y);
      faceOutline.add(circlePoint);
    } 
    else {
      PVector circlePoint = new PVector(x, y);
      faceOutline.add(circlePoint);
    }
  
    if (oldRadius > 0) {
      radius = oldRadius;
      oldRadius = -1;
    }
  }
}

void drawEyes() {
  int distanceFromCenterOfFace = 14;
  int heightOnFace = -4;
  int eyeWidth = 6;
  int eyeHeight = 4;
  ellipse(-1*distanceFromCenterOfFace, face.eyeLeft * heightOnFace, eyeWidth, eyeHeight);
  ellipse(distanceFromCenterOfFace, face.eyeRight * heightOnFace, eyeWidth, eyeHeight);
}

void drawMouth() {
  float mouthWidth = 30;
  int heightOnFace = 14;
  int mouthHeightFactor = 3;

  float mLeftCornerX = 0;
  float mLeftCornerY = heightOnFace;

  float pointX = mLeftCornerX + ((mouthWidth/2));

  float mouthHeight = face.mouthHeight * mouthHeightFactor;
  ellipse(mLeftCornerX, mLeftCornerY, mouthWidth, mouthHeight);
}

Face class

import oscP5.*;

// a single tracked face from FaceOSC
class Face {

  // num faces found
  int found;

  // pose
  float poseScale;
  PVector posePosition = new PVector();
  PVector poseOrientation = new PVector();

  // gesture
  float mouthHeight, mouthWidth;
  float eyeLeft, eyeRight;
  float eyebrowLeft, eyebrowRight;
  float jaw;
  float nostrils;

  // past
  float lastEyeHeight;
  float lastEyebrowHeight;
  
  boolean wasSmiling = false;
  float startedSmilingTime = 0;
  float smilingTime = 0;
  
  float stoppedSmilingTime = 0;
  float timeSinceSmile = 10000;

  Face() {
  }

  boolean isSmiling() {

    if (mouthIsSmiling()) {
      if (wasSmiling == false) {
        wasSmiling = true;
        startedSmilingTime = millis();
        timeSinceSmile = 0;
      }
      else {
        smilingTime = millis() - startedSmilingTime;
        println("smilingTime: ");
        print(smilingTime);
        println("");
      }
      return true;
    }
    else {
      if (wasSmiling == false) {
        timeSinceSmile = millis() - stoppedSmilingTime;
      }
      else {
        wasSmiling = false;
        stoppedSmilingTime = millis();
        smilingTime = 0;
      }
      return false;
    }
  }
  
  boolean mouthIsSmiling() {
    float minSmileWidth = 15;
    float minSmileHeight = 2;
    return ((mouthWidth > minSmileWidth) && (mouthHeight > minSmileHeight));
  }
  
  boolean isBlinking() {
    float eyeHeight = (eyeLeft + eyeRight) / 2;
    float eyebrowHeight = (eyebrowLeft + eyebrowRight) / 2;

    if ((eyeHeight < lastEyeHeight) &&
      (eyebrowHeight > lastEyebrowHeight)) {
      return true;
    }
    return false;
  }

  boolean isSpeaking() {
    int speakingMouthHeightThreshold = 2;
    if (mouthHeight > speakingMouthHeightThreshold) {
      return true;
    } 
    else {
      return false;
    }
  }

  // parse an OSC message from FaceOSC
  // returns true if a message was handled
  boolean parseOSC(OscMessage m) {

    if (m.checkAddrPattern("/found")) {
      found = m.get(0).intValue();
      return true;
    }      

    // pose
    else if (m.checkAddrPattern("/pose/scale")) {
      poseScale = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/pose/position")) {
      posePosition.x = m.get(0).floatValue();
      posePosition.y = m.get(1).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/pose/orientation")) {
      poseOrientation.x = m.get(0).floatValue();
      poseOrientation.y = m.get(1).floatValue();
      poseOrientation.z = m.get(2).floatValue();
      return true;
    }

    // gesture
    else if (m.checkAddrPattern("/gesture/mouth/width")) {
      mouthWidth = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/gesture/mouth/height")) {
      mouthHeight = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/gesture/eye/left")) {
      eyeLeft = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/gesture/eye/right")) {
      eyeRight = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/gesture/eyebrow/left")) {
      eyebrowLeft = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/gesture/eyebrow/right")) {
      eyebrowRight = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/gesture/jaw")) {
      jaw = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/gesture/nostrils")) {
      nostrils = m.get(0).floatValue();
      return true;
    }

    return false;
  }

  // get the current face values as a string (includes end lines)
  String toString() {
    return "found: " + found + "\n"
      + "pose" + "\n"
      + " scale: " + poseScale + "\n"
      + " position: " + posePosition.toString() + "\n"
      + " orientation: " + poseOrientation.toString() + "\n"
      + "gesture" + "\n"
      + " mouth: " + mouthWidth + " " + mouthHeight + "\n"
      + " eye: " + eyeLeft + " " + eyeRight + "\n"
      + " eyebrow: " + eyebrowLeft + " " + eyebrowRight + "\n"
      + " jaw: " + jaw + "\n"
      + " nostrils: " + nostrils + "\n";
  }
};

Being Shushed

shhhhh face

My second idea was to create a character and, to some degree, an environment/game mechanic. When you open your mouth, a small speech bubble appears and begins to grow. However, as soon as you open your mouth, the word “shhhhh” begins to appear all around, and the words cluster around the speech bubble, as though they are squishing it. If you close your mouth, the speech bubble disappears and the face onscreen looks somewhat unhappy. But if you keep your mouth open long enough, the bubble grows and pushes the shhh’es out of the frame. If you successfully do this, you see the word applause appear all around.

I attempted to implement this idea and part of the way. I created (as shown in the video above) a speech bubble which grows based on how long you have been “speaking” (crudely measured by the length of time which you have had your mouth open). However, I had trouble figuring out how to position the face and speech bubble on screen such that they wouldn’t overlap awkwardly. I also realized that implementing some sort of particle system (most likely) of “shhh”es to put pressure on the speech bubble was going to make realizing this fully take a ton more time.

If I had more time to spend on this, I would probably stop drawing the face temporarily and work on the speech bubble’s interaction with a particle system of “shhh”es, then come back to the issue of the speaker’s face.
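That missing particle system of “shhh”es could start as small as the following: a minimal sketch (plain Java rather than a full Processing tab, with all positions, the step size, and the rim-clamping rule invented for illustration) in which each “shhh” drifts toward the bubble’s center and stops on its rim, so the words visibly pile up against the bubble.

```java
import java.util.ArrayList;

// Minimal, hypothetical particle system of "shhh"es (not the project's code):
// each particle drifts toward the speech bubble and halts on its rim,
// as though it were squishing the bubble from outside.
public class ShhhSwarm {
    static class Shhh {
        float x, y;
        Shhh(float x, float y) { this.x = x; this.y = y; }
    }

    ArrayList<Shhh> particles = new ArrayList<Shhh>();
    float bubbleX, bubbleY, bubbleRadius;

    ShhhSwarm(float bx, float by, float r) {
        bubbleX = bx; bubbleY = by; bubbleRadius = r;
    }

    // move each "shhh" a small step toward the bubble's center,
    // but never past the rim
    void update(float step) {
        for (Shhh s : particles) {
            float dx = bubbleX - s.x, dy = bubbleY - s.y;
            float d = (float) Math.sqrt(dx * dx + dy * dy);
            if (d < 1e-6f) continue; // already at the center; nothing to do
            float move = Math.min(step, Math.max(0, d - bubbleRadius));
            s.x += dx / d * move;
            s.y += dy / d * move;
        }
    }

    float distToCenter(Shhh s) {
        float dx = bubbleX - s.x, dy = bubbleY - s.y;
        return (float) Math.sqrt(dx * dx + dy * dy);
    }

    public static void main(String[] args) {
        ShhhSwarm swarm = new ShhhSwarm(0, 0, 50);
        swarm.particles.add(new Shhh(200, 0));
        for (int i = 0; i < 100; i++) swarm.update(5);
        // after enough steps the particle rests exactly on the rim
        System.out.println(swarm.distToCenter(swarm.particles.get(0))); // prints 50.0
    }
}
```

In the sketch itself, each particle would be drawn as the text “shhh” (e.g. with `text()`), and growing the bubble’s radius while “speaking” would push the clamped particles outward, which is exactly the squishing interaction described above.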

Code

Github Repo

shhhFace

//
// a template for receiving face tracking osc messages from
// Kyle McDonald's FaceOSC https://github.com/kylemcdonald/ofxFaceTracker
//
// this example includes a class to abstract the Face data
//
// 2012 Dan Wilcox danomatika.com
// for the IACD Spring 2012 class at the CMU School of Art
//
// adapted from from Greg Borenstein's 2011 example
// http://www.gregborenstein.com/
// https://gist.github.com/1603230

import oscP5.*;
OscP5 oscP5;

// our FaceOSC tracked face data
Face face = new Face();
SpeechBubble speechBubble = new SpeechBubble();
float faceScale = 1;

// for additions


void setup() {
  // default size is 640 by 480
  int defaultWidth = 640;
  int defaultHeight = 480;
  
  int realWidth = (int)(defaultWidth * faceScale);
  int realHeight = (int)(defaultHeight * faceScale);
  size(realWidth, realHeight, OPENGL);
  
  frameRate(30);

  oscP5 = new OscP5(this, 8338);
}

void draw() {  
  background(255);
  stroke(0);

  if (face.found > 0) {
    
    // draw such that the center of the face is at 0,0
    translate(face.posePosition.x*faceScale, face.posePosition.y*faceScale);
    
    // scale things down to the size of the tracked face
    // then shrink again by half for convenience
    scale(face.poseScale*0.5);
    
    // rotate the drawing based on the orientation of the face
    rotateY (0 - face.poseOrientation.y); 
    rotateX (0 - face.poseOrientation.x); 
    rotateZ (    face.poseOrientation.z); 
    
    noFill();
    drawEyes();
    drawMouth();
    
    face.isSpeaking();
    int sbX = 7;
    int sbY = -15;
    speechBubble.draw(sbX, sbY);
      
    //}
    
    //drawNose();
    //drawEyebrows();
    print(face.toString());
    
    if (face.isSmiling()) {
      println("SMILING");
    }
    if (face.isBlinking()) {
      println("BLINKED");
    }
    
    face.lastEyeHeight = (face.eyeLeft + face.eyeRight) / 2;
    face.lastEyebrowHeight = (face.eyebrowLeft + face.eyebrowRight) / 2;
    println("lastEyeHeight " + face.lastEyeHeight);
    println("lastEyebrowHeight " + face.lastEyebrowHeight);
  }
}

// OSC CALLBACK FUNCTIONS

void oscEvent(OscMessage m) {
  face.parseOSC(m);
}

void drawEyes() {
  int distanceFromCenterOfFace = 20;
  int heightOnFace = -9;
  int eyeWidth = 6;
  int eyeHeight =5;
  ellipse(-1*distanceFromCenterOfFace, face.eyeLeft * heightOnFace, eyeWidth, eyeHeight);
  ellipse(distanceFromCenterOfFace, face.eyeRight * heightOnFace, eyeWidth, eyeHeight);
}
void drawEyebrows() {
  rectMode(CENTER);
  fill(0);
  int distanceFromCenterOfFace = 20;
  int heightOnFace = -5;
  int eyebrowWidth = 23;
  int eyebrowHeight = 2;
  rect(-1*distanceFromCenterOfFace, face.eyebrowLeft * heightOnFace, eyebrowWidth, eyebrowHeight);
  rect(distanceFromCenterOfFace, face.eyebrowRight * heightOnFace, eyebrowWidth, eyebrowHeight);
}
void drawMouth() {
  float mouthWidth = 30;
  int heightOnFace = 14;
  int mouthHeightFactor = 6;
  
  float mLeftCornerX = 0;
  float mLeftCornerY = heightOnFace;
 
  float pointX = mLeftCornerX + ((mouthWidth/2));
  
  float mouthHeight = face.mouthHeight * mouthHeightFactor;
  ellipse(mLeftCornerX, mLeftCornerY, mouthWidth, mouthHeight);
}

void drawNose() {
  int distanceFromCenterOfFace = 5;
  int heightOnFace = -1;
  int nostrilWidth = 4;
  int nostrilHeight = 3;
  ellipse(-1*distanceFromCenterOfFace, face.nostrils * heightOnFace, nostrilWidth, nostrilHeight);
  ellipse(distanceFromCenterOfFace, face.nostrils * heightOnFace, nostrilWidth, nostrilHeight);
}

SpeechBubble

class SpeechBubble {
  float xPos; 
  float yPos; 

  float sbHeight = 150*0.25;
  float sbWidth = 250*0.25;
 
  float initialRadius = (sbHeight/3);
  float radius = initialRadius;
 
  int numPoints = 30;
  // http://math.rice.edu/~pcmi/sphere/degrad.gif
  float extrusionTheta = (5*PI)/6;
  float epsilon = PI/25;
  
  void draw(float xPosition, float yPosition) {
    xPos = xPosition;
    yPos = yPosition;
    
    float timeRadiusFactor = face.totalTime/10000;
    
    radius = radius + timeRadiusFactor;
    
    if (radius < 10) {
      return;
    }
    
    float xCenter = xPos+sbWidth/2 + timeRadiusFactor;
    float yCenter = yPos+sbHeight/2 - (timeRadiusFactor/2);
    
    println("DRAWN");
    beginShape();
    
      // variables for calculating each point
      float x;
      float y;
      float theta;   
      
      // iterate and draw points around circle.
      for (int i = 0; i <= numPoints; i++) {
        
        // mapping over numPoints-2 instead of numPoints is a hack to make the curve close
        theta = map(i, 0, numPoints-2, 0, 2*PI);
        x = radius*cos(theta) + xCenter;
        y = radius*sin(theta) + yCenter;
        
        // check to see if we're at the point in the circle where 
        // we want to draw the part of the speech bubble that sticks out
        if (((theta - epsilon) < extrusionTheta) && 
            ((theta + epsilon) > extrusionTheta)){
             
              float extrusionRadius = PI/25;
              
              float startTheta = extrusionTheta - extrusionRadius;
              float endTheta = extrusionTheta + extrusionRadius;
              
              float startX = radius*cos(startTheta) + xCenter;
              float startY = radius*sin(startTheta) + yCenter;
  
              float endX = radius*cos(endTheta) + xCenter;
              float endY = radius*sin(endTheta) + yCenter;
            
              curveVertex(startX, startY);
              vertex(startX, startY);
              vertex(x - (radius/1.5), y+ (radius/3));
              vertex(endX, endY);
              curveVertex(endX, endY);
        }
        else {
          curveVertex(x, y);
        }
      }
    endShape();
  }
}

Face class

import oscP5.*;

// a single tracked face from FaceOSC
class Face {

  // num faces found
  int found;

  // pose
  float poseScale;
  PVector posePosition = new PVector();
  PVector poseOrientation = new PVector();

  // gesture
  float mouthHeight, mouthWidth;
  float eyeLeft, eyeRight;
  float eyebrowLeft, eyebrowRight;
  float jaw;
  float nostrils;

  // past
  float lastEyeHeight;
  float lastEyebrowHeight;
  boolean wasSpeaking = false;
  float startSpeakingTime = 0;
  float totalTime = 0;
  float stoppedSpeakingTime = 0;

  Face() {
  }

  boolean isSmiling() {
    float minSmileWidth = 15;
    float minSmileHeight = 2;

    if ((mouthWidth > minSmileWidth) &&
      (mouthHeight > minSmileHeight)) {
      return true;
    }
    return false;
  }

  boolean isBlinking() {
    float eyeHeight = (eyeLeft + eyeRight) / 2;
    float eyebrowHeight = (eyebrowLeft + eyebrowRight) / 2;

    if ((eyeHeight < lastEyeHeight) &&
      (eyebrowHeight > lastEyebrowHeight)) {
      return true;
    }
    return false;
  }

  boolean isSpeaking() {
    int speakingMouthHeightThreshold = 2;
    /* Debug:
     println("MOUTHHEIGHT: " + mouthHeight);
     */
    println("totalTime: " + totalTime);

    if (mouthHeight > speakingMouthHeightThreshold) {
      if (!wasSpeaking) {
        totalTime = 0;
        startSpeakingTime = millis();
        wasSpeaking = true;
      }
      else {
        totalTime = millis() - startSpeakingTime;
      }
      println("SPEAKING");
      return true;
    } 
    else {
      if (wasSpeaking) {
        println("NOT SPEAKING");
        stoppedSpeakingTime = millis();
        wasSpeaking = false;
        totalTime = 0;
      }
      else {
        totalTime = -1*(millis() - stoppedSpeakingTime);
      }
      return false;
    }
  }

  // parse an OSC message from FaceOSC
  // returns true if a message was handled
  boolean parseOSC(OscMessage m) {

    if (m.checkAddrPattern("/found")) {
      found = m.get(0).intValue();
      return true;
    }      

    // pose
    else if (m.checkAddrPattern("/pose/scale")) {
      poseScale = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/pose/position")) {
      posePosition.x = m.get(0).floatValue();
      posePosition.y = m.get(1).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/pose/orientation")) {
      poseOrientation.x = m.get(0).floatValue();
      poseOrientation.y = m.get(1).floatValue();
      poseOrientation.z = m.get(2).floatValue();
      return true;
    }

    // gesture
    else if (m.checkAddrPattern("/gesture/mouth/width")) {
      mouthWidth = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/gesture/mouth/height")) {
      mouthHeight = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/gesture/eye/left")) {
      eyeLeft = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/gesture/eye/right")) {
      eyeRight = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/gesture/eyebrow/left")) {
      eyebrowLeft = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/gesture/eyebrow/right")) {
      eyebrowRight = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/gesture/jaw")) {
      jaw = m.get(0).floatValue();
      return true;
    }
    else if (m.checkAddrPattern("/gesture/nostrils")) {
      nostrils = m.get(0).floatValue();
      return true;
    }

    return false;
  }

  // get the current face values as a string (includes end lines)
  String toString() {
    return "found: " + found + "\n"
      + "pose" + "\n"
      + " scale: " + poseScale + "\n"
      + " position: " + posePosition.toString() + "\n"
      + " orientation: " + poseOrientation.toString() + "\n"
      + "gesture" + "\n"
      + " mouth: " + mouthWidth + " " + mouthHeight + "\n"
      + " eye: " + eyeLeft + " " + eyeRight + "\n"
      + " eyebrow: " + eyebrowLeft + " " + eyebrowRight + "\n"
      + " jaw: " + jaw + "\n"
      + " nostrils: " + nostrils + "\n";
  }
};
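The timer logic in isSpeaking() can be isolated into a tiny state machine. Here is the same logic as plain Java, with millis() replaced by a passed-in timestamp so it can be exercised outside Processing (the class and method names are mine):

```java
// Minimal model of the isSpeaking() timer: totalTime counts up (in ms)
// while the mouth is open, and counts negatively while it is closed.
class SpeakingTimer {
    boolean wasSpeaking = false;
    long startSpeakingTime = 0;
    long stoppedSpeakingTime = 0;
    long totalTime = 0;

    // now = current time in ms (stands in for Processing's millis())
    boolean update(float mouthHeight, long now) {
        float threshold = 2;
        if (mouthHeight > threshold) {
            if (!wasSpeaking) {
                // just started speaking: reset and remember the start time
                totalTime = 0;
                startSpeakingTime = now;
                wasSpeaking = true;
            } else {
                totalTime = now - startSpeakingTime;
            }
            return true;
        } else {
            if (wasSpeaking) {
                // just stopped: remember when, reset the counter
                stoppedSpeakingTime = now;
                wasSpeaking = false;
                totalTime = 0;
            } else {
                totalTime = -(now - stoppedSpeakingTime);
            }
            return false;
        }
    }
}
```

The sign of totalTime is what lets the speech bubble grow while the user speaks and shrink away once they stop.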

Other Idea: Feeling Misinterpreted


One of my initial ideas was to create a face/character that would mirror your expressions but be…off. The face itself would be distorted, somewhat ugly, with some features upside-down or asymmetrical. As you looked at the face, it would mirror your expressions somewhat—if you smile, it would smile too, but crookedly, awkwardly.

The concept for this was to create a sort of mirror that evokes the feeling of being misinterpreted, not being able to say the right thing, or express it effectively.

I abandoned this idea because after initial experimentation, I decided that it would be too difficult to get to the point of accurately mirroring a face’s expressions so that I could deliberately distort parts of that mirroring.

Ralph-Assignment-05-FaceOSC

This piece is about the impact that small gestures, put together, can have on the world. With just the subtle motion of blinking one’s eyes, the fire reacts in an equally subtle way until, eventually, it grows too big and wild for the candle and burns out.
The program tracks the height of the user’s eyes, and when it detects that the eyes are closed, a disturbance is sent through the fire: its wildness, particle count, and height each increase by a tiny margin.
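The blink-to-fire coupling could be modeled like this — a minimal plain-Java sketch where the threshold, increments, and field names are all my guesses, not the actual code:

```java
// Hypothetical model: each detected blink nudges the fire parameters
// by a tiny margin until the fire "burns out".
class BlinkFire {
    float wildness = 1.0f;
    int particleCount = 100;
    float fireHeight = 50.0f;
    static final float MAX_HEIGHT = 200.0f;  // past this, the candle burns out

    // eyeHeight comes from FaceOSC; the closed-eye threshold is a guess
    boolean update(float eyeHeight) {
        if (eyeHeight < 2.0f) {              // eyes read as closed
            wildness += 0.05f;               // tiny margins, as described
            particleCount += 2;
            fireHeight += 1.5f;
        }
        return fireHeight > MAX_HEIGHT;      // true once the fire burns out
    }
}
```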

The video shows the program being less responsive than usual, as FaceOSC struggled to differentiate between my eyes and eyebrows in the bad lighting. Also, my eyes are already tiny enough as is. I’ll try to make a better recording.

Adam-Assignment-05-VisualClock

I set out to design a generative clock that gives you a subtle sense of time as it passes.
My clock draws itself over a 5-minute period using noise. At the end of each 5 minutes, a new clock face is started.
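The 5-minute cycle boils down to two numbers: which block you are in and how far through it you are. A small plain-Java sketch of that bookkeeping (class and method names my own):

```java
// Map a millisecond timestamp onto successive 5-minute clock faces.
class ClockFace {
    static final long BLOCK_MS = 5 * 60 * 1000;  // one face = 5 minutes

    // index of the current face since t = 0
    static long blockIndex(long millis) {
        return millis / BLOCK_MS;
    }

    // fraction of the current face already drawn, in [0, 1)
    static float progress(long millis) {
        return (millis % BLOCK_MS) / (float) BLOCK_MS;
    }
}
```

blockIndex is exactly what a "wall of past blocks" would need to position each completed face.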

I think that visually the clock works well; the clock faces are definitely aesthetically pleasing.
Things to work on –
Indicating how many 5-minute blocks have passed. I had the idea that as a 5-minute block is completed, it would be added to a wall of all of the past 5-minute blocks.
As Golan spoke about – experimenting with how the clock face “ends” would have a huge impact. Instead of it just fading, perhaps the lines unravel off-screen. I love this idea!

Generative Clock




float x, y, oldX, oldY;
float xNoise, yNoise;
float xOffSet, yOffSet;


void setup( )
{
  size( 600, 600 );  // size() must come first, before any drawing calls
  smooth();
  background(255);
  stroke( 0, 16 );
  strokeWeight( 1 );

  xNoise = random( width );
  yNoise = random( height );
  xOffSet = 0.001;
  yOffSet = 0.001;
}

void draw( )
{
  drawNoisePoint( );
  int s = second();  // Values from 0 - 59

  println(s);

  for (int i = 20; i < s-20; i = i+5) {
    for (int j = 20; j < 600-20; j = j+5) {
      //      point(i, j);
      x = noise( xNoise )  * width ;
      y = noise( yNoise )  * height;
      point( i+x, y );
      xNoise = xNoise + xOffSet;
      yNoise = yNoise + yOffSet;
    }
  }
}
// stub left over from an earlier iteration; currently draws nothing
void drawNoisePoint( )
{
}

float y = 1; 

void setup( )
{
  smooth(); 
  size( 750, 750 );
  background(255);

  noFill(); 
  stroke( 0, 16);
  strokeWeight( 1 );
  frameRate(10);
}



void draw( )
{

  int s = second();  // Values from 0 - 59

  float x = s % 12; 
  println(x); 
  point ( width/12 * x, height/12*y );

  // guard against a zero step (when s is a multiple of 12), which
  // would otherwise make this loop spin forever
  int step = max(1, int(x));
  for (int i = width/12; i < width - width/12; i = i + width/12*step) {
    for (int j = height/12; j < height - height/12; j = j + height/12) {
      //      ellipse(i, j, 55, 55);
      line(i + random(-25, 25), j + random(-25, 25),
           i + random(-25, 25), j + random(-25, 25));
    }
  }

}

reasonably distanced ghost

For this, I had a tiny ghost character that would move away from you as you got closer and move forwards again as you came near, maintaining a pseudo-constant distance away from you. It blinks whenever you blink, relies on your scale and position on the screen to move itself around in a pseudo-3D space, and is relatively happy and cutesy.
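The distance-keeping could be modeled as easing the ghost’s scale toward a target derived from the viewer’s FaceOSC pose scale. A rough plain-Java sketch — the inverse relation and constants are my own invention, not what the sketch actually uses:

```java
// Hypothetical: the ghost eases away as the viewer's face gets bigger
// (closer), keeping an apparently constant distance.
class GhostDistance {
    float ghostScale = 1.0f;

    // faceScale: FaceOSC pose scale, larger = viewer is closer
    float update(float faceScale) {
        float target = 4.0f / faceScale;             // invented inverse relation
        ghostScale += (target - ghostScale) * 0.2f;  // ease toward the target
        return ghostScale;
    }
}
```

The easing factor is what makes the ghost drift back rather than snap, which reads as the ghost "moving away" in pseudo-3D.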

This is a complete failure! Originally I really, really wanted a multi-eyed monster blob that blinked based on when the viewer did, and delayed the individual eye blinks so you got a visually appealing effect.

look at his beautiful, many eyes

Totally couldn’t get any sort of delay to work (I wrote a nifty little eye class in one of my previous sketches though, which was kind of exciting for me as I’ve tended to shy away from writing any sort of classes).

I have a working understanding of classes and OSC now (as well as a continued appreciation for push/popMatrix), but I’m disappointed by the final sketch. I think it’s still pretty cute, but not really all that compelling or interesting. I’d really like to come back to this at some point and create something more like my original goal.