Marlena

16 Jan 2013

http://everyware.kr/portfolio/contents/09_oasis/

https://vimeo.com/1631013#


Oasis II is an interactive piece in which users shape tide pools out of sand, and generated fish and other creatures move within them and react to outside stimuli. It struck me as particularly beautiful in its vivid yet simple simulation of life.

Some projects attempt to simulate life by attending to every detail, trying to get as close to the original as possible. Most of these attempts are exercises in futility, because the closer a thing gets to life without reaching it, the more uncanny it appears. In Oasis II, one look at the screen makes it immediately clear that the residents of the tiny tide pools are neither alive nor close enough to life to appear uncanny; instead, as the documentation states, the “creatures look like vivid pencil sketches on a canvas.” Paradoxically, this design choice makes the creatures seem more real: they read as drawings leaping from the page, which creates an environment of suspended disbelief rather than an unsuccessful copy of a live animal. The interaction is simple and intuitive because it mimics the way our natural world works, and the result is an elegant overall experience that, while clearly based in technology, feels natural and almost grown rather than built.

https://www.youtube.com/watch?v=-YNeyoBCz9Q

https://vimeo.com/20967084

The fly blimps are a small cluster of blimps that maneuver themselves around a room based on the input of a swarm of flies contained at the base of each balloon. The movement of each blimp is computed from the motion of the flies as they buzz around their habitat.
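The documentation doesn't spell out how the flies' motion becomes steering, but here is a minimal sketch of one plausible mapping, assuming each blimp simply follows the net drift of the swarm in its enclosure (the sensor reading, gain, and command format below are hypothetical stand-ins, not the artists' actual method):

import random

def fly_positions(n=20):
    # Hypothetical stand-in for a camera/sensor reading of fly positions
    # inside one enclosure, as (x, y) points in a unit square.
    return [(random.random(), random.random()) for _ in range(n)]

def swarm_drift(prev, curr):
    # Average displacement of the swarm between two consecutive readings.
    dx = sum(c[0] - p[0] for p, c in zip(prev, curr)) / len(curr)
    dy = sum(c[1] - p[1] for p, c in zip(prev, curr)) / len(curr)
    return dx, dy

def blimp_command(drift, gain=5.0):
    # Turn the swarm's net drift into a simple forward/turn command.
    dx, dy = drift
    return {"forward": gain * dy, "turn": gain * dx}

prev = fly_positions()
curr = fly_positions()
print(blimp_command(swarm_drift(prev, curr)))

In other words, the flies never "know" they are steering anything; their aggregate restlessness is simply read off as a control signal.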


I was surprised that this piece took its interactive input from an animal other than a human; not only that, but the animals themselves were part of the piece. The piece itself does not hold much meaning for me beyond the materials used in its construction; the fact that it incorporates living animals is what interests me. I would have been even more interested to see what would happen if the input from the flies had an effect outside of their collection of blimps: what if the flies had unknowing control over something humans depend on, like the lights or a door opening and closing? What other projects could take the instinctual actions of different animals and create a responsive, interactive piece of work? It definitely gave me some food for thought.

http://www.visualcomplexity.com/vc/project_details.cfm?id=297&index=10&domain=Pattern%20Recognition

This data visualization is a cool concept: the creator took Galvanic Skin Response (GSR) readings of his basic emotional state as he wandered around the Greenwich Peninsula in London, England, and turned them into this visualization.
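At its core, the piece pairs a physiological signal with a position over the course of a walk. A rough sketch of that idea, assuming simple (latitude, longitude, GSR) samples and an arbitrary threshold for what counts as arousal (the numbers below are invented for illustration):

# Hypothetical (time, latitude, longitude, GSR in microsiemens) samples,
# standing in for a walk around the Greenwich Peninsula.
samples = [
    (0, 51.500, 0.003, 2.1),
    (1, 51.501, 0.004, 2.3),
    (2, 51.502, 0.005, 4.8),  # a spike: something arousing happened here
    (3, 51.503, 0.006, 2.2),
]

def arousal_peaks(samples, threshold=1.5):
    # Flag locations where GSR jumps well above the walk's average.
    mean = sum(s[3] for s in samples) / len(samples)
    return [(lat, lon, gsr) for _, lat, lon, gsr in samples if gsr > threshold * mean]

for lat, lon, gsr in arousal_peaks(samples):
    print("High arousal near (%.3f, %.3f): %.1f uS" % (lat, lon, gsr))

The raw trace only tells you that arousal spiked somewhere, which is exactly the ambiguity discussed below: the spike alone can't distinguish a breathtaking sunset from a near miss with a car.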

While the idea for the project is pretty cool, and the concept of a map of human response is excellent in that it adds an element of personality that simple maps lack, there are some inherent flaws. As the artist states, the data in this particular map is partially flawed because of the way he gathered it. It also might benefit from more specific information: emotional arousal could mean anything from a breathtaking sunset to just missing being hit by a car. Including unobtrusive descriptions of the events in question, connecting the visualization to a map, or comparing two people's shared experiences could be fun additions. The artist also talked about constructing such maps for an entire community; I would be interested to see how that possibility could be explored.