Typorganism (www.typorganism.com) is a robust project containing a series of computational and interactive typography explorations.
The site design is lovely, and I find many of the projects to be interesting augmentation approaches to type (check out “Motion Sickness”). Some of the projects, such as “Visual Composer,” are quite complex and interesting as standalone pieces. My biggest problem is that the site doesn’t work well as a unified project: the individual projects don’t have much in common with each other beyond being about kinetic or interactive typography, so I don’t quite see why they’re grouped this way. Otherwise, most of them are nice inspiration for a variety of type-based interactions.
Wax is commercially available in bricks, grains, or columns (candles), so we have to design around that.
In terms of what participants get out of the interaction, if we use wax, we could:
1. pile wax up on a piece of baking paper and, ta-da, have a sculpture to take home
2. somehow add wick to it and make a candle?
3. print wax on a piece of cloth and dye it to make a tablecloth?
4. use the wax model to make some more permanent piece?
For our capstone project we are working with stepper motors to create a 3-Axis print head. There are a lot of things that can be done with such a setup, and in fact even with 2-Axis movement. Below is a Robotic Tattoo Machine named ‘Freddy’ created in 2002 by Niki Passath.
The tattoo designs are generative, and it is apparently hard to do user testing…
“It was a hard job because the only person I could test it on was myself which was painful but a good incentive to get it right as soon as possible.
“He’s an artist of course so he always decides what design the person is going to get, they can’t choose. But I haven’t had any complaints yet.”
There is something interesting about the permanency of the tattoo form. So many computer generated graphics are disposable. When randomization is involved we cycle through them in search of the ‘perfect’ iteration, but this project commits one design in an irreversible way.
Also, though it is perhaps ambitious: school-of-fish (flocking) behavior.
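School-of-fish behavior is classically modeled with Craig Reynolds’ “boids” rules: separation, alignment, and cohesion, each computed over an agent’s nearby neighbors. Here is a minimal sketch of one update step in Python; the weights, radius, and time step are invented illustration values, not anything from a particular project.

```python
# Minimal boids step: each agent steers by separation (avoid crowding),
# alignment (match neighbors' headings), and cohesion (move toward the
# local center). All weights and radii are arbitrary illustration values.

def boids_step(positions, velocities, radius=2.0, dt=0.1,
               w_sep=1.5, w_ali=1.0, w_coh=1.0):
    new_vel = []
    for i, (px, py) in enumerate(positions):
        sep = [0.0, 0.0]; ali = [0.0, 0.0]; coh = [0.0, 0.0]; n = 0
        for j, (qx, qy) in enumerate(positions):
            if i == j:
                continue
            dx, dy = qx - px, qy - py
            if dx * dx + dy * dy < radius * radius:
                n += 1
                sep[0] -= dx; sep[1] -= dy          # steer away from neighbor
                ali[0] += velocities[j][0]           # accumulate neighbor velocity
                ali[1] += velocities[j][1]
                coh[0] += dx; coh[1] += dy           # steer toward local center
        vx, vy = velocities[i]
        if n:
            vx += dt * (w_sep * sep[0] + w_ali * ali[0] / n + w_coh * coh[0] / n)
            vy += dt * (w_sep * sep[1] + w_ali * ali[1] / n + w_coh * coh[1] / n)
        new_vel.append((vx, vy))
    new_pos = [(p[0] + v[0] * dt, p[1] + v[1] * dt)
               for p, v in zip(positions, new_vel)]
    return new_pos, new_vel
```

Running this in a loop over a few dozen agents is enough to produce recognizable schooling on screen; doing it across physical objects is where the communication question below comes in.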
CEATEC: Nissan robots avoid traffic accidents and congestion
Wavefunction by Rafael Lozano-Hemmer: an array of chairs that move up and down in response to the presence of the public, creating waves that propagate across the exhibition room.
———————————————————————————————————
To control multiple objects I am thinking of using an Arduino with shift registers. But how do the objects communicate with each other (ideally wirelessly)? XBee is one option.
(Overall, the technical difficulty seems to be the main issue for me).
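For reference, the shift-register idea works because a 74HC595-style register lets three Arduino pins (data, clock, latch) drive eight outputs, and daisy-chaining registers extends that to 8×N; on the Arduino side this is `shiftOut()` plus a latch toggle. A software simulation of the bit-shifting in Python (the index order here is a simulation convention, not the chip’s actual pin naming):

```python
# Simulate daisy-chained 8-bit shift registers (e.g. 74HC595).
# Each clock pulse pushes a new bit in at stage 0 and moves every other
# bit one stage down the chain; toggling the latch copies the internal
# stages to the output pins all at once.

class ShiftChain:
    def __init__(self, n_registers):
        self.bits = [0] * (8 * n_registers)  # internal shift stages
        self.outputs = list(self.bits)       # latched output pins

    def shift_bit(self, bit):
        # one clock pulse: new bit enters at index 0, the rest shift down
        self.bits = [bit & 1] + self.bits[:-1]

    def shift_byte(self, value, msb_first=True):
        order = range(7, -1, -1) if msb_first else range(8)
        for i in order:
            self.shift_bit((value >> i) & 1)

    def latch(self):
        # latch pulse: internal stages become the visible outputs
        self.outputs = list(self.bits)

# Drive 16 outputs from "3 pins": shift two bytes, then latch once.
chain = ShiftChain(n_registers=2)
chain.shift_byte(0b10100000)  # pushed through into the far register
chain.shift_byte(0b00000001)  # ends up in the near register
chain.latch()
```

The latch step is what keeps the outputs from flickering while bits are still in transit, which matters if the outputs drive motors or lights.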
LastHistory visualizes your last.fm history. It displays every song you’ve ever scrobbled, and the end result looks like “a decoded genome, and has probably just as much metadata powering it.” If you click on a song and last.fm has the song data, it’ll play; if you select a time period, it’ll play your top tracks from that period. You can also filter by year, time of day, day of the week, month, and pretty much any piece of metadata imaginable. It will also pull in your iPhoto events and iCal events and put them on the timeline, so you can listen to the songs you might have been listening to during those events and, in the case of an iPhoto event, look at the photos at the same time.
Mac users can go to the project’s GitHub page, download the binary, and try it out themselves. I highly recommend it. This is a real monster of a visualization project, so I thought I’d share. It was done by Frederik Seiffert as a thesis project at the University of Munich.
In January 2010 the Cologne-based design agencies Grosse8 and Lichtfront presented their cross-media installation titled Augmented Sculpture. The core of the installation is a 2.5m-tall wooden form that serves as the screen for a 360° projection.
In constant transformation over a score by Jon Hopkins, the 2:32 minute performance is described by Svenja Kubler of Lichtfront as a “mirror of changing realities… a kind of real virtuality arises to confront virtual reality.” I’m not sure what that all means but I really like it.
N Building is a commercial structure located near Tachikawa Station amidst a shopping district. Because it is a commercial building, signs or billboards are typically attached to its facade, which we feel undermines the structure’s identity. As a solution we thought to use a QR code as the facade itself. By reading the QR code with your mobile device you are taken to a site with up-to-date shop information. In this manner we envision a cityscape unhindered by ubiquitous signage, as well as an improvement in the quality and accuracy of the information itself.
Hereafter is a piece by UVA, a group whose physical computing work is really good and has super-high production value. It is one of several of their pieces based on “mirror” ideas that involve the theme of interacting with video history. As with all their work, the high quality of the finished product and the thought given to context and aesthetics lend a magical nature to the technology that is more bang and less whiz.
Sorry, I didn’t see that this Looking Outwards was due until I looked at the schedule this morning.
Here’s a piece that has roughly the same premise as what I wanted to do for my second project: using human input to create something visual.
This piece is by Camille Utterback and it’s called Aurora Organ:
The above link contains a video on the piece, but here’s a picture:
The piece is set up in an atrium, and the human input is how people grasp the railing of a staircase in the atrium. Depending on how they grasp it, columns of light hanging from the ceiling change color.
I also like how the piece fits into its environment. It plays with the simple gesture of grasping a railing, something a person would normally not pay attention to.
I thought this implementation of augmented reality was interesting since it doesn’t use fiducial markers or any other standard techniques for defining the surface on which to render. Instead, the makers have developed a way to read the architecture of the environment so that augmentations can be projected onto any surface within the frame.
Also, I thought the AR game of “Mario” on the street was really interesting; it reminded me of an article I read a few days ago about how “social games” may soon integrate with every aspect of our lives. However, many traditional gamers and developers feel left out of this movement. Perhaps AR games like “Mario” can be used to bridge this gap, with a social game mechanic in which you improve your score through interactions with your environment, augmented by AR into an immersive, extraordinary experience.
Acoustic Botany seeks to come up with ideas for creating plants that perform music. Starting from the idea that in the future many organisms will be genetically engineered to suit our needs, the designers draw inspiration from existing plants to design music-playing plants that could one day be engineered. Some of their plants would create music through symbiotic interaction with bugs and bacteria residing within their resonating membranes. Their ideal result would be a “genetically engineered sound garden.”
I think this idea is cool and the idea of plants that emulate loudspeakers is something to look forward to. Unfortunately, they are currently only in the design phase of the project.
After I saw Julian Bleecker speak at Art&&Code last fall I started subscribing to his design collective’s blog. Last December he had a post about augmented reality that made me think more critically about what augmented reality is good for. It’s good to remember that AR hasn’t left the buzzword stage and most projects are more gee-whiz than practical.
Flow of Qi utilizes UWB technology to give installation visitors a direct impression of the ancient Chinese philosophy of Qi. This work lets participating observers orient their breathing on the spirit of famous works of calligraphy from the National Palace Museum in Taiwan and thereby establish personal contact with timeless cultural treasures.
I have seen this installation once and I think it is a delicate artwork.
Sitting on the chair and relaxing, you watch the words appear according to your breath.
The speed of the writing relates to the user’s breathing rate, and the density of the ink depends on the breathing depth.
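That kind of mapping (breathing rate to stroke speed, breathing depth to ink density) could be sketched as a pair of clamped linear interpolations. All of the ranges below are invented for illustration; they are not the installation’s actual values.

```python
# Map a breathing signal to calligraphy-brush parameters.
# All numeric ranges are invented illustration values, not the
# installation's real calibration.

def lerp(x, x0, x1, y0, y1):
    """Linearly map x from [x0, x1] to [y0, y1], clamped at the ends."""
    t = max(0.0, min(1.0, (x - x0) / (x1 - x0)))
    return y0 + t * (y1 - y0)

def brush_params(breaths_per_min, depth):
    """depth is a normalized 0..1 breathing amplitude."""
    speed = lerp(breaths_per_min, 6, 20, 0.2, 1.0)  # calmer breath, slower stroke
    ink = lerp(depth, 0.0, 1.0, 0.1, 1.0)           # deeper breath, denser ink
    return speed, ink

speed, ink = brush_params(breaths_per_min=8, depth=0.7)
```

Clamping matters here: a sensor glitch reporting an impossible breathing rate should saturate the stroke speed rather than send the brush flying.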
The interaction in this artwork beautifully matches the spirit of Qi.
The interaction demo starts at 03:45 in the video.
While browsing through a variety of augmented reality projects, I found myself most attracted to the projects that did not rely on specially designed codes that the computer can recognize, but rather to the ones that could use everyday objects as triggers for animation and effects.
Apologies if everyone has seen this project already; it was a collaboration among magician Marco Tempest, Zach Lieberman, and Theo Watson, implemented in openFrameworks. It’s a nice example of using playing cards as an augmented reality trigger. The video is a bit long but is entertaining and really well done.
Here’s another project, this one dealing with pool. At around the 2:13 mark the video gets into the AR/projection part of the demonstration. It seems like a good use of the technology: it looks like they can calculate the trajectory of a pool ball quite accurately. I also like the pool-ball-positioning robotic arm you see earlier in the video.
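The geometric core of a trajectory prediction like this is, in the idealized frictionless and spinless case, just specular reflection: when the ball reaches a cushion, the velocity component perpendicular to that cushion flips sign. A minimal sketch (table dimensions, starting position, and velocity are made up; the real system surely also models spin and friction):

```python
# Trace an idealized (frictionless, spinless, point-sized) ball on a
# rectangular table, reflecting off cushions: at each cushion hit the
# perpendicular velocity component flips sign.

def trace_path(x, y, vx, vy, width, height, bounces):
    points = [(x, y)]
    for _ in range(bounces):
        # time until each cushion is reached along the current direction
        tx = ((width - x) / vx if vx > 0 else
              (0.0 - x) / vx if vx < 0 else float("inf"))
        ty = ((height - y) / vy if vy > 0 else
              (0.0 - y) / vy if vy < 0 else float("inf"))
        t = min(tx, ty)
        x, y = x + vx * t, y + vy * t   # advance to the cushion
        if tx <= ty:
            vx = -vx                    # hit a left/right cushion
        if ty <= tx:
            vy = -vy                    # hit a top/bottom cushion (corner flips both)
        points.append((x, y))
    return points

# Hypothetical shot on a 2m x 1m table, traced through three cushion hits.
path = trace_path(x=0.5, y=0.5, vx=1.0, vy=0.5, width=2.0, height=1.0, bounces=3)
```

Projecting the resulting polyline back onto the table surface is then just a calibration/homography problem for the projector.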
This is an art installation that took place in New Zealand, I believe. It is beautifully made public interactive art, and what was quite impressive was the people’s excitement, which we can observe here. I think the beauty of this piece is that the public gets involved in creating the art, dancing and having fun together, and showing it to others in the open space. Here are some more pictures. I like it a lot.
While brainstorming with Cheng and Golan about the final project, several ideas came up for using different types of input for fabricating physical objects. Here are a couple of interesting projects, which I found out about from Cheng and Golan, that use silhouettes as input and/or output.
Kumi Yamashita, Dialogue, 1999
Nadeem Haidary, Hand-Made
I really like the idea of using body shape and body movement as sculptural input for digital fabrication. I have explored this general idea with another project called ‘Spatial Sketch’.
In a dance performance, sets or music that interact with the dancers can add another dimension to the piece. Here are some examples:
1- Mortal Engine
“Mortal Engine” is a full-length dance piece featuring six dancers on a steep stage, mostly kept in darkness by a computer-interactive projection system. In this production by German software artist Frieder Weiss and Australian choreographer Gideon Obarzanek, projections react to the dancers’ moving bodies, graphically illuminating and extending them.
Pop-star Calvin Harris performed his new single on a “Humanthesizer,” a group of dancers painted with body-safe conductive ink used to trigger sounds. Students at the Royal College of Art developed the material, called Bare Conductive.
This piece is not technically challenging; rather, its concept is fun and touches on an issue of concern. Google Maps Street View has been immensely useful for many people, but it also bothers many because of its privacy implications. So a group of artists “staged collective performances and actions that took place just as the Google Car was driving through the neighbourhood: a 17th-century sword fight, a lady escaping through the window using bed linen, a gigantic chicken, a parade with a brass band and majorettes, the lab of the inventor of a laser that makes people fall in love, etc. The images that document the events have become an integral part of the Google image archive.”
Now, these images are stored in Google’s databases permanently and serve as documentation of what exists at any specific point in the world. “With their series of collective performances and actions, Kinsley and Hewlett create an analogy between their carefully planned and coordinated artistic events and the equally fictitious reality presented by Google.”
Oh, and this project came from the CMU CFA. Coincidence.
The artists, Ben Kinsley and Robin Hewlett, staged the project on Sampsonia Way in Pittsburgh with the complicity of both its inhabitants and Google Street View.