The following three artworks are results from a Google video search for “interactive art”; I removed one result that a classmate had already posted. I looked through the first ten pages of results, roughly 100 videos, and found that most interactive artworks can be categorized by their mode of interaction into two kinds: physical (body, face, arms, legs, feet, distance from the artwork, touch, etc.) and audio. Depending on the interaction, the space a work needs ranges from a very limited area to a large one.
Interactive art with wooden mirrors (1/4) by Daniel Rozin
Wooden Mirror – 1999
830 square pieces of wood, 830 servo motors, control electronics, video camera, computer, wood frame.
Size – W 67” x H 80” x D 10” (170 cm, 203 cm, 25 cm).
Built in 1999, this is the first mechanical mirror I built. This piece explores the line between digital and physical, using a warm and natural material such as wood to portray the abstract notion of digital pixels. You can find more mirrors here:
http://www.smoothware.com/danny/
ACCESS – an interactive art installation by Marie Sester
ACCESS is an interactive installation that lets web users track anonymous individuals in public places by pursuing them with a robotic spotlight and acoustic beam system.
Starfield
https://vimeo.com/36892768
Starfield is a project by Lab 212 that uses the to-and-fro motion of a swing to animate a projected sky on the wall. The application is built in openFrameworks and uses the Kinect to detect the depth of the swing in the space, animating the stars in the projected sky accordingly; OpenGL is used to render the sky onto the wall. The installation is interesting in how the artist uses the Kinect for motion and depth detection of an object, the swing, rather than a human body, unlike most other Kinect projects.
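Lab 212 hasn't published their source as far as I know, so here is just a minimal openFrameworks-style sketch of the idea, assuming the ofxKinect addon: sample the depth under the point the swing passes through and use it to push a field of stars toward or away from the viewer. All class names, thresholds, and coordinates are my own guesses, not their code.

```cpp
// Hypothetical sketch: swing depth drives a projected star field.
// Assumes openFrameworks with the ofxKinect addon; not Lab 212's actual code.
#include "ofMain.h"
#include "ofxKinect.h"

class ofApp : public ofBaseApp {
public:
    ofxKinect kinect;
    std::vector<ofPoint> stars;
    float travel = 0;          // how far "into" the star field we have flown

    void setup() {
        kinect.init();
        kinect.open();
        for (int i = 0; i < 2000; i++) {
            stars.push_back(ofPoint(ofRandom(-1000, 1000),
                                    ofRandom(-1000, 1000),
                                    ofRandom(-2000, 0)));
        }
    }

    void update() {
        kinect.update();
        if (kinect.isFrameNew()) {
            // Sample the depth (mm) at the point the swing passes through.
            float mm = kinect.getDistanceAt(320, 240);
            if (mm > 0) {
                // Nearer swing -> fly forward faster; farther -> drift back.
                travel += ofMap(mm, 500, 3000, 30, -10, true);
            }
        }
    }

    void draw() {
        ofBackground(0);
        ofSetColor(255);
        ofPushMatrix();
        ofTranslate(ofGetWidth() / 2, ofGetHeight() / 2, travel);
        for (auto& s : stars) ofDrawSphere(s.x, s.y, s.z, 2);
        ofPopMatrix();
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_FULLSCREEN);
    ofRunApp(new ofApp());
}
```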
Sound Machines
https://vimeo.com/35014340
Sound Machines is an interactive instrument that resembles a record player, with a set of three record-like discs used to produce music. The music is controlled by the placement of the head on each disc. The discs are art objects with custom black-and-white patterns; the head reads the color beneath it and plays music accordingly. It is interesting how a barcode-scanner-like concept was transferred into a DJ console.
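The artists haven't released their code, but the core trick (a sensor head reading a black/white pattern and turning edges into triggers) is easy to imagine as an Arduino sketch. This is only a guess at how such a head could work; the pins, threshold, and serial protocol are made up.

```cpp
// Hypothetical Arduino sketch for a Sound-Machines-style disc reader.
// A photoresistor under the tonearm head reads the black/white pattern;
// each white-to-black transition sends a trigger over serial.
// Pin numbers and thresholds are assumptions, not the artists' design.
const int SENSOR_PIN = A0;
const int THRESHOLD  = 512;   // above = white, below = black (tune per sensor)

bool wasBlack = false;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int light = analogRead(SENSOR_PIN);   // 0..1023
  bool isBlack = light < THRESHOLD;

  // Trigger on the white -> black edge, like a barcode scanner.
  if (isBlack && !wasBlack) {
    Serial.println("TRIG");             // host software maps this to a sound
  }
  wasBlack = isBlack;
  delay(2);                             // roughly 500 samples per second
}
```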
Kaushal Agrawal – Looking Outwards 4
This is a pretty cool project done in another interactive arts class in 2007. What I like about this project is its element of surprise. When the student was about to pull the trigger on his head in front of the projector, I didn’t know what to expect. Then when he pulled it and the splash of color “sprayed” out of his head, I laughed, just like the crowd in the classroom. When an artifact triggers a spontaneous reaction from people like that, I think it hits the spot and does exactly the right thing, as simple as it is (the simpler the better!).
SXSW 2012 Interactive Movie winner!
This “movie” is one of the SXSW 2012 winners. It’s pretty weird and is controlled interactively with the mouse. What I like about this movie is that you are given no instructions or feedforward whatsoever about what you are supposed to do. But once you start clicking, you interact with the shapes (and later a funny figure) and get a lot of different feedback, both audio and visual, that guides you through this “movie”. I could not make any sense of what exactly is going on there, but it is pretty cool! Too bad it uses Flash.
Another winner of the SXSW 2012 awards. I really liked this one: users send text messages, and these are translated into a fictional creature according to predetermined rules, which is then displayed on the screen. What I like about this piece is the simplicity of interaction: everyone can do it with their phone. In addition, I like the artist’s framing of the piece as a metaphor for words in general: once they are out of your mouth, they are independent in a way and act on their own.
This sculpture has its lighting pattern and musical composition determined by a visitor’s position underneath its umbrella-like net of fabric. The person’s movements alter the electroluminescent fibers. I like the way this piece is integrated into the environment of its visitors. The tree-like form and the resulting canopy of light are unnatural, and yet the structure is somewhat organically incorporated into the environment. The interaction is fairly basic, but at the same time very natural, creating an experience that is delightful and fun, and yet nondisruptive to the world that exists around the sculpture.
This piece projects a Snake game board onto walls, with obstacles generated from real-life ones; for example, windows on the wall get bounding boxes created around them. Originally I had been looking for pieces with strictly user interaction, but this made me shift my focus and start thinking about how the environment’s interaction affects a piece. By using both player-controlled snakes and environmentally generated game boards, Snake the Planet! shows how two different dimensions of interaction come together.
In Starfield, the visitor sits on a swing to view the piece. As they swing back and forth, the projected star field before them moves at a pace determined by their swinging speed, with forward/backward movement that mimics their own. With 3D glasses, it is as if they are immersed in the field of stars. I liked the idea of adding a swing to the visitor environment to set the stage for the interaction. In my mind the piece could go almost anywhere, but that one addition is what determines whether the piece can exist and be experienced as intended (since the projection itself can be done almost anywhere).
This week I looked at several projects involving physical motion and sound manipulation. I’m interested in how depth and motion sensors can be used to make listening a more active experience. Many have used this input to control digital instruments, but few have explored more passive methods of sound control.
Kinectar is an OSC-based system for handing Kinect skeletal data to Ableton Live. Chris Vik’s demo uses two hands to control 8 parameters of a granular synthesis system, including grain size, loop start, pitch, filter, LFO cutoff/depth, reverb, and panning. This project rides the border between synthesis and manipulation. The source material is a preloaded sample, but its musical qualities emerge from the artist’s motion alone. I would like to see these same controls in a more inherently musical context.
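Kinectar itself is a full application, but the underlying pattern (take a joint position, normalize it, pass it on as a control value) can be sketched. Below is a minimal openFrameworks-style relay, assuming skeleton data already arrives over OSC from a tracker such as Synapse or OSCeleton; the addresses, ports, and parameter mapping are my assumptions, not Kinectar's.

```cpp
// Hypothetical relay: take a tracked hand position arriving over OSC and
// forward it as a normalized control value (e.g. for a granular synth).
// Not Kinectar's actual code; addresses and ports are assumptions.
#include "ofMain.h"
#include "ofxOsc.h"

class ofApp : public ofBaseApp {
public:
    ofxOscReceiver in;
    ofxOscSender out;

    void setup() {
        in.setup(12345);                 // skeleton tracker sends here
        out.setup("localhost", 9000);    // synth / Ableton bridge listens here
    }

    void update() {
        while (in.hasWaitingMessages()) {
            ofxOscMessage m;
            in.getNextMessage(m);
            if (m.getAddress() == "/righthand") {
                // Assume args are x, y, z in meters; map hand height to 0..1.
                float y = m.getArgAsFloat(1);
                float grainSize = ofMap(y, 0.0f, 2.0f, 0.0f, 1.0f, true);

                ofxOscMessage ctrl;
                ctrl.setAddress("/synth/grainsize");
                ctrl.addFloatArg(grainSize);
                out.sendMessage(ctrl, false);
            }
        }
    }
};

int main() {
    ofSetupOpenGL(400, 300, OF_WINDOW);
    ofRunApp(new ofApp());
}
```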
ScratchML is a file format for recording and replaying turntablism routines. It comes with an open-source tool for capturing crossfader and record movements and relaying this data over OSC. This dataflow is extremely valuable, as it capitalizes on the strong link between a DJ’s physical motion and the sonic results. TTM transcriptions could prove valuable for developing analogous sound-manipulation systems on the Kinect. How can we recreate this input in three dimensions, rather than just one?
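I don't know ScratchML's actual schema, so the following is only a sketch of the capture side: an openFrameworks-style OSC listener that logs timestamped crossfader positions to a plain text file for later replay. The /crossfader address and the log format are placeholders, not the real ScratchML format.

```cpp
// Hypothetical capture tool: log timestamped crossfader positions sent over
// OSC so a routine can be replayed later. The /crossfader address and the
// plain-text log format are placeholders, not the real ScratchML schema.
#include "ofMain.h"
#include "ofxOsc.h"

class ofApp : public ofBaseApp {
public:
    ofxOscReceiver receiver;
    ofFile log;

    void setup() {
        receiver.setup(8000);                          // assumed port
        log.open("routine.txt", ofFile::WriteOnly);    // one event per line
    }

    void update() {
        while (receiver.hasWaitingMessages()) {
            ofxOscMessage m;
            receiver.getNextMessage(m);
            if (m.getAddress() == "/crossfader") {
                float pos = m.getArgAsFloat(0);        // 0 = left, 1 = right
                log << ofGetElapsedTimeMillis() << " " << pos << "\n";
            }
        }
    }

    void exit() {
        log.close();
    }
};

int main() {
    ofSetupOpenGL(400, 300, OF_WINDOW);
    ofRunApp(new ofApp());
}
```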
In this video Chris Vik uses simple Kinect data (position + speed) to control a complete Ableton orchestra. Each movement is tied to a single control, and a looping pedal is used to add some complexity. The sonic results are rich, but the performance is visually a disaster. I appreciate the artist’s enthusiasm and hard work, but his flamboyant gestures make it hard to take seriously. Future projects should use subtle and intuitive gestures to create more natural musical interfaces.
Alright, so this is pretty cool. This guy used AR and Papervision3D to compose very simple but cool audio mixes. Using AR placards, he was able to control the left and right headset audio fading, the volume, and even the number of overlaid beats. I think this is pretty epic, and using AR to control something non-visual is definitely something I’ll look into for this interaction assignment. At this point, I may not be able to program something this cool, but I’d definitely like to play with interactivity, something that isn’t always inherent to AR. Overall, FLARsound’s coolest quality is its ability to blend the user’s senses together, creating a successfully elegant artwork.
So this has got to be one of the cooler Kinect x AR simulations I’ve seen. The idea is to use an overhead-mounted Kinect to combine displayed images with actual objects, creating an awesome AR-esque platform for real-time involvement. The piece is actually able to have its projections interact with the real outline of the room, allowing the displayed car to “tumble” over ramps and be stopped by walls. I think this type of interaction piece is awesome because it allows the user, as the man in the video says, to react in real time with the simulation, whether that be a game or a virtual reality engine. Imagine the possibilities for interaction with current gaming consoles. I think this in and of itself is deserving of the world’s attention.
Alright, so I’ve been increasingly interested in working with AR for this project. I’m not quite sure where I want to end up with it, since AR is very new to me. But of course, I found this awesome video of a guy who actually used an AR marker for his business card. Yeah, that’s right, an AR simulation on the back of his business card. The AR simulation is essentially a pixelated (for the time) video of him explaining his work and what his best qualities are. I think this type of thing is fantastic because it adds another level of interactivity on top of a personal greeting. I think that’s a great touch and will definitely serve as a foundation for future introductions of the digital persuasion. I mean, wow, if anything, it shows his ability to actually do what he says he can do. Definitely a thumbs up if I were his future employer!
SankalpBhatnagar-LookingOutwards-5
I saw this presented at TEI recently, and though it sounded cool, I don’t really see the need for it. I agree that it is really great to combine new media technology with tangible objects like writing on a scroll, but inventing a whole new method seems a bit much, although some of the features, like marking up pictures, are very neat.
This is a really cool concept. I like the idea of virtually existing in a galaxy space while you are on a physical swing, and I think it is cool that they used a Kinect. However, I don’t really see why they used a Kinect instead of a regular motion sensor; it doesn’t seem to be doing anything more than zooming in and out of the star field.
These are really neat! I like the interaction with a virtual turntable to create a beat. It is like taking one of the DJ apps for an iPhone and making it into a single unit. This is cool, but it also makes you think about how simple it really is to build this with a touchscreen on a device that can do so many other things, although this version uses an actual turntable arm and light patterns to create the beat in a way that is different from coding a touchscreen.
This is another Looking Outward for Interactivity where I wanted to focus on Augmented Reality.
Street Art + AR
Sweza is a German street artist who uses QR codes to link to his artwork. I couldn’t embed the video, but here’s a link to it. QR codes seem slightly clichéd now, but I think AR and fiducial markers serve the same purpose. At one point in the video, Sweza talks about a project where a QR code on an image of a radio links to a song, which was pretty cool.
This video led me to another project by Sweza called GraffYard, which uses QR codes to preserve graffiti after it is removed. He places a QR code in the exact location, and it resolves to an image of the original piece.
Virtual Graffiti Apps
There are a bunch of apps that let you “virtually” place graffiti somewhere using AR techniques, like ARTags, ARStreet, Street Tag, AirPainter, Layar, and finally, in 3D, Scrawl.
So, I’m going to be posting a lot of interactive art I’ve found over the last few weeks. I would have uploaded this earlier, but I was really caught up in international travel issues.
…summary paragraphs coming soon….
#1: Concentricity, Interactive Light Sculpture by Joshua Kirsch:
So I really liked this project because it involves interactivity AND light. Based on the project video, the user is able to control the mechanism and the corresponding light simply by physically interacting with the device. By either pulling the inner doodad outwards or pushing it around the sides, the user sends the lights through various patterns. I like this concept because I find it subtly immersive, just enough to get the user to use it for some optimal length of time. Light moves through its surrounding space, and with this piece the user is capable of controlling a larger medium than they are most likely used to. A possible revision could have the machine “remember” what movements the user made and then repeat the light changes and physical moves, so that the user feels like they created an actual light presentation.
#2: Self-Portrait by rAndom and Incubator
Essentially, this is a subtly interactive art piece that continuously draws a snapshot of the audience looking at it onto a blank canvas. Every time it snaps an image, it erases and recolors the canvas to make the new image appear. The passive interactivity is something I can appreciate; I can imagine seeing this at an actual art show. At a certain point, though, the audience may feel a bit less involved, since the procedure of the art dictates a very limited scope of change. I feel this could be addressed by changing the color of the ink to correlate with how close or far the person in the audience is. That is a next step I could see really helping this piece in an exhibit.
So this is a software art app that lets the user interact with it to create a desired effect. You can change the movements of the on-screen particles based on gravity or direction, and you can make some very pretty patterns. I particularly like how the app is designed after the original Gravilux, which was released in 1998 and shown in galleries and museums around the world. The iPad itself is pretty great for several reasons, but the main one I find increasingly awesome is that the touch interactivity feels very natural. I just purchased an iPad, and while I don’t have the programming abilities to make a full-fledged app, I feel like if I could somehow involve my iPad, I’d be able to come up with some really creative solutions for my interactivity.
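The app's source isn't available to me, but the underlying idea (particles pulled toward the point you touch) is a classic attraction simulation. Here's a minimal openFrameworks-style sketch using the mouse in place of a touch point; the constants are tuned by eye and this is not the app's actual algorithm.

```cpp
// Hypothetical Gravilux-style toy: a grid of particles is attracted toward
// the cursor (standing in for a touch point). Constants are guesses.
#include "ofMain.h"
#include <algorithm>

struct Particle {
    ofVec2f pos, vel;
};

class ofApp : public ofBaseApp {
public:
    std::vector<Particle> particles;

    void setup() {
        ofBackground(0);
        for (int y = 0; y < ofGetHeight(); y += 8)
            for (int x = 0; x < ofGetWidth(); x += 8)
                particles.push_back({ ofVec2f(x, y), ofVec2f(0, 0) });
    }

    void update() {
        ofVec2f touch(ofGetMouseX(), ofGetMouseY());
        for (auto& p : particles) {
            ofVec2f d = touch - p.pos;
            float dist2 = std::max(d.lengthSquared(), 25.0f);     // avoid blow-ups near the cursor
            p.vel += d * (2000.0f / (dist2 * sqrtf(dist2)));      // inverse-square pull
            p.vel *= 0.99f;                                       // mild damping
            p.pos += p.vel;
        }
    }

    void draw() {
        ofSetColor(255);
        for (auto& p : particles) ofDrawRectangle(p.pos.x, p.pos.y, 1, 1);
    }
};

int main() {
    ofSetupOpenGL(1024, 768, OF_WINDOW);
    ofRunApp(new ofApp());
}
```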
Light Drive is a stop-motion video by Kim Pimmel. Each frame is a long-exposure photograph of lights controlled by an Arduino, which is remotely controlled via Bluetooth and drives a stepper motor. The light sources include cold cathode case lights, EL wire, lasers, etc.
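Pimmel hasn't posted his firmware as far as I know, so this is only a guess at the Arduino side: single-character commands arrive over a Bluetooth serial module and step a motor carrying the light source during the exposure. The pins, baud rate, and command set are assumptions.

```cpp
// Hypothetical Arduino firmware for a Light-Drive-style rig: a Bluetooth
// serial module sends single-character commands that step a motor carrying
// the light source during a long exposure. Wiring and commands are assumed.
#include <Stepper.h>

const int STEPS_PER_REV = 200;          // typical 1.8-degree stepper
Stepper motor(STEPS_PER_REV, 8, 9, 10, 11);

void setup() {
  Serial.begin(9600);                   // common default for HC-05/HC-06 modules
  motor.setSpeed(60);                   // RPM
}

void loop() {
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    if (cmd == 'f') motor.step(STEPS_PER_REV / 4);    // quarter turn forward
    if (cmd == 'b') motor.step(-STEPS_PER_REV / 4);   // quarter turn back
  }
}
```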
David Haylock’s Project (Dance)
This is still a work in progress by David Haylock, but it looks interesting. It uses a 10k Sanyo projector, a 10m by 7m cyc, Quartz Composer, Max/MSP/Jitter and the Kinect to augment a dance performance by projecting the dancers’ previous movements on the screen behind them. I thought the result was really pretty for something that seems like a fairly simple idea.
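Haylock's version is built with Quartz Composer and Max, but the core trick (showing the performer's earlier movements behind their current ones) boils down to buffering frames and drawing an old one. Here is a minimal openFrameworks-style sketch of that idea, with a webcam standing in for the Kinect feed; the resolution and delay length are arbitrary.

```cpp
// Hypothetical frame-delay sketch: keep a few seconds of camera frames in a
// buffer and project the oldest one, so a dancer's earlier movements appear
// behind their current ones. Resolution and delay length are assumptions.
#include "ofMain.h"
#include <deque>

class ofApp : public ofBaseApp {
public:
    ofVideoGrabber cam;
    std::deque<ofImage> history;
    const size_t DELAY_FRAMES = 90;     // about 3 seconds at 30 fps

    void setup() {
        cam.setup(640, 480);
    }

    void update() {
        cam.update();
        if (cam.isFrameNew()) {
            ofImage frame;
            frame.setFromPixels(cam.getPixels());
            history.push_back(frame);
            if (history.size() > DELAY_FRAMES) history.pop_front();
        }
    }

    void draw() {
        ofSetColor(255);
        // Delayed "ghost" behind, live image blended on top.
        if (history.size() >= DELAY_FRAMES) {
            history.front().draw(0, 0, ofGetWidth(), ofGetHeight());
        }
        ofSetColor(255, 128);
        cam.draw(0, 0, ofGetWidth(), ofGetHeight());
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new ofApp());
}
```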
Kinect Cat Sequencer
I’m always amazed at how little interactive pet software is out there. I’ll see a few projects here and there, but the Kinect Cat Sequencer seemed like an interesting idea. It uses the Kinect to trigger audio samples based on how full the cat’s plates are. Code & details are available here.
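The linked code is the real thing; as a rough sketch of the idea, the snippet below (openFrameworks with the ofxKinect addon assumed) averages the depth over a region covering one plate and plays a sample when the fill level crosses a threshold. The plate rectangle, threshold, and sample file are placeholders.

```cpp
// Hypothetical Kinect-sequencer sketch: average the depth over the region of
// one food plate and trigger a sample when its fill level crosses a threshold.
// The plate rectangle, threshold, and sample file are placeholders.
#include "ofMain.h"
#include "ofxKinect.h"

class ofApp : public ofBaseApp {
public:
    ofxKinect kinect;
    ofSoundPlayer sample;
    ofRectangle plate{200, 150, 80, 80};   // plate location in the depth image
    bool wasFull = false;

    void setup() {
        kinect.init();
        kinect.open();
        sample.load("meow.wav");
    }

    void update() {
        kinect.update();
        if (!kinect.isFrameNew()) return;

        // Average distance (mm) over the plate region.
        float sum = 0; int n = 0;
        for (int y = plate.y; y < plate.y + plate.height; y++) {
            for (int x = plate.x; x < plate.x + plate.width; x++) {
                float d = kinect.getDistanceAt(x, y);
                if (d > 0) { sum += d; n++; }
            }
        }
        if (n == 0) return;
        float avg = sum / n;

        // A fuller plate sits closer to an overhead Kinect.
        bool isFull = avg < 900;           // mm, tune for the actual setup
        if (isFull && !wasFull) sample.play();
        wasFull = isFull;
    }
};

int main() {
    ofSetupOpenGL(640, 480, OF_WINDOW);
    ofRunApp(new ofApp());
}
```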
Kinetic Art – Dynamic Structure
The video shows a set of 32 independently moving lines controlled by an integrated computer system, forming constantly changing ordered and random structures that appear and disappear.