I looked at House Party, an Arduino-controlled room that uses household objects (lamps, shoes, TVs, etc.) to create its own symphony.

I was surprised to see how much this project accomplished with a modest amount of basic circuitry. Elements like the shoe tapping (which seemed to control both rotation and "tapping" with the same motor) were impressive to me. They used Logic Pro (the same DAW I have been using for years now) and fed it MIDI data from JSON files to get the sounds to play. I would like to do something similar with MIDI files, and go as far as assigning all 128 note values (0-127) to some kind of response. For my automaton project I ended up doing something musical (Wilson, the drummer puppet), but I would like to go even further in combining music with physical computing.
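If I were to prototype that note-to-response mapping in the browser, a sketch might look like the following (the `responses` table and `handleMidiMessage` function are names I'm inventing for illustration; only the MIDI byte layout is standard):

```javascript
// Build one response per MIDI note number (0-127). Each entry here is a
// placeholder; in a real room these would trigger motors, lamps, etc.
const responses = new Array(128).fill(null).map(function (_, note) {
  return function () { return "response for note " + note; };
});

// A note-on message is three bytes: status (0x90-0x9F), note, velocity.
function handleMidiMessage(data) {
  const command = data[0] & 0xf0;   // strip the channel from the status byte
  const note = data[1];
  const velocity = data[2];
  if (command === 0x90 && velocity > 0) {   // note-on with nonzero velocity
    return responses[note]();
  }
  return null;   // ignore note-offs and other messages in this sketch
}
```

With the Web MIDI API this could be wired to a real input via `input.onmidimessage = function (e) { handleMidiMessage(e.data); }`.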


You ever feel like you really need to get some work done but the dang trackpad robots just won't give you no room?

Lately I've been feeling pretty burnt out from my workload so I decided to turn this AR project into a fun literal interpretation of my resistance towards doing any more work this semester.

The "site" is the laptop I use for schoolwork, on my desk, which is also where I do my schoolwork. The "viewer" is me, a college student hopelessly behind on his deliverables, and the "object" is a bunch of happy-go-lucky robots freeloading on my precious trackpad real estate (good thing I brought a mouse!).

I modeled and rigged the robots in Blender, and applied different looping mocap animations I found on Turbosquid.


Creating Believable Visuals (Intermediate)

I am glad I took the time to go through this series of tutorials, as it addressed a lot of mistakes I had been making when lighting, modelling, texturing and rendering my scenes in Unity. In general my takeaway is that less is more when it comes to rendering, and a low-poly, well-textured model can look a lot smoother and more polished than a high-poly mess of bad topology.

Going forward I am going to use sharp edges and vertices sparingly, as few objects in the real world are truly sharp, cube-cut shapes. I had also been underestimating how useful reflection probes are in scene lighting, and I had been marking every static scene object as lightmap static, when in fact lightmapping is really only noticeable on large objects.

I also got to learn all about the post-processing stack that Unity provides, and how I can customize it at a deep level. I will be making use of this feature on every project henceforth, particularly dithering, ambient occlusion and the tonemapper.

Lastly, I did not realize it was possible to export multiple sets of UV maps, each for a different purpose. For instance, UV maps for base albedo texturing should space out the faces so the most detail fits on them, while lightmap UVs should prioritize large faces to give the most area for the smoothest shading, and reflection UVs should size faces evenly to avoid weird clipping artefacts.


Punk Rock Album Generator
The Punk Rock Album Generator is spreading anarchy and rebelling against the system with its questionably plagiarized lyrics

Going into this project I was thinking about how I typically don't incorporate my identity as a musician into my work, both in CFA and in SCS. Until recently my passion for music and punk rock has been a separate entity from my identity as an artist and creative coder. While looking through various books for inspiration I found my band's bass guitarist's massive Hal Leonard bass songbook of popular rock songs, and I realized there is a standard set of conventions in tabs, transcripts and sheet music that is followed pretty much universally: lyrics spaced out above the bars of sheet music/tablature. To me this was a great way to get my feet wet with letting my musical self into my art world.

I used Markov chains over a relatively small, hand-picked corpus consisting of lyrics from my favorite Green Day, Offspring, Rage Against The Machine and Sublime albums. After some tinkering I was satisfied with the whacky lyrical compositions being spit out by my program, and I thought it was cool that every now and then I'd recognize the particular two or three songs being mashed together; when I shared the lyrics with my band, they picked up on this as well.
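For reference, a stripped-down, word-level version of the Markov chain technique might look like this in JavaScript (illustrative only; my actual generator and corpus were different):

```javascript
// Build a word-level Markov chain: for each word in the corpus, record
// every word that ever follows it.
function buildChain(corpus) {
  const chain = {};
  const words = corpus.split(/\s+/);
  for (let i = 0; i < words.length - 1; i++) {
    if (!chain[words[i]]) chain[words[i]] = [];
    chain[words[i]].push(words[i + 1]);
  }
  return chain;
}

// Walk the chain from a starting word, picking a random successor each
// step, until we hit the requested length or a dead end.
function generate(chain, start, length) {
  let word = start;
  const out = [word];
  for (let i = 0; i < length - 1; i++) {
    const nexts = chain[word];
    if (!nexts || nexts.length === 0) break;
    word = nexts[Math.floor(Math.random() * nexts.length)];
    out.push(word);
  }
  return out.join(" ");
}
```

Because common words like "I" and "the" accumulate successors from several songs, the walk naturally hops between them, which is exactly the 2-3-songs-mashed-together effect described above.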

At first I thought learning Basil to handle layouts was going to slow me down, in that I'd have to learn a new skill, but in retrospect Basil was a versatile and intuitive tool, and without it I could not have generated my totally random sheet music. Were I to revisit this project I would want to generate actual MIDI files grounded in music theory that sound like coherent songs, as well as add parts for more instruments. This is one of the few projects I have done this semester that I want to keep pushing until I can sit down and record myself performing an entire procedurally-generated punk rock album on every instrument I know.


I think Calvin's got a point here. It's significantly more difficult to chop things we personify into entree-sized bits. It's also precisely what I loved about Allison Parrish's talk and her approach to her work. Allison regards exploration bots and generative programs as personified robot beings that serve humanity by venturing out into spaces too hostile for us. What struck me the most was the comments some of the Twitter bots were getting, where followers spoke directly to the bot as a person no less real than the rest of us. I tend to get attached to my work pretty easily, but perhaps that will allow me to create generative art and AI that feel real and have a unique personality. Every one of the automatons shown in class recently was far more than a set of servos and scrap material from CCR; it was even fitting for most of them to have names, because ultimately in critique we referred to the automatons by name and commented on their character. After seeing Allison's talk I'm certain the same could be achieved with the generative text assignment. Given the relationship humans have with reading, there's an inherent expectation that an author wrote the generated text.

The Deep Questions bot truly felt like a Conspiracy Keanu-esque individual with too much time on their hands sharing their thoughts on the internet, and going forward with the generative text assignment I want to anthropomorphize the perceived "author" of the text just as much as the text itself.


Working on this project was an immensely informative experience for me. My partner Huw had a lot of prior experience with Arduino and physical computing in general, while I was relatively new to working in that way. We went through many different ideas before settling on a marionette drummer made of trash, playing in a whacky, improbable manner in a room also made of disposable, "trashy" material. We were inspired by the materials we were working with, such as the dirty-looking carpet-fuzz sheets, carpet crumb, pieces of a broken Nerf toy and colored felt.

It was the use of these unconventional materials that became a source of conflict for me and my partner (chewie) as we tried to figure out to what extent we should use our materials to emulate a representation of a real space, or stick purely to our materials and not hide what we were working with. At one point our automaton sported a costume and a paint job and went through a few different caricaturized heads. After a great deal of discussion we concluded that this attempt to aestheticize our materials was actually clashing with the rest of the project. We removed the costume and the paint and added the tennis ball bobbing on a spring instead, to both of our satisfaction. On the other hand, using playing cards to make a poster, dowels and scrap metal for the drums, and carpet, cardboard and foam for the bed turned out to be pretty effective as a storytelling device. We took to calling our automaton Wilson (for obvious reasons).


As far as the division of labor went, I had been learning about puppetry in my video class and was able to apply those skills to making a simple marionette puppet. I then used code from the servo sweep example as well as the motor party example to get Wilson to play. I used the spare parts I had to put together a simple drumset. My partner (chewie) was instrumental in creating the room set; without him our project wouldn't be nearly as sturdy, nor would our wiring have worked. He came up with the look for the interior too, for things like the poster and the unmade bed. The most valuable contribution chewie made, however, was his unwavering perfectionism. I tend to let things slide and be OK with parts of my work not being effective. chewie made us iterate multiple times through elements of the piece that just weren't working. In addition to changing the costume and color, the initial armature for Wilson was rolled cardstock, the furniture was fully cardboard and the head was a sketched-in face of John Bonham with tacked-on hair. These were all either details that were lacking in thought, or they impaired the wild and free motion of Wilson, which was clearly the focus of the piece. I was surprised by just how much one can roll back and revert changes when working with physical objects. Although we had no Ctrl-Z or GitHub to fix our mistakes, working with actual physical layers of materials felt like there was a concrete sense of version control after all.



Move your nose left and right to roll Bobo
Jerk your nose up to make Bobo jump for dear life!

Originally my plan was to make a simplified version of a game like Stick Fight or Super Smash Bros, but let players handle movement with their head via webcam, freeing up both hands for the combat aspect of those games, which can be complex. However, I was having a lot of trouble getting the networking, matter.js and PoseNet components working together, so I decided to boil the concept down to its most basic unique element: the movement.

I have noticed that when people play highly movement-centric games like platformers and racing games, they almost involuntarily jerk their bodies toward where they want their avatar to be. It's amusing to watch non-gamers especially, frantically hopping around in their seats as they get used to the controls of a new game. I thought it would be interesting to make this kind of physical response an actual element of a platformer's controls rather than just a by-product.

My main challenge here was making the head controls comfortable. In an earlier iteration of this game I noticed the back of my neck getting sore after playing for more than a minute or so. Most of my changes after that were about finding the right balance of values for tracking jumps, and I feel like I need to add sensitivity controls, because the few people I tested this with had widely different ways of getting their character to jump, some far more forceful than others. I also wish I had given myself more time to document this work and record a full demo so I could have made use of the in-class critiques.

In conclusion, I think I will be making use of PoseNet in future projects. I also enjoyed working with matter.js; it was my first time using it, I don't think I even scratched the surface of what's possible, and I hope to go deeper in the future.

// Matter.js setup: physics engine plus its built-in renderer.
var Engine = Matter.Engine,
    Render = Matter.Render,
    World = Matter.World,
    Bodies = Matter.Bodies;
var engine = Engine.create();
var render = Render.create({
    element: document.body,
    engine: engine,
    options: {width: 800, height: 800}
});
Render.run(render);

let platTex = "https://cdn.glitch.com/7541f658-c2e5-4490-8bac-21a2d3c09449%2FtestPlatformTex.jpg?1539315376974";
let bobo1Tex = "https://cdn.glitch.com/7541f658-c2e5-4490-8bac-21a2d3c09449%2FgroundBobo.png?1539318024497";
let bobo2Tex = "https://cdn.glitch.com/7541f658-c2e5-4490-8bac-21a2d3c09449%2FjumpBobo.png?1539318026058";

var video;
var player;
var poseNet;
var platforms = [];

function setup() {
  createCanvas(800, 800);
  video = createCapture(VIDEO);
  video.size(width, height);
  poseNet = new PoseNetObj(video);
  poseNet.poseNet.on('pose', function (results) {
    poseNet.poses = results;
  });
  Reset();
}

function draw() {
  image(video, 0, 0, 800, 800);
  GameLoop();
}

function Reset() {
  engine.events = {};
  player = new Player(poseNet);
  platforms.push(new Platform(200, 300, 1500, 20));
}

function GameLoop() {
  Engine.update(engine);
  player.Update();
  if (player.CheckBounds()) {
    Reset();
  }
}

function Platform(x, y, w, h) {
  this.gameObject = Bodies.rectangle(x, y, w, h, {isStatic: true});
  this.gameObject.render.sprite.texture = platTex;
  World.add(engine.world, this.gameObject);
}

function Player(poseSource) {
  this.poseSource = poseSource;
  this.gameObject = Bodies.rectangle(400, 100, 50, 50);
  World.add(engine.world, this.gameObject);
  this.velocity = {x: 0, y: 0};
  this.inertia = 100;
  this.prevY = this.poseSource.GetHeadPos().y;

  this.Update = function() {
    var headPos = this.poseSource.GetHeadPos();
    // Head position relative to the canvas center (x) and to last frame (y):
    // leaning left/right rolls Bobo, a sharp upward jerk launches him.
    var relativeHeadPos = {x: headPos.x - width / 2, y: headPos.y - this.prevY};
    // Swap sprites depending on whether Bobo is airborne.
    if (abs(this.gameObject.velocity.y) > 1) {
      this.gameObject.render.sprite.texture = bobo2Tex;
    } else {
      this.gameObject.render.sprite.texture = bobo1Tex;
    }
    // Nose movement becomes a force, damped by inertia.
    this.velocity.x = relativeHeadPos.x / this.inertia;
    this.velocity.y = relativeHeadPos.y / this.inertia;
    this.prevY = headPos.y;
    Matter.Body.applyForce(this.gameObject, this.gameObject.position, this.velocity);
  };

  this.CheckBounds = function() {
    if (this.gameObject.position.x < -10 || this.gameObject.position.x > 1500) {
      console.log("game over");
      return true;
    }
    if (this.gameObject.position.y < -100 || this.gameObject.position.y > 1000) {
      console.log("game over");
      return true;
    }
    return false;
  };
}

function PoseNetObj(videoSource) {
  this.video = videoSource;
  this.poses = [];
  this.poseNet = ml5.poseNet(videoSource, {flipHorizontal: true, detectionType: 'single'});
  this.lastKnownPos = {x: width / 2, y: height / 2};
  this.GetHeadPos = function() {
    var playerPose = this.poses[0];
    if (typeof playerPose != "undefined") {
      // Keypoint 0 is the nose.
      this.lastKnownPos.x = playerPose.pose.keypoints[0].position.x;
      this.lastKnownPos.y = playerPose.pose.keypoints[0].position.y;
    }
    return this.lastKnownPos;
  };
}
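If I do add the sensitivity control mentioned above, a sketch of the jump detector might look like this (the threshold value and function names here are hypothetical, not part of the game as it stands):

```javascript
// A tunable jump detector: the head must move up by more than `sensitivity`
// pixels between frames to count as a jump. Forceful players would get a
// high sensitivity, gentle players a low one.
function makeJumpDetector(sensitivity) {
  let prevY = null;
  return function (headY) {
    if (prevY === null) { prevY = headY; return false; }  // first frame: just record
    const dy = prevY - headY;   // positive when the head moves up on screen
    prevY = headY;
    return dy > sensitivity;
  };
}
```

The closure keeps its own `prevY`, so each player could have an independently calibrated detector.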


Recently I have been making more use of my Fitbit, and one of the social "games" I have been playing on it is the step challenges. Counting steps as a form of interactivity is a fairly old concept, with the first commercial pedometers showing up from Japanese manufacturers in 1965 (interestingly, Leonardo da Vinci had envisioned a mechanical step-counting gadget centuries earlier!).

What I found unique about Fitbit's spin on the step-challenge concept is the virtual races you can hold with your friends. Users pick an iconic expanse to race across, such as the Alps or the Appalachian or Rocky mountains, and can see in real time where their friends stand along these trails. The Fitbit will (if permitted) use GPS tracking to figure out how much distance users cover, or use their gait and steps taken to generate a somewhat accurate estimate of the distance covered on these trails. Furthermore, walking these trails lets users unlock 180-degree views of the locations, along with in-game "treasures" and unlockables.
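At its simplest, that gait-based estimate boils down to multiplying steps by stride length (the function and the constants below are my own guesses for illustration, not Fitbit's actual model):

```javascript
// Rough distance estimate from a step count and an average stride length.
// A stride length of ~0.75 m is a common ballpark for adults.
function estimateDistanceKm(steps, strideLengthMeters) {
  return (steps * strideLengthMeters) / 1000;
}
```

So the canonical 10,000-step day at a 0.75 m stride works out to about 7.5 km of virtual trail.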

The second and somewhat less obvious effect of this interactivity is that I find myself feeling closer to the people who do these challenges with me, regardless of the thousands of miles between us. The concept of catching up to and overtaking your friends helps me feel closer to them. I am not sure the developers anticipated this aspect of their product, but I think it is something special, and I see the potential in a game that makes you feel closer to people by moving relative to them.


Spectacle: I see "spectacle" as work that pushes the limits of our notion of what we can do with the tools and knowledge at our disposal as a community of artists. An entirely spectacle-driven work would likely give the audience a new way of generating content, but lack the self-awareness or meaning that transcends its time and medium.
Speculation: I see "speculation" as an experiment on some immutable artistic concept in a novel way. An entirely speculation-driven work would probably cause the audience to question the way they think about a certain concept, but lack real-world applicability as a product.

A few weeks ago I came up with an idea for an idle-clicker game called Penis Simulator. This was during a period in Concept Studio: Systems and Processes when we were to create work in response to a system within the body, and I was working with the reproductive system. I think my project erred on the side of Speculation while modestly dabbling in Spectacle. Penis Simulator gives the player an abstracted virtual representation of a penis and testes; through rapid mouse clicks and vertical dragging, the player controls temperature and testosterone levels to achieve the open-ended goal of raising overall sperm count. The result was one of the rare times I have seen something as sexualized and perverted as the human penis perceived as a totally non-sexual, sterile object with simple inputs, outputs and controlling variables. My professor expressed her discomfort and called the project problematic on a few occasions, but overall the game got a positive response, both from testers required by the class to play it and from people who tried it of their own volition. I see my work so far as a proof-of-product, and I want to use what I have learnt from Warburton's argument to integrate a more substantial element of spectacle into the speculation piece I have created.