Eyeo 2018 Lecture Response

I feel that one of the most interesting points brought up in the talk was the ability of machine learning systems to detect, interpret, and communicate subconscious embodied practices. Speaking from personal experience trying to do this for collective play, it's quite difficult to imagine and write an algorithm that makes use of the implicit meaning within gestures or expressions. Approaching this task through the methods of machine learning can bring a greater understanding of, or at least get closer to, the intentions of the artist. This supports the artist's creative output, making it easier for them to feel that what they create is being thoroughly represented and brought forth, as emphasized by the quotation from the talk that the artist "can 'feel' the music as I was playing it." This attribute of machine learning plays into a separate concept brought up by Fiebrink: the novel creative relationship formed between an ML system and an artist.

Dream Design for ML System

It just so happened that, separately this week, I started ideating with a couple of friends on a proposal for the collaborative dance + technology performance. One idea I kept coming back to was a system that generates the dance's music from the choreography of the on-stage performer in real time, inverting the usual integration of machine learning into dance, which trains new choreography on pre-existing posture data and music.

This system would take in PoseNet data (or possibly skeletal data obtained through the infrared sensor on a Kinect) to decide what type, timbre, and mood of musical score would best accompany what the performer was doing on stage. This could be combined with algorithmically run lights and projections to create an entire technical performance driven by the decisions of the dancer on stage in real time.
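
As a rough sketch of what the input side might look like, the snippet below uses ml5.js's PoseNet wrapper to reduce each incoming pose to a single movement-energy value that a score generator could map onto tempo or timbre. This is a hypothetical outline rather than a working design: poseToMusic() is a made-up stand-in for the music system, and it assumes a browser page with ml5.js loaded and a webcam video element.

// Hedged sketch: derive a simple movement-energy signal from PoseNet
// and hand it to a hypothetical score generator.
let previousKeypoints = null;

const video = document.getElementById('video');
const poseNet = ml5.poseNet(video, () => console.log('PoseNet ready'));

poseNet.on('pose', (results) => {
  if (results.length === 0) return;
  const keypoints = results[0].pose.keypoints;

  if (previousKeypoints) {
    // Sum how far each joint moved since the last frame
    let energy = 0;
    keypoints.forEach((kp, i) => {
      const prev = previousKeypoints[i].position;
      energy += Math.hypot(kp.position.x - prev.x, kp.position.y - prev.y);
    });
    // Hypothetical mapping: more movement -> faster, brighter score
    poseToMusic({ tempo: 60 + energy * 0.5, brightness: Math.min(energy / 200, 1) });
  }
  previousKeypoints = keypoints;
});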

Tangible ML Training

For this mini-project, I combined a couple of the prompts and constructed a physical interface for the real-time control of machine learning model training and display. It runs on an Arduino Uno, controlled by a Node server running Johnny-Five, Express, and Socket.io. The physical sliders and buttons allow full control of the model training process while also segmenting the experience into a digital interface with the trained model and a physical interface with the untrained model. Here is what the physical interface looks like:

[Photo: the physical interface box]

It can handle up to two outputs through the two faders positioned to the left. The black 'capture sample' button records a single sample when pressed once or a stream of samples when held. The large green button trains the model once the user decides they have finished collecting samples.

And here is the code base for the project:

NodeJS Server (Hardware Input)
Client JavaScript
Client HTML
//---------------------------------------------
// Node Server Setup Code
//---------------------------------------------
// Module Requirements
const port = process.env.PORT || 3000;
var express = require('express');
var path = require('path');
var app = express();
var server = require('http').createServer(app).listen(port);
var io = require('socket.io').listen(server);

app.use(express.static('public'));

// Send index.html at '/'
app.get('/', function(req, res){
  res.sendFile(path.join(__dirname, 'routes', 'training.html'));
});

// Log start of app
console.log("App Started");

//---------------------------------------------
// Websocket Code
//---------------------------------------------
// Setup input event streams
let trainer = io.of('/trainer');
// Listen for input clients to connect
trainer.on('connection', function(socket){
  console.log('A trainer client connected: ' + socket.id);
});

// Setup output event streams
let outputs = io.of('/output');
// Listen for output clients to connect
outputs.on('connection', function(socket){
  console.log('An output client connected: ' + socket.id);

  // Listen for this output client to disconnect
  socket.on('disconnect', function() {
    console.log('An output client has disconnected: ' + socket.id);
  });
});

//---------------------------------------------
// Johnny-Five Code
//---------------------------------------------
var five = require("johnny-five");

var board = new five.Board();

board.on("ready", function() {

  // Hardware input initialization (two buttons, one fader)
  let trainButton = new five.Button(9);
  let sampleButton = new five.Button(8);
  let fader = new five.Sensor({ pin: "A0", freq: 250 });

  // Train Button Setup
  trainButton.on("press", function() {
    trainer.emit("trainButtonPressed");
  });

  // Sample Button Setup
  sampleButton.on("press", function() {
    trainer.emit("sampleButtonPressed");
  });
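
  // A "hold" handler captures further samples while the button stays
  // down; Johnny-Five emits "hold" once a button has been held past
  // its configured holdtime (500 ms by default).
  sampleButton.on("hold", function() {
    trainer.emit("sampleButtonPressed");
  });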

  // Fader Setup: stream the 10-bit ADC reading normalized to 0-1
  // (the second fader would be registered the same way on another
  // analog pin)
  fader.on("data", function() {
    trainer.emit("sliderData", this.value / 1023);
  });
});
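
The client JavaScript isn't reproduced here, but a minimal sketch of how a browser page might consume these events could look like the following. This is an assumption-heavy outline rather than the project's actual client: it uses ml5.js's neuralNetwork for the regression model, getCurrentInputs() is a hypothetical stand-in for whatever input features the page trains on, and it assumes the Socket.io client script is loaded. The event names match those emitted by the server above.

//---------------------------------------------
// Hypothetical Client Sketch (Training Events)
//---------------------------------------------
// Connect to the same namespace the hardware server emits on
const socket = io('/trainer');

// Assumed: an ml5.js regression model; getCurrentInputs() is hypothetical
const model = ml5.neuralNetwork({ task: 'regression', debug: true });
let faderValue = 0;

// The latest fader position becomes the target output for the next sample
socket.on('sliderData', (value) => {
  faderValue = value;
});

// Pair the current inputs with the current fader value as one sample
socket.on('sampleButtonPressed', () => {
  model.addData(getCurrentInputs(), [faderValue]);
});

// Train on everything collected so far
socket.on('trainButtonPressed', () => {
  model.normalizeData();
  model.train({ epochs: 50 }, () => console.log('Training complete'));
});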