

BigML Visualizations

BigML’s goal is to make the Machine Learning workflow simple, even enjoyable.  My part in this was to provide the web interface for each step of the model training workflow.  This included an interactive CSV parsing module (with localization settings for numbers, etc.).

With CSV parsing options selected, our systems would calculate histograms and tabulate counts for each field.  I visualized this information in a consistent way, so that all distributions for a given dataset were easy to understand and compare.
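The per-field summaries behind those visualizations boil down to two cases: binned counts for numeric fields and value counts for categorical ones.  Here is a minimal Python sketch of that idea (the function name and binning scheme are my own illustration, not BigML's backend code):

```python
from collections import Counter

def summarize_field(values, bins=10):
    """Summarize one dataset field: a histogram for numeric values,
    a value-count table for categorical ones."""
    if all(isinstance(v, (int, float)) for v in values):
        lo, hi = min(values), max(values)
        width = (hi - lo) / bins or 1.0  # avoid zero width for constant fields
        counts = [0] * bins
        for v in values:
            # Clamp the top edge so v == hi lands in the last bin.
            i = min(int((v - lo) / width), bins - 1)
            counts[i] += 1
        return {"type": "numeric", "bins": counts, "min": lo, "max": hi}
    return {"type": "categorical", "counts": Counter(values)}
```

Rendering every field from the same summary shape is what makes the distributions easy to compare side by side.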

After relevant input and objective fields are selected, a decision tree model is trained.  I created a streamlined representation of a model, suitable for understanding the “big picture” of how the model behaves, while also allowing users to explore and navigate the decision tree branches.

Examples of the decision tree visualization are available in our model gallery.  If you want to see the rest, you can register for an account by submitting your e-mail on the BigML home page.

The visualizations are all HTML5, using SVG.  I wrote special libraries in Haxe that turned our asynchronous JSON resources into live-updating tree and bar chart visualizations.
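The core of a live-updating view is a poll-and-diff loop: fetch the JSON resource, and re-render only when the payload has actually changed.  A minimal Python sketch of one such polling step (the real code was Haxe-generated JavaScript; `fetch` and `render` here are stand-ins for the transport and drawing layers):

```python
import json

def watch_resource(fetch, render, last_etag=None):
    """One polling step: fetch the JSON resource and re-render the
    visualization only if the payload changed since last time."""
    payload = fetch()
    # Serialize with sorted keys so equal payloads hash identically.
    etag = hash(json.dumps(payload, sort_keys=True))
    if etag != last_etag:
        render(payload)
    return etag  # caller passes this back in on the next poll
```

Skipping redundant renders keeps the tree and bar charts responsive while a resource is still being computed server-side.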

Strands 3D Run Tracker

Strands shifted from music recommendation to personal fitness recommendation around 2009.  Recommendation algorithms were now used to recommend interesting routes through a variety of cities.  While the system could represent route recommendations on a map, it was difficult to get a sense of a route’s surroundings.  I conceived and implemented the Strands “3D Tour” feature in mid 2009.  It enabled Strands users to visualize any uploaded route using the Google Earth plugin, giving a close approximation of what running the route would actually look like.  I wrote code that converted our route assets into forms usable by Google Earth, and I implemented a variety of 3D camera controls, so that it was seamless to navigate from a close-up view to a top-down bird’s-eye view.

I also did all of the graphic design, including the creation of 3D assets for the male and female avatars: “Strandrea” and “Strandley”.

Sadly, this project is no longer available, as Strands recently shut down its fitness site.  However, a small video demonstration is available:

I used Haxe to write JavaScript code that integrated with the Google Earth JS API, and made some of it available on haxelib.

Music Affinity Project and Dataset

Genres can be a useful means of describing musical content, but they’re not all equally useful for distinguishing certain styles.  The “Uncovering Genre Affinities” project explores how genres are perceived and used in everyday music consumption.  We have a project page that provides the techniques, publications, code, and data sources used for this project.  Here is the abstract of the paper:

In organisation schemes, musical artists are commonly identified with a unique ‘genre’ label attached, even when they have affinity to multiple genres. To uncover this hidden cultural awareness about multi-genre affinity, we present a new model based on the analysis of the way in which a community of users organise artists and genres in playlists. Our work is based on a novel dataset that we have elaborated identifying the co-occurrences of artists in the playlists shared by the members of a popular Web-based community, and that is made publicly available. The analysis defines an automatic social-based method to uncover relationships between artists and genres, and introduces a series of novel concepts that characterises artists and genres in a richer way than a unique ‘genre’ label would do.

Strands “Summer of haXe 2008”

I hosted a student coding project for the summer of 2008 that was focused on the Haxe web programming language.  Haxe is a multipurpose “web” programming language geared toward targeting the various web platforms at the source code or virtual machine bytecode level.  The main benefit is that programmers can create data structures and tests that can be shared globally across a project dealing with server-side and client-side resources.  Since Haxe is statically typed, its code is easier to maintain and often has better performance than the original language it replaces.  I co-hosted this project with Franco Ponticelli, who co-wrote the current book on the language and who served as a mentor for the students.

“ArtistNet” Network Visualization

The ArtistNet applet visualizes the stream of play events generated by MyStrands users.  The applet extracts artist information, and then generates a visual node on the screen (shown as a representative album cover).
Initially, the nodes ‘repel’ each other like matching magnetic poles (a technique known as force-directed layout).  However, if a single user plays two different artists consecutively, the applet links them together, and they are then attracted to one another.  After a while, different users may link the same artists in many different ways, forming a network of associated music.  (Sadly, this project is no longer available online.)
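One iteration of that repel/attract behavior can be sketched as follows (a minimal, generic Python illustration of a force-directed step, not the original Haxe/Flash implementation; the constants are arbitrary):

```python
import math

def layout_step(positions, edges, repulsion=100.0, attraction=0.05, dt=0.1):
    """One force-directed layout iteration: every pair of nodes repels
    (inverse-square), and linked nodes attract like springs."""
    forces = {n: [0.0, 0.0] for n in positions}
    nodes = list(positions)
    for i, a in enumerate(nodes):
        for b in nodes[i + 1:]:
            dx = positions[b][0] - positions[a][0]
            dy = positions[b][1] - positions[a][1]
            d2 = (dx * dx + dy * dy) or 1e-6
            d = math.sqrt(d2)
            # Repulsion pushes a and b directly apart.
            f = repulsion / d2
            fx, fy = f * dx / d, f * dy / d
            forces[a][0] -= fx; forces[a][1] -= fy
            forces[b][0] += fx; forces[b][1] += fy
    for a, b in edges:
        # Linked nodes are pulled together (spring-like attraction).
        dx = positions[b][0] - positions[a][0]
        dy = positions[b][1] - positions[a][1]
        forces[a][0] += attraction * dx; forces[a][1] += attraction * dy
        forces[b][0] -= attraction * dx; forces[b][1] -= attraction * dy
    return {n: (positions[n][0] + dt * forces[n][0],
                positions[n][1] + dt * forces[n][1]) for n in positions}
```

Running this step repeatedly settles the network into a readable shape, with linked artists clustering together.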

I used Haxe to create a high-performance force-directed graph algorithm, targeting Flash.

Recommendation Mapping

Most recommender system results are presented as the answer to a question, when in reality there are many results which may be interesting, and those results may themselves be related in interesting ways that would encourage exploration.

Recommendation Mapping is a technique that is novel in two ways.  First, it is a novel network layout technique.  Since recommender results are essentially network subgraphs, I use elements of both the local and the global graph connectivity to weight the relationship information of the nodes (much like tf*idf handles local and global term information in documents).  This emphasizes the “interesting” structure of nodes in the subgraph.
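The tf*idf analogy can be illustrated with a hypothetical edge-weighting function (my own illustration of the idea, not the exact formula used in the project): a strong local connection counts as the “term frequency”, and a globally well-connected hub is down-weighted the way a common term is.

```python
import math

def edge_weight(local_count, global_degree, n_nodes):
    """tf*idf-style weight for a recommendation edge: a strong local
    connection to a globally rare node scores higher than one to a hub."""
    tf = local_count                               # local connectivity
    idf = math.log(n_nodes / (1 + global_degree))  # down-weight global hubs
    return tf * idf
```

Under this weighting, an edge to a niche artist is treated as more “interesting” than an equally strong edge to an artist everyone is connected to.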

The second novel approach is how I handle occlusion, or node overlap.  While showing more information at once is useful, nodes can occasionally overlap each other, effectively hiding or obscuring their presence in the map.  I use a local repulsion method that displaces nodes according to mouse cursor position and movement, which greatly alleviates this problem.  The interaction is non-modal and easy for new users to handle.  Furthermore, the repulsion effect does not apply to “hovered” or selected nodes, which keeps the targeted node from “running away”.  I used Haxe to create this novel method of interacting with scatterplot data, targeting Flash.
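The cursor-driven repulsion amounts to pushing nearby nodes radially away from the pointer while pinning the hovered node in place.  A minimal Python sketch of the idea (the function name, radius, and strength are my own illustrative choices):

```python
import math

def cursor_repel(nodes, cursor, radius=60.0, strength=0.5, hovered=None):
    """Displace nodes away from the mouse cursor to reduce overlap.
    Nodes outside `radius`, and the hovered/selected node, are left
    untouched, so the node the user is aiming at never 'runs away'."""
    cx, cy = cursor
    out = {}
    for name, (x, y) in nodes.items():
        dx, dy = x - cx, y - cy
        d = math.hypot(dx, dy) or 1e-6
        if name == hovered or d >= radius:
            out[name] = (x, y)
            continue
        # Push harder the closer the node is to the cursor.
        push = strength * (radius - d)
        out[name] = (x + push * dx / d, y + push * dy / d)
    return out
```

Because displacement decays to zero at the edge of the radius, nodes ease back smoothly as the cursor moves away, which is what keeps the interaction non-modal.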

Sadly, this project is no longer available.


“Limestick” Performance Controller

I created a “performance controller” for laptop musicians as my capstone project at Indiana University. The project focuses on improving audience engagement through the design and creation of a performer- and performance-centric interface. The interface is a minimal visual obstacle for the audience, while providing a range of expressive control for the musician. The device itself is a modified microphone stand with several types of embedded sensors and controls. Here’s an excerpt from the paper:

Musical performances using laptops as a sole performance interface can suffer from a lack of audience engagement. Due to the limitations of gesture variety, magnitude, and overall lack of expressive performer behaviors, the audience members are often unable to associate the acoustical sounds of the music with the performer. In the absence of a compelling visual point of reference, audience members accustomed to the standard conventions of contemporary musical performance can have negative responses or a lack of interest in the performance, regardless of their interest in the music. These negative responses adversely affect the engagement level of the spectator, and detract from the performance experience as a whole, limiting the effectiveness and potential of laptop music as performance art. This paper explores the domain of computer/electronic music interfaces and identifies processes, interfaces, and methods relevant to audience engagement with an emphasis on improving “the performer-audience connection”. It also describes the design process and concepts behind the “Limestick” (Laptop Interface for Musical Expression Stick) musical controller, a contribution towards the goal of improving audience engagement through performer-audience connectivity in laptop based musical performances.


A musical performance can be thought of as a framework of relationships between the performer, audience, music, and instrument. Each entity in the framework has a relationship to the other that can greatly influence the quality of the performance, regardless of whether or not the audience actually enjoys the music.

Oftentimes audiences have no clue as to how the musicians are creating or manipulating their music on their laptop. The Limestick improves the audience’s understanding of the instrument’s role in the creation of music through the simple and expressive gestures required for its use. The instrument has a minimal yet essential profile in the performance, and its use on stage makes the performer the focal point of the show. All of these qualities make it an effective and practical performance controller.

Here are some videos:

