I was surprised at how prominently “social” sources for musical understanding were featured in the projects and demos, so our work on using playlists/streams to understand relationships between genres was right at home. There were also some fantastic demos, including the “Aura Music Explaura,” which allows for “steerable recommendations” using tags.
The way that Francois and Paul (the project leads) implemented the tag interface was particularly brilliant. By leveraging the familiar visual motif of a tag cloud, the interface communicates the frequency of each tag applied to the music. The tags themselves, however, are interactive: dragging a tag’s boundaries changes its “weight,” which increases or decreases its relative visual size and alters that tag’s importance in selecting similar artists. So, hypothetically, someone could dial down the amount of “emo” in their music results. Paul gave me a quick demo, but I really hope this ends up publicly accessible soon; it’s killer.
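To make the idea concrete, here is a minimal sketch of what tag-weighted “steerable” similarity could look like under the hood. This is purely my own illustration, not the actual Explaura implementation: I’m assuming artists are represented as tag-frequency vectors and that the user’s drag gestures translate into per-tag weights that rescale each tag’s contribution to a cosine similarity.

```python
from math import sqrt

def weighted_cosine(a, b, weights):
    """Cosine similarity between two tag-frequency vectors a and b,
    with each tag scaled by a user-set weight (default 1.0).
    Dragging a tag bigger/smaller would adjust its weight."""
    tags = set(a) | set(b)
    dot = norm_a = norm_b = 0.0
    for t in tags:
        w = weights.get(t, 1.0)
        x, y = w * a.get(t, 0.0), w * b.get(t, 0.0)
        dot += x * y
        norm_a += x * x
        norm_b += y * y
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (sqrt(norm_a) * sqrt(norm_b))

# Toy tag clouds (hypothetical frequencies of tags applied to artists).
seed = {"emo": 8, "indie": 5, "rock": 3}
candidates = {
    "Artist A": {"emo": 9, "rock": 4},
    "Artist B": {"indie": 7, "rock": 5},
}

# Turning the "emo" tag way down steers results away from emo-heavy artists.
weights = {"emo": 0.1}
ranked = sorted(candidates,
                key=lambda name: weighted_cosine(seed, candidates[name], weights),
                reverse=True)
print(ranked)  # → ['Artist B', 'Artist A']
```

With all weights at their default of 1.0, the emo-heavy Artist A would rank first; shrinking the “emo” tag flips the ordering, which is the “steering” effect the interface exposes visually.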