Here is a quick writeup + video:
A standard webcam is mounted underneath the baseplate. The image is processed by a Python script using the OpenCV library to track the bricks. The tricky bit was to not track the user's hand, but we succeeded at that as well. The brick color, position, and orientation derived from the image are then converted into OpenSoundControl (OSC) messages, which are sent over a network connection to a computer running Native Instruments Maschine to play back the sounds. Of course this would work with other sound generators as well, since the whole thing simply spits out OSC messages and MIDI — but hey: if the guys from Native are there, you better use their Maschine stuff.
Being real Master Builders, we of course used only unmodified, standard Lego parts and no Kragle* for the construction.
*(see The Lego Movie for reference)