Sound-Tracing

I have carried out two experiments on what I call sound-tracing. Sound-tracing means rendering perceptual features of sound through body motion, i.e. "tracing the sound". The term was first coined by my colleagues, who carried out an experiment in which participants were asked to imitate various sound objects on a digital tablet.

Godøy, R. I., Haga, E., and Jensenius, A. R. 2006. Exploring music-related gestures by sound-tracing: A preliminary study. In 2nd ConGAS International Symposium on Gesture Interfaces for Multimedia Systems. Leeds, UK.

Rather than gathering motion responses on a digital tablet, my experiments used optical motion capture technology, and the participants responded by moving in the air.

Experiment 1

The figure shows a subject holding a rod with reflective markers attached to one end. Motion capture cameras surround the subject.

In the first sound-tracing experiment, 10 sound objects were presented to the participants. Each of the 15 participants moved a rod (the SoundSaber controller) in the air to trace the perceptual features of the sound. Three recordings were made per sound per subject, and a NaturalPoint OptiTrack motion capture system was used to track the position of the rod. The sounds used in the experiment are presented in the table below.

| Sound | Pitch   | Spectral Centroid    | Loudness          | Onsets |
|-------|---------|----------------------|-------------------|--------|
| 1     | Noise   | 3 sweeps             | 3 sweeps          | 3      |
| 2     | Noise   | 3 sweeps             | 3 sweeps          | 3      |
| 3     | Falling | Rising               | Steady            | 1      |
| 4     | Rising  | Falling              | Steady            | 1      |
| 5     | Noise   | Rising               | Steady            | 1      |
| 6     | Noise   | Rising / Complex     | Steady            | 1      |
| 7     | Noise   | Rising, then falling | Steady            | 1      |
| 8     | Rising  | Complex              | Steady            | 1      |
| 9     | Noise   | Steady               | Static (on/off)   | 5      |
| 10    | Noise   | Complex              | Impulsive attacks | 5      |
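To illustrate the kind of perceptual sound features listed in the table, the sketch below computes a frame-wise spectral centroid and RMS loudness for a synthetic pitch sweep using plain NumPy. This is only a minimal, hypothetical feature extractor, not the analysis pipeline used in the experiment; the frame size, hop size, and test signal are my own assumptions.

```python
import numpy as np

SR = 44100  # sample rate (assumed)

# Synthetic test signal: a sinusoid sweeping from 200 Hz to 2000 Hz,
# i.e. a sound with rising pitch and a rising spectral centroid.
inst_freq = np.linspace(200, 2000, SR)
x = np.sin(2 * np.pi * np.cumsum(inst_freq) / SR)

def frame_features(signal, sr, frame=2048, hop=512):
    """Frame-wise spectral centroid (Hz) and RMS loudness."""
    freqs = np.fft.rfftfreq(frame, 1 / sr)
    window = np.hanning(frame)
    centroids, rms = [], []
    for start in range(0, len(signal) - frame, hop):
        seg = signal[start:start + frame] * window
        mag = np.abs(np.fft.rfft(seg))
        # Spectral centroid = magnitude-weighted mean frequency.
        centroids.append((freqs * mag).sum() / (mag.sum() + 1e-12))
        rms.append(np.sqrt(np.mean(seg ** 2)))
    return np.array(centroids), np.array(rms)

centroid, loudness = frame_features(x, SR)
# For this sweep the centroid rises steadily while loudness stays roughly flat.
```

In the table's terms, this test signal would be classified as "Rising" in both pitch and spectral centroid, with "Steady" loudness.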

All sound files may be downloaded here.


Publications

So far, two papers have been published on the first sound-tracing experiment:
Nymoen, K., Glette, K., Skogstad, S. A., Torresen, J., and Jensenius, A. R. 2010. Searching for cross-individual relationships between sound and movement features using an SVM classifier. In Proceedings of the International Conference on New Interfaces for Musical Expression. Sydney, 259–262. [PDF]
Nymoen, K., Caramiaux, B., Kozak, M., and Torresen, J. 2011. Analyzing sound tracings: a multimodal approach to music information retrieval. In Proceedings of the 1st International ACM Workshop on Music Information Retrieval with User-Centered and Multimodal Strategies. ACM, New York, 39–44. [link]

The first paper investigated the use of a support vector machine classifier to search for similarities between how participants moved to the various sounds. The second paper evaluated how canonical correlation analysis could be applied to uncover more complex correlations between features of sound and features of motion.
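To give an intuition for the canonical correlation analysis used in the second paper: given a matrix of sound features and a matrix of motion features, CCA finds pairs of linear projections (one per feature set) that are maximally correlated. The sketch below is a generic NumPy implementation run on synthetic toy data, not the actual analysis code from the paper; the data shapes, regularization, and latent-component setup are my own assumptions.

```python
import numpy as np

def canonical_correlations(X, Y, reg=1e-8):
    """Canonical correlations between two feature matrices (rows = observations)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = len(X)
    Sxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n
    # Whiten both sets via Cholesky factors, then take the SVD of the
    # whitened cross-covariance; its singular values are the canonical
    # correlations, sorted in descending order.
    Lx = np.linalg.cholesky(Sxx)
    Ly = np.linalg.cholesky(Syy)
    M = np.linalg.inv(Lx) @ Sxy @ np.linalg.inv(Ly).T
    return np.linalg.svd(M, compute_uv=False)

# Toy example: "sound" and "motion" features share one latent component z.
rng = np.random.default_rng(0)
z = rng.standard_normal(500)
sound = np.column_stack([z + 0.1 * rng.standard_normal(500),
                         rng.standard_normal(500)])
motion = np.column_stack([z + 0.1 * rng.standard_normal(500),
                          rng.standard_normal(500)])
corrs = canonical_correlations(sound, motion)
# corrs[0] is close to 1 (the shared latent), corrs[1] is close to 0.
```

The appeal of CCA in this context is that it can pick up a sound–motion relationship even when it is spread across several features on each side, which a one-to-one feature correlation would miss.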

Experiment 2

The handle used in the experiment.

A second experiment on sound-tracing was carried out in the autumn of 2010. The sounds were designed in Max 5, with the sound features controlled in a more systematic manner than in the first experiment. The subjects used two handles of the type shown above, whose positions were measured with a Qualisys motion capture system. 44 people participated in the experiment, and one recording was made per sound per subject. The 18 sounds used in the experiment are listed in the table below.

| Sound | Pitch   | Spectral Centroid | Loudness       |
|-------|---------|-------------------|----------------|
| 1     | Rising  | Falling           | Bell-shape     |
| 2     | Falling | Rising            | Bell-shape     |
| 3     | Falling | Falling           | Bell-shape     |
| 4     | Rising  | Rising            | Bell-shape     |
| 5     | Rising  | Steady            | Increasing     |
| 6     | Falling | Steady            | Increasing     |
| 7     | Steady  | Falling           | Bell-shape     |
| 8     | Steady  | Rising            | Bell-shape     |
| 9     | Steady  | Steady            | Increasing     |
| 10    | Noise   | Falling           | Bell-shape     |
| 11    | Noise   | Rising            | Increasing     |
| 12    | Noise   | Steady            | Increasing     |
| 13    | Steady  | Complex           | Increasing     |
| 14    | Steady  | Complex           | Increasing     |
| 15    | Rising  | Falling           | Semi Impulsive |
| 16    | Steady  | Steady            | Impulsive      |
| 17    | Noise   | Steady            | Impulsive      |
| 18    | Noise   | Falling           | Impulsive      |

All sound files may be downloaded here.
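The stimuli themselves were made in Max 5, but the kind of systematic feature control described above can be sketched in a few lines of NumPy. The example below synthesizes one hypothetical stimulus combining a rising pitch glide with a bell-shaped loudness envelope (as in sounds 1 and 4); the frequency range, duration, and envelope width are my own assumed values, not those of the actual stimuli.

```python
import numpy as np

SR = 44100       # sample rate (assumed)
DUR = 2.0        # duration in seconds (assumed)
t = np.arange(int(SR * DUR)) / SR

# Rising pitch: exponential glide from 220 Hz to 880 Hz (two octaves).
freq = 220.0 * (880.0 / 220.0) ** (t / DUR)
phase = 2 * np.pi * np.cumsum(freq) / SR

# Bell-shaped loudness: Gaussian envelope peaking at the midpoint.
envelope = np.exp(-0.5 * ((t - DUR / 2) / (DUR / 6)) ** 2)

stimulus = envelope * np.sin(phase)
# The loudest point falls at the middle of the sound, and the pitch
# at the end is four times the pitch at the start.
```

Swapping the glide direction or replacing the Gaussian with a ramp or an on/off gate yields the other pitch/loudness combinations in the table, which is the point of the systematic design: each stimulus varies one feature trajectory at a time.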

Video clips

The video clips below show some typical examples of how people moved in response to the sound objects:

Three clips with impulsive sounds, showing a fast accentuation at the beginning of the sound tracing:

A static sound, showing three different strategies:


Publications

Two papers have been published on the second experiment:
Nymoen, K., Torresen, J., Godøy, R., and Jensenius, A. R. 2012. A statistical approach to analyzing sound tracings. In Speech, Sound and Music Processing: Embracing Research in India, S. Ystad, M. Aramaki, R. Kronland-Martinet, K. Jensen, and S. Mohanty, Eds. Lecture Notes in Computer Science Series, vol. 7172. Springer, Berlin Heidelberg, 120–145. [PDF]
Nymoen, K., Godøy, R. I., Jensenius, A. R., and Torresen, J. 2013. Analyzing correspondence between sound objects and body motion. ACM Transactions on Applied Perception 10(2). [link]