Cristin result ID: 1862514
Last modified: 21 December 2020, 21:17
Result
Master's thesis
2016

Video analysis of music-related body motion in Matlab

Contributors:
  • Bo Zhou

Publisher/series

Publisher

Universitetet i Oslo
NVI level 0

About the result

Master's thesis
Year published: 2016
Number of pages: 100

Description

Title

Video analysis of music-related body motion in Matlab

Abstract

Today, several toolboxes exist for working with audio, motion, or other sensor data. These toolboxes are useful for characteristic analysis of audio and motion, but the analyses are performed separately in different toolboxes, which is inconvenient when one wants to work with such data simultaneously. Developing a toolbox that integrates the existing ones is therefore necessary. The main goal of the project is to integrate these toolboxes in Matlab and to provide video analysis combined with audio and motion capture data. This is important for our interdisciplinary research on music and motion through fourMs, as well as for external work on, e.g., analyzing video recordings for early diagnosis of cerebral palsy in children.

This project presents the development of a toolbox for Matlab entitled the "Musical Gestures (MG) Toolbox". The toolbox aims to meet pressing needs in the video analysis of music-related body motion, since video recorded with a regular camera is a very good option for studying motion. The term music-related body motion refers to all sorts of body motion found in music performance and perception. It has received growing interest in music research and behavioral science over the last decades. In particular, with the rapid development of modern technology, various motion capture systems make it possible to study music-related body motion further. Matlab has been chosen as the platform since it is readily available and there are already several pre-existing toolboxes to build on. These include the "Motion Capture (MoCap) Toolbox" [1], developed for the analysis and visualization of motion capture data and aimed specifically at music-related body motion, and the "Music Information Retrieval (MIR) Toolbox" [2], developed for the extraction of musical features from audio data and the investigation of relationships between sound and music features.

While the two toolboxes mentioned above are useful for studying motion capture data and audio, respectively, they are designed very differently, and combined analysis of audio and motion capture data is not possible. Furthermore, there is no integration with video analysis. The MG Max toolbox [3] has been developed for music-related video analysis in the graphical programming environment Max/MSP/Jitter, with a number of novel visualization techniques (motiongrams, motion history images, etc.). These techniques are commonly used in music research, but are not currently available in Matlab. The main contributions of this project are twofold. The first is to integrate the MoCap Toolbox and the MIR Toolbox, and to provide simple preprocessing of the different input data. The second is to provide several video analysis techniques for studying music-related body motion: motiongrams, optical flow, and Eulerian video magnification. With these techniques, the MG Toolbox for Matlab can provide reliable, quantitative analysis of music-related body motion based on video.
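To illustrate the motiongram technique mentioned in the abstract, here is a minimal sketch in Python/NumPy (rather than Matlab, and with hypothetical function names — it does not reproduce the MG Toolbox's actual implementation): each motion image is the absolute difference of consecutive frames, and averaging it over the width axis yields one column per time step, which stacked over time forms the motiongram.

```python
import numpy as np

def motiongram(frames):
    """Compute a vertical motiongram from a stack of grayscale frames.

    frames: array of shape (T, H, W), pixel values in [0, 1].
    Returns an array of shape (T-1, H): row t is the motion image
    between frames t and t+1, averaged over the width axis.
    """
    # Motion images: absolute frame-to-frame differences, shape (T-1, H, W)
    motion = np.abs(np.diff(frames.astype(float), axis=0))
    # Collapse the width axis so each motion image becomes one column
    return motion.mean(axis=2)

# Tiny synthetic example: a single bright pixel moving down one row per frame
frames = np.zeros((3, 4, 5))
frames[0, 1, 2] = 1.0
frames[1, 2, 2] = 1.0
frames[2, 3, 2] = 1.0
mg = motiongram(frames)  # shape (2, 4); nonzero only in the rows that moved
```

Plotting `mg` as an image (time on one axis, vertical position on the other) visualizes how motion is distributed over the body's height across a recording, which is the core idea behind motiongram displays.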

Contributors

Bo Zhou

  • Affiliation:
    Author

Alexander Refsum Jensenius

  • Affiliation:
    Supervisor
    at RITMO (IMV) Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo