Commit 09b557a: Merge pull request #1 from fourMs/master

merge with master

Authored by Habibur Rahman, Feb 21, 2020. 2 parents: f663fd0 + d2de3af.

Showing 14 changed files with 152 additions and 117 deletions.
README.md: 43 changes (20 additions, 23 deletions)

The Musical Gestures Toolbox (MGT) is a Matlab toolbox for analysing music-related body motion, using sets of audio, video and motion capture data as source material.

It was primarily developed for music research, with a particular focus on studying the motion of musicians and dancers, but it can be used for any type of motion-related analysis based on video recordings.

## Functions

The Musical Gestures Toolbox contains a set of functions for the analysis and visualisation of video, audio, and motion capture data.

## Tutorial

Please refer to the [wiki pages](https://github.com/fourMs/MGT-matlab/wiki) for more information about installation and usage. Much of the same information is also available in the material for the software carpentry workshop [Quantitative Video analysis for Qualitative Research](https://alexarje.github.io/video-analysis-workshop/).


## Dependencies

1. Much of the basic functionality works with a standard Matlab installation, but to use all features you will also need these additional Matlab toolboxes:

- Computer Vision System Toolbox
- Image Processing Toolbox

2. Some of the functions build on code from these third-party toolboxes, which also need to be on your Matlab path (see the path-setup sketch after this list):

- [matlabPyrTools](https://github.com/LabForComputationalVision/matlabPyrTools/archive/master.zip)
- [MoCap Toolbox](https://www.jyu.fi/hum/laitokset/musiikki/en/research/coe/materials/mocaptoolbox)
   - [MIRtoolbox](https://www.jyu.fi/hum/laitokset/musiikki/en/research/coe/materials/mirtoolbox)
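
Once everything is downloaded, a minimal path-setup sketch could look like the one below. The folder names are assumptions rather than part of the toolbox; adjust them to wherever you unpacked each download and this repository's "source-code" folder.

```matlab
% Minimal path-setup sketch; the folder names below are assumptions,
% so point them at wherever you actually unpacked each download.
addpath(genpath('source-code'));     % MGT functions from this repository
addpath(genpath('matlabPyrTools'));  % matlabPyrTools
addpath(genpath('mocaptoolbox'));    % MoCap Toolbox
addpath(genpath('MIRtoolbox'));      % MIRtoolbox
savepath;                            % keep the path for future Matlab sessions
```

Calling savepath makes the additions persistent across Matlab sessions; leave it out if you prefer to set up the path per session.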

## History

The toolbox builds on the [Musical Gestures Toolbox for Max](http://www.uio.no/english/research/groups/fourms/downloads/software/musicalgesturestoolbox/), which has been developed by [alexarje](https://github.com/alexarje) since 2004, and parts of it are currently embedded in the [Jamoma](http://www.jamoma.org) project.

A large part of the code was written by [benlyyan](https://github.com/benlyyan) as part of his M.Sc. thesis at the University of Oslo, [Video analysis of music-related body motion in Matlab](https://www.duo.uio.no/handle/10852/51118).

There is now also a [Musical Gestures Toolbox for Python](https://github.com/fourMs/MGT-python) with more or less the same functionality.

The software is maintained by the [fourMs lab](https://github.com/fourMs) at [RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion](https://www.uio.no/ritmo/english/) at the University of Oslo.

## Reference

If you use this toolbox for research purposes, please reference this publication:

- Jensenius, Alexander Refsum (2018). [The Musical Gestures Toolbox for Matlab](http://hdl.handle.net/10852/65559). Proceedings of the 19th International Society for Music Information Retrieval Conference, Late Breaking Demos Session. Paris, France.

## Credits

Main developers: [Alexander Refsum Jensenius](http://people.uio.no/alexanje) and
## License

This software is open source, and is shared with [The GNU General Public License v3.0](https://www.gnu.org/licenses/gpl-3.0.html).
