Paper: AE-P4.5
Session: Applications to Music I
Time: Thursday, May 20, 15:30 - 17:30
Presentation: Poster
Topic: Audio and Electroacoustics: Applications to Music
Title: A COMPARISON OF HUMAN AND AUTOMATIC MUSICAL GENRE CLASSIFICATION
Authors: Stefaan Lippens, Ghent University; George Tzanetakis, Carnegie Mellon University; Jean-Pierre Martens, Ghent University; Tom De Mulder, Ghent University
Abstract: Recently there has been an increasing amount of work in the area of automatic genre classification of music in audio format. In addition to automatically structuring large music collections, such classification can be used as a way to evaluate features for describing musical content. However, the evaluation and comparison of genre classification systems is hindered by users' subjective perception of genre definitions. In this work we describe a set of experiments in automatic musical genre classification. An important contribution of this work is the comparison of the automatic results with human genre classifications on the same dataset. The results show that, although there is room for improvement, genre classification is inherently subjective, and therefore perfect results cannot be expected from either automatic or human classification. The experiments also show that features derived from an auditory model perform similarly to features based on Mel-Frequency Cepstral Coefficients (MFCC).
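The MFCC features mentioned in the abstract follow a widely used pipeline: frame the signal, take a windowed FFT, pool the power spectrum through a mel-spaced triangular filterbank, take the log, and decorrelate with a DCT. The following NumPy-only sketch illustrates that generic pipeline; the frame length, hop size, filter count, and coefficient count are illustrative assumptions, not the parameters used in this paper.

```python
import numpy as np

def mel_filterbank(n_filters, n_fft, sr, fmin=0.0, fmax=None):
    """Triangular mel-spaced filterbank (a common textbook construction)."""
    fmax = fmax or sr / 2.0
    hz_to_mel = lambda f: 2595.0 * np.log10(1.0 + f / 700.0)
    mel_to_hz = lambda m: 700.0 * (10.0 ** (m / 2595.0) - 1.0)
    # Filter edges equally spaced on the mel scale, mapped back to FFT bins.
    mels = np.linspace(hz_to_mel(fmin), hz_to_mel(fmax), n_filters + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mels) / sr).astype(int)
    fb = np.zeros((n_filters, n_fft // 2 + 1))
    for i in range(1, n_filters + 1):
        left, center, right = bins[i - 1], bins[i], bins[i + 1]
        for k in range(left, center):          # rising slope
            if center > left:
                fb[i - 1, k] = (k - left) / (center - left)
        for k in range(center, right):         # falling slope
            if right > center:
                fb[i - 1, k] = (right - k) / (right - center)
    return fb

def mfcc(signal, sr=22050, n_fft=512, hop=256, n_filters=26, n_coeffs=13):
    """Frames -> windowed FFT power -> mel filterbank -> log -> DCT-II."""
    n_frames = 1 + (len(signal) - n_fft) // hop
    frames = np.stack([signal[i * hop:i * hop + n_fft] for i in range(n_frames)])
    frames = frames * np.hanning(n_fft)
    power = np.abs(np.fft.rfft(frames, axis=1)) ** 2
    fb = mel_filterbank(n_filters, n_fft, sr)
    log_mel = np.log(power @ fb.T + 1e-10)     # small floor avoids log(0)
    # DCT-II as an explicit matrix (orthonormal scaling omitted for brevity).
    n = np.arange(n_filters)
    dct = np.cos(np.pi * np.outer(np.arange(n_coeffs), (2 * n + 1) / (2.0 * n_filters)))
    return log_mel @ dct.T                     # shape: (n_frames, n_coeffs)
```

In a genre-classification setting, per-frame MFCC vectors like these would typically be summarized (e.g., by mean and variance over a clip) before being fed to a statistical classifier.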