2020:Audio Chord Estimation Results

From MIREX Wiki

Introduction

This page contains the results of the 2020 edition of the MIREX automatic chord estimation task. This edition is the eighth since the reorganization of the evaluation procedure in 2013, so the results can be compared directly to those of the seven previous editions. Chord labels are evaluated against five different chord vocabularies, and the quality of the segmentation is also assessed. Additional information about the measures used can be found on the page of the 2013 edition.
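The chord-label columns in the tables below are all duration-weighted chord symbol recall (CSR) scores: the proportion of the total annotated duration on which the estimated label matches the reference, after both are mapped into the vocabulary in question. As a minimal pure-Python sketch of the idea (not the evaluation framework's actual implementation — it omits the vocabulary mapping and simply compares label strings):

```python
def chord_symbol_recall(reference, estimate):
    """Duration-weighted chord symbol recall over two lists of
    (start, end, label) segments covering the same time span."""
    # Merge all boundary times so both annotations share one common grid.
    bounds = sorted({t for seg in reference + estimate for t in seg[:2]})

    def label_at(segments, t):
        for start, end, label in segments:
            if start <= t < end:
                return label
        return None

    correct = total = 0.0
    for start, end in zip(bounds, bounds[1:]):
        dur = end - start
        total += dur
        mid = (start + end) / 2.0  # any point inside the interval works
        if label_at(reference, mid) == label_at(estimate, mid):
            correct += dur
    return correct / total

ref = [(0.0, 2.0, "C:maj"), (2.0, 4.0, "G:maj")]
est = [(0.0, 1.0, "C:maj"), (1.0, 4.0, "G:maj")]
# One second out of four (the second half of the first reference
# segment) disagrees, so CSR = 3/4.
print(chord_symbol_recall(ref, est))  # 0.75
```

The per-vocabulary columns (MirexRoot, MirexMajMin, and so on) differ only in how labels are mapped before this comparison, e.g. MirexRoot compares roots only, while MirexSeventhsBass also requires matching sevenths and bass notes.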

What’s new?

  • All datasets and evaluation procedures are the same as in last year's edition.

Software

All software used for the evaluation has been made open-source. The evaluation framework is described by Pauwels and Peeters (2013). The corresponding binaries and source code can be found on GitHub, and the measures used are available as presets. The raw algorithmic output provided below makes it possible to calculate additional measures from the paper (e.g. separate results for tetrads) beyond those presented here. More help can be found in the readme.
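The segmentation columns (UnderSeg, OverSeg, MeanSeg) are based on the directional Hamming distance between the reference and estimated segmentations. As a rough illustration of that distance (a sketch, not the framework's exact implementation — in particular, how the two directions map onto the UnderSeg and OverSeg columns and how MeanSeg combines them are assumptions here):

```python
def directional_hamming(seg_a, seg_b):
    """Directional Hamming distance h(seg_a || seg_b): for each segment
    of seg_a, the duration NOT covered by its single largest overlap
    with a segment of seg_b. Segments are (start, end) pairs spanning
    the same total duration."""
    dist = 0.0
    for a_start, a_end in seg_a:
        best = 0.0
        for b_start, b_end in seg_b:
            overlap = max(0.0, min(a_end, b_end) - max(a_start, b_start))
            best = max(best, overlap)
        dist += (a_end - a_start) - best
    return dist

ref = [(0.0, 2.0), (2.0, 4.0)]
est = [(0.0, 1.0), (1.0, 4.0)]
total = 4.0
# Assumed mapping: over-segmentation fragments reference segments,
# under-segmentation merges them; both scores are 1 - distance / duration.
overseg = 1.0 - directional_hamming(ref, est) / total
underseg = 1.0 - directional_hamming(est, ref) / total
```

Scores of 1 (reported as 100 in the tables) mean the boundaries agree perfectly in that direction.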

The statistical comparison between the different submissions is explained in Burgoyne et al. (2014). The software is available on Bitbucket. It uses the detailed results provided below as input.

Submissions

Algorithm Abstract Contributors
HL2 PDF Yuan-Hao Ku, Hsueh-Han Lee

Results

Summary

All figures can be interpreted as percentages and range from 0 (worst) to 100 (best).

Isophonics2009
Algorithm MirexRoot MirexMajMin MirexMajMinBass MirexSevenths MirexSeventhsBass MeanSeg UnderSeg OverSeg
HL2 71.21 66.97 65.63 57.75 56.58 81.56 81.62 84.08

download these results as csv

Billboard2012
Algorithm MirexRoot MirexMajMin MirexMajMinBass MirexSevenths MirexSeventhsBass MeanSeg UnderSeg OverSeg
HL2 69.41 65.94 64.90 53.69 52.75 80.09 80.65 82.53

download these results as csv

Billboard2013
Algorithm MirexRoot MirexMajMin MirexMajMinBass MirexSevenths MirexSeventhsBass MeanSeg UnderSeg OverSeg
HL2 65.62 59.10 57.96 48.41 47.49 76.43 77.17 81.48

download these results as csv

JayChou29
Algorithm MirexRoot MirexMajMin MirexMajMinBass MirexSevenths MirexSeventhsBass MeanSeg UnderSeg OverSeg
HL2 64.54 62.10 57.20 44.97 41.17 84.01 82.84 85.90

download these results as csv

RobbieWilliams
Algorithm MirexRoot MirexMajMin MirexMajMinBass MirexSevenths MirexSeventhsBass MeanSeg UnderSeg OverSeg
HL2 77.20 72.65 71.38 65.05 63.88 85.18 85.07 87.16

download these results as csv

RWC-Popular
Algorithm MirexRoot MirexMajMin MirexMajMinBass MirexSevenths MirexSeventhsBass MeanSeg UnderSeg OverSeg
HL2 72.73 67.92 65.22 53.87 51.36 83.58 83.02 85.14

download these results as csv

USPOP2002Chords
Algorithm MirexRoot MirexMajMin MirexMajMinBass MirexSevenths MirexSeventhsBass MeanSeg UnderSeg OverSeg
HL2 74.01 70.36 67.66 58.80 56.34 83.56 83.60 85.76

download these results as csv

CASD-Annotator1
Algorithm MirexRoot MirexMajMin MirexMajMinBass MirexSevenths MirexSeventhsBass MeanSeg UnderSeg OverSeg
HL2 66.54 60.77 59.49 50.59 49.37 80.67 82.07 82.40

download these results as csv

CASD-Annotator2
Algorithm MirexRoot MirexMajMin MirexMajMinBass MirexSevenths MirexSeventhsBass MeanSeg UnderSeg OverSeg
HL2 67.03 60.14 59.06 50.14 49.08 77.45 76.36 83.32

download these results as csv

CASD-Annotator3
Algorithm MirexRoot MirexMajMin MirexMajMinBass MirexSevenths MirexSeventhsBass MeanSeg UnderSeg OverSeg
HL2 64.52 59.24 57.39 47.39 45.62 77.79 75.52 83.78

download these results as csv

CASD-Annotator4
Algorithm MirexRoot MirexMajMin MirexMajMinBass MirexSevenths MirexSeventhsBass MeanSeg UnderSeg OverSeg
HL2 55.17 50.04 43.15 41.34 35.32 77.10 79.80 77.67

download these results as csv


Detailed Results

More details about the performance of the algorithms, including per-song performance and supplementary statistics, are available from this repository.

Algorithmic Output

The raw output of the algorithms is available in this repository. It can be used to experiment with alternative evaluation measures and statistics.