2009:Audio Tag Classification (MajorMiner) Set Results

Introduction

This task compares various algorithms' abilities to associate tags with 10-second audio clips of songs. The tags come from the MajorMiner game. This task is closely related to the other audio classification tasks; however, instead of one N-way classification per clip, it requires N binary classifications per clip.

Two outputs are produced by each algorithm:

  • a set of binary classifications indicating which tags are relevant to each example,
  • a set of 'affinity' scores which indicate the degree to which each tag applies to each track.

These different outputs allow the algorithms to be evaluated both on tag 'classification' and tag 'ranking' (where the tags may be ranked for each track and tracks ranked for each tag).
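
As an illustration, a toy algorithm producing both outputs for a single clip might look like the sketch below. The tag names, scores, and dictionary layout here are purely hypothetical and do not represent the official MIREX submission format.

```python
# Hypothetical sketch of the two outputs required per clip; the actual MIREX
# I/O format is defined by the task specification, not by this example.
def classify_clip(clip_id):
    """Toy stand-in for a tagging algorithm."""
    # Affinity scores: the degree to which each tag applies to the clip.
    affinities = {"rock": 0.91, "guitar": 0.77, "male": 0.55,
                  "synth": 0.12, "jazz": 0.04}
    # Binary classifications: here simply a threshold on the affinities.
    binary = {tag: score >= 0.5 for tag, score in affinities.items()}
    return binary, affinities

binary, affinities = classify_clip("clip_0001")
print(binary)      # scored by the classification metrics (F-measure, accuracy)
print(affinities)  # scored by the ranking metrics (AUC-ROC, precision-at-N)
```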

Data

All of the data is browsable via the MajorMiner search page.

The music consists of 2300 clips selected at random from 3900 tracks. Each clip is 10 seconds long. The 2300 clips represent a total of 1400 different tracks on 800 different albums by 500 different artists. To give a sense for the music collection, the following genre tags have been applied to these artists, albums, and tracks on Last.fm: electronica, rock, indie, alternative, pop, britpop, idm, new wave, hip-hop, singer-songwriter, trip-hop, post-punk, ambient, jazz.

Tags

The MajorMiner game has collected a total of about 73000 taggings, 12000 of which have been verified by at least two users. In these verified taggings, there are 43 tags that have been verified at least 35 times, for a total of about 9000 verified uses. These are the tags we will be using in this task.

Note that these data do not include strict negative labels. While many clips are tagged rock, none are tagged not rock. Frequently, however, a clip will be tagged many times without being tagged rock. We take this as an indication that rock does not apply to that clip. More specifically, a negative example of a particular tag is a clip on which another tag has been verified, but the tag in question has not.
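
The rule above can be made concrete with a short sketch; the data layout below is hypothetical and merely stands in for the verified MajorMiner tagging records.

```python
# Sketch of the negative-example rule described above: a clip is a negative
# example for a tag if it has at least one verified tag, but not that tag.
verified = {                       # hypothetical clip_id -> verified tags
    "clip_A": {"rock", "guitar", "male"},
    "clip_B": {"techno", "synth"},
    "clip_C": set(),               # no verified tags at all
}

def examples_for(tag, verified):
    positives = [c for c, tags in verified.items() if tag in tags]
    negatives = [c for c, tags in verified.items() if tags and tag not in tags]
    return positives, negatives

pos, neg = examples_for("rock", verified)
print(pos)  # ['clip_A']
print(neg)  # ['clip_B']  (clip_C has no verified tags, so it is neither)
```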

Here is a list of the top 50 tags, along with the approximate number of times each has been verified, the total number of times it has been used, and the number of different users who have ever used it:

Tag Verified Total Users
drums 962 3223 127
guitar 845 3204 181
male 724 2452 95
rock 658 2619 198
synth 498 1889 105
electronic 490 1878 131
pop 479 1761 151
bass 417 1632 99
vocal 355 1378 99
female 342 1387 100
dance 322 1244 115
techno 246 943 104
piano 179 826 120
electronica 168 686 67
hip hop 166 701 126
voice 160 790 55
slow 157 727 90
beat 154 708 90
rap 151 723 129
jazz 136 735 154
80s 130 601 94
fast 109 494 70
instrumental 103 539 62
drum machine 89 427 35
british 81 383 60
country 74 360 105
distortion 73 366 55
saxophone 70 316 86
house 65 298 66
ambient 61 335 78
soft 61 351 58
silence 57 200 35
r&b 57 242 59
strings 55 252 62
quiet 54 261 57
solo 53 268 56
keyboard 53 424 41
punk 51 242 76
horns 48 204 38
drum and bass 48 191 50
noise 46 249 61
funk 46 266 90
acoustic 40 193 58
trumpet 39 174 68
end 38 178 36
loud 37 218 62
organ 35 169 46
metal 35 178 64
folk 33 195 58
trance 33 226 49

Evaluation

Participating algorithms were evaluated with 3-fold cross-validation. Artist filtering was used in the production of the training and test splits, i.e., the training and test sets contained different artists.
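
One way to reproduce this kind of artist filter is grouped cross-validation, for example with scikit-learn's GroupKFold. The sketch below only illustrates the mechanics under assumed clip and artist identifiers; it is not the actual MIREX split-generation code.

```python
# Artist-filtered 3-fold cross-validation: clips by the same artist never
# appear in both the training and the test set of a fold.
from sklearn.model_selection import GroupKFold

clip_ids = ["clip_1", "clip_2", "clip_3", "clip_4", "clip_5", "clip_6"]
artists  = ["art_A",  "art_A",  "art_B",  "art_B",  "art_C",  "art_C"]

gkf = GroupKFold(n_splits=3)
for fold, (train_idx, test_idx) in enumerate(gkf.split(clip_ids, groups=artists)):
    train_artists = {artists[i] for i in train_idx}
    test_artists = {artists[i] for i in test_idx}
    assert train_artists.isdisjoint(test_artists)  # the artist filter
    print(fold, [clip_ids[i] for i in test_idx])
```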

Binary Evaluation

Algorithms are evaluated on their performance at tag classification using the F-measure. Results are also reported for simple accuracy; however, because this statistic is dominated by the negative-example accuracy, it is not a reliable indicator of performance (a system that returns no tags for any example will achieve a high score on it). The accuracies for positive and negative examples are also reported separately, as these can help elucidate the behaviour of an algorithm (for example, demonstrating whether a system is under- or over-predicting).
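
For concreteness, the per-tag statistics reported below can be computed along the following lines. This is a sketch assuming 0/1 label and prediction vectors for a single tag; the official evaluation code may differ in detail.

```python
# Per-tag binary statistics for one tag, given ground-truth labels and
# predictions (0/1) over all test clips.
def binary_tag_metrics(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall else 0.0)
    return {
        "f_measure": f_measure,
        "accuracy": (tp + tn) / len(y_true),             # dominated by negatives
        "positive_accuracy": tp / (tp + fn) if tp + fn else 0.0,
        "negative_accuracy": tn / (tn + fp) if tn + fp else 0.0,
    }

print(binary_tag_metrics([1, 0, 0, 1, 0, 0], [1, 0, 0, 0, 0, 1]))
```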

Affinity (ranking) Evaluation

Algorithms are evaluated on their performance at tag ranking using the Area Under the Receiver Operating Characteristic Curve (AUC-ROC). The affinity scores for each tag applied to a track are sorted prior to computing the AUC-ROC statistic, which gives higher scores to ranked tag sets in which the correct tags appear towards the top of the ranking.
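
A minimal sketch of this computation for a single tag, assuming scikit-learn and 0/1 ground-truth labels, is shown below; the per-clip precision-at-N figures reported with the affinity results can be computed in the same spirit.

```python
# Ranking evaluation for one tag: AUC-ROC of the affinity scores against the
# binary ground truth (toy values, using scikit-learn).
from sklearn.metrics import roc_auc_score

y_true = [1, 0, 1, 0, 0, 1]                   # does the tag apply to each clip?
affinities = [0.9, 0.2, 0.7, 0.4, 0.1, 0.6]   # one algorithm's affinity scores

print(roc_auc_score(y_true, affinities))      # 1.0: all positives ranked first

# Per-clip precision-at-N: fraction of the top-N ranked tags that are correct.
def precision_at_n(ranked_tags, relevant_tags, n):
    return sum(1 for t in ranked_tags[:n] if t in relevant_tags) / n

print(precision_at_n(["rock", "guitar", "jazz"], {"rock", "guitar", "male"}, 3))
```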


General Legend

Team ID

BP1 = Juan José Burred, Geoffroy Peeters
BP2 = Juan José Burred, Geoffroy Peeters
CC1 = Chuan Cao, Ming Li
CC2 = Chuan Cao, Ming Li
CC3 = Chuan Cao, Ming Li
CC4 = Chuan Cao, Ming Li
GP = Geoffroy Peeters
GT1 = George Tzanetakis
GT2 = George Tzanetakis
HBC = Matthew D. Hoffman, David M. Blei, Perry R. Cook
LWW1 = Hung-Yi Lo, Ju-Chiang Wang, Hsin-Min Wang
LWW2 = Hung-Yi Lo, Ju-Chiang Wang, Hsin-Min Wang

Overall Summary Results (Binary)

Measure BP1 BP2 CC1 CC2 CC3 CC4 GP GT1 GT2 HBC LWW1 LWW2
Average Tag F-measure 0.277 0.290 0.209 0.241 0.170 0.263 0.012 0.290 0.293 0.044 0.289 0.311
Average Tag Accuracy 0.868 0.859 0.912 0.905 0.913 0.890 0.891 0.850 0.850 0.914 0.900 0.903
Average Positive Tag Accuracy 0.368 0.394 0.204 0.256 0.146 0.304 0.019 0.440 0.446 0.035 0.295 0.317
Average Negative Tag Accuracy 0.865 0.861 0.944 0.922 0.963 0.902 0.978 0.849 0.849 0.989 0.938 0.940

download these results as csv


Summary Binary Relevance F-Measure (Average Across All Folds)

Tag Positive Examples Negative Examples BP1 BP2 CC1 CC2 CC3 CC4 GP GT1 GT2 HBC LWW1 LWW2
80s 117.000 2098.000 0.075 0.148 0.013 0.073 0.033 0.213 0.012 0.177 0.182 0.000 0.153 0.161
horns 40.000 2175.000 0.022 0.133 0.000 0.000 0.000 0.000 0.018 0.033 0.050 0.000 0.096 0.075
voice 148.000 2067.000 0.031 0.101 0.012 0.013 0.012 0.084 0.121 0.113 0.131 0.000 0.081 0.075
trumpet 39.000 2176.000 0.315 0.222 0.000 0.061 0.000 0.061 0.000 0.239 0.256 0.000 0.183 0.282
rap 157.000 2058.000 0.642 0.615 0.712 0.710 0.624 0.632 0.000 0.552 0.561 0.000 0.589 0.600
strings 55.000 2160.000 0.198 0.184 0.000 0.000 0.000 0.000 0.000 0.158 0.109 0.000 0.089 0.089
pop 479.000 1736.000 0.429 0.446 0.407 0.500 0.234 0.468 0.000 0.459 0.459 0.004 0.449 0.464
rock 664.000 1551.000 0.684 0.630 0.669 0.655 0.621 0.649 0.023 0.618 0.618 0.633 0.694 0.698
house 65.000 2150.000 0.124 0.212 0.028 0.098 0.000 0.315 0.038 0.297 0.277 0.000 0.247 0.263
punk 51.000 2164.000 0.314 0.321 0.000 0.087 0.000 0.219 0.000 0.183 0.196 0.000 0.213 0.332
ambient 60.000 2155.000 0.248 0.142 0.133 0.078 0.054 0.123 0.000 0.189 0.244 0.098 0.241 0.253
rb 38.000 2177.000 0.054 0.171 0.000 0.000 0.000 0.000 0.000 0.158 0.193 0.000 0.052 0.083
techno 248.000 1967.000 0.483 0.471 0.512 0.606 0.399 0.566 0.000 0.481 0.487 0.007 0.516 0.514
distortion 60.000 2155.000 0.082 0.142 0.000 0.000 0.000 0.035 0.000 0.200 0.222 0.000 0.216 0.204
funk 37.000 2178.000 0.000 0.025 0.000 0.000 0.000 0.000 0.016 0.036 0.054 0.000 0.139 0.108
solo 53.000 2162.000 0.095 0.133 0.000 0.035 0.000 0.037 0.000 0.138 0.164 0.000 0.096 0.137
metal 35.000 2180.000 0.133 0.188 0.000 0.000 0.000 0.000 0.035 0.229 0.190 0.000 0.306 0.284
saxophone 68.000 2147.000 0.541 0.424 0.574 0.673 0.452 0.633 0.005 0.353 0.333 0.000 0.456 0.484
acoustic 38.000 2177.000 0.107 0.297 0.100 0.048 0.000 0.126 0.000 0.123 0.088 0.000 0.101 0.130
synth 480.000 1735.000 0.457 0.464 0.450 0.469 0.344 0.451 0.003 0.478 0.472 0.038 0.436 0.443
drum 927.000 1288.000 0.607 0.587 0.552 0.591 0.481 0.599 0.000 0.601 0.589 0.347 0.521 0.532
slow 159.000 2056.000 0.252 0.313 0.113 0.214 0.147 0.271 0.000 0.327 0.327 0.000 0.313 0.378
keyboard 42.000 2173.000 0.000 0.053 0.000 0.000 0.000 0.000 0.000 0.111 0.127 0.000 0.049 0.099
beat 137.000 2078.000 0.250 0.257 0.035 0.134 0.017 0.172 0.019 0.287 0.287 0.000 0.241 0.243
piano 185.000 2030.000 0.381 0.339 0.331 0.387 0.280 0.376 0.000 0.407 0.371 0.000 0.359 0.429
vocal 270.000 1945.000 0.243 0.235 0.077 0.186 0.051 0.249 0.000 0.257 0.267 0.007 0.200 0.215
british 82.000 2133.000 0.048 0.094 0.000 0.000 0.000 0.022 0.000 0.098 0.122 0.000 0.073 0.099
jazz 138.000 2077.000 0.538 0.491 0.641 0.667 0.580 0.555 0.000 0.478 0.473 0.000 0.577 0.569
male 738.000 1477.000 0.614 0.573 0.621 0.634 0.528 0.599 0.000 0.591 0.598 0.082 0.575 0.590
quiet 43.000 2172.000 0.251 0.177 0.148 0.044 0.083 0.165 0.044 0.248 0.264 0.000 0.210 0.207
electronica 166.000 2049.000 0.268 0.276 0.082 0.249 0.056 0.270 0.030 0.317 0.309 0.000 0.307 0.319
electronic 485.000 1730.000 0.571 0.552 0.609 0.612 0.527 0.587 0.000 0.522 0.528 0.159 0.577 0.601
noise 42.000 2173.000 0.167 0.290 0.000 0.000 0.000 0.093 0.000 0.143 0.127 0.000 0.170 0.269
country 74.000 2141.000 0.263 0.228 0.000 0.058 0.000 0.112 0.000 0.180 0.216 0.000 0.120 0.165
soft 62.000 2153.000 0.044 0.156 0.000 0.030 0.000 0.000 0.000 0.183 0.161 0.000 0.174 0.225
loud 37.000 2178.000 0.121 0.141 0.000 0.000 0.000 0.056 0.000 0.198 0.198 0.000 0.212 0.189
organ 35.000 2180.000 0.000 0.053 0.000 0.000 0.000 0.000 0.000 0.076 0.076 0.000 0.054 0.032
female 345.000 1870.000 0.451 0.412 0.513 0.530 0.324 0.384 0.029 0.385 0.402 0.016 0.437 0.457
hiphop 152.000 2063.000 0.571 0.572 0.646 0.664 0.539 0.617 0.000 0.513 0.522 0.000 0.477 0.575
bass 420.000 1795.000 0.323 0.287 0.139 0.264 0.095 0.283 0.111 0.302 0.310 0.000 0.271 0.290
instrumental 102.000 2113.000 0.000 0.066 0.019 0.000 0.000 0.054 0.022 0.092 0.111 0.000 0.099 0.099
guitar 849.000 1366.000 0.672 0.642 0.656 0.660 0.618 0.657 0.000 0.643 0.634 0.555 0.664 0.676
fast 109.000 2106.000 0.171 0.151 0.047 0.151 0.068 0.303 0.020 0.183 0.165 0.000 0.243 0.283
drummachine 89.000 2126.000 0.090 0.144 0.019 0.052 0.000 0.181 0.000 0.187 0.217 0.000 0.152 0.168
dance 324.000 1891.000 0.524 0.488 0.565 0.629 0.502 0.604 0.000 0.508 0.512 0.050 0.575 0.594

download these results as csv

Summary Binary Accuracy (Average Across All Folds)

Tag Positive Examples Negative Examples BP1 BP2 CC1 CC2 CC3 CC4 GP GT1 GT2 HBC LWW1 LWW2
80s 117.000 2098.000 0.912 0.873 0.945 0.946 0.946 0.927 0.913 0.870 0.870 0.947 0.910 0.911
horns 40.000 2175.000 0.971 0.971 0.982 0.982 0.982 0.982 0.915 0.947 0.948 0.982 0.967 0.967
voice 148.000 2067.000 0.920 0.846 0.929 0.928 0.932 0.896 0.652 0.822 0.825 0.933 0.877 0.876
trumpet 39.000 2176.000 0.981 0.969 0.982 0.982 0.982 0.982 0.978 0.960 0.961 0.982 0.971 0.974
rap 157.000 2058.000 0.941 0.935 0.965 0.961 0.957 0.942 0.929 0.905 0.907 0.928 0.942 0.943
strings 55.000 2160.000 0.960 0.952 0.972 0.975 0.972 0.971 0.975 0.937 0.934 0.975 0.955 0.955
pop 479.000 1736.000 0.480 0.615 0.759 0.713 0.747 0.667 0.784 0.648 0.648 0.784 0.761 0.769
rock 664.000 1551.000 0.744 0.679 0.764 0.717 0.779 0.715 0.698 0.655 0.655 0.804 0.816 0.819
house 65.000 2150.000 0.962 0.951 0.971 0.969 0.971 0.961 0.877 0.938 0.936 0.971 0.956 0.957
punk 51.000 2164.000 0.964 0.963 0.977 0.978 0.977 0.979 0.977 0.943 0.945 0.977 0.964 0.969
ambient 60.000 2155.000 0.969 0.952 0.968 0.973 0.966 0.961 0.973 0.934 0.938 0.974 0.959 0.960
rb 38.000 2177.000 0.973 0.967 0.983 0.982 0.982 0.981 0.983 0.957 0.958 0.983 0.967 0.968
techno 248.000 1967.000 0.851 0.837 0.911 0.911 0.902 0.890 0.889 0.824 0.826 0.889 0.892 0.892
distortion 60.000 2155.000 0.970 0.941 0.973 0.973 0.973 0.970 0.973 0.935 0.937 0.972 0.957 0.957
funk 37.000 2178.000 0.982 0.974 0.983 0.983 0.983 0.981 0.960 0.952 0.953 0.983 0.971 0.970
solo 53.000 2162.000 0.959 0.953 0.976 0.975 0.976 0.975 0.976 0.938 0.940 0.976 0.956 0.959
metal 35.000 2180.000 0.975 0.970 0.984 0.984 0.984 0.984 0.981 0.964 0.962 0.984 0.978 0.978
saxophone 68.000 2147.000 0.967 0.956 0.980 0.983 0.978 0.980 0.864 0.941 0.939 0.969 0.966 0.968
acoustic 38.000 2177.000 0.976 0.970 0.983 0.982 0.982 0.982 0.933 0.955 0.953 0.983 0.968 0.970
synth 480.000 1735.000 0.614 0.643 0.749 0.716 0.751 0.684 0.781 0.658 0.655 0.784 0.756 0.760
drum 927.000 1288.000 0.474 0.468 0.509 0.493 0.569 0.526 0.582 0.497 0.483 0.578 0.599 0.608
slow 159.000 2056.000 0.875 0.863 0.914 0.917 0.921 0.884 0.928 0.855 0.855 0.928 0.901 0.911
keyboard 42.000 2173.000 0.979 0.971 0.981 0.981 0.981 0.980 0.981 0.949 0.950 0.981 0.964 0.966
beat 137.000 2078.000 0.903 0.862 0.934 0.924 0.934 0.893 0.920 0.867 0.867 0.938 0.906 0.906
piano 185.000 2030.000 0.871 0.837 0.910 0.912 0.909 0.881 0.916 0.851 0.842 0.915 0.891 0.905
vocal 270.000 1945.000 0.632 0.693 0.858 0.830 0.866 0.783 0.874 0.728 0.732 0.877 0.805 0.809
british 82.000 2133.000 0.948 0.924 0.962 0.962 0.962 0.958 0.963 0.899 0.902 0.963 0.930 0.932
jazz 138.000 2077.000 0.930 0.928 0.960 0.965 0.955 0.940 0.937 0.903 0.902 0.937 0.947 0.946
male 738.000 1477.000 0.616 0.574 0.706 0.670 0.694 0.632 0.665 0.592 0.599 0.657 0.716 0.725
quiet 43.000 2172.000 0.973 0.964 0.979 0.981 0.980 0.982 0.971 0.956 0.957 0.980 0.969 0.969
electronica 166.000 2049.000 0.867 0.857 0.921 0.910 0.923 0.885 0.886 0.846 0.844 0.926 0.896 0.898
electronic 485.000 1730.000 0.739 0.727 0.812 0.803 0.805 0.776 0.782 0.683 0.687 0.792 0.816 0.826
noise 42.000 2173.000 0.973 0.963 0.981 0.981 0.979 0.978 0.969 0.951 0.950 0.981 0.969 0.973
country 74.000 2141.000 0.950 0.938 0.966 0.967 0.967 0.955 0.967 0.918 0.921 0.966 0.941 0.944
soft 62.000 2153.000 0.963 0.946 0.969 0.972 0.971 0.966 0.972 0.931 0.929 0.972 0.953 0.957
loud 37.000 2178.000 0.977 0.970 0.983 0.983 0.983 0.983 0.983 0.960 0.960 0.983 0.973 0.973
organ 35.000 2180.000 0.983 0.970 0.984 0.984 0.984 0.984 0.976 0.956 0.956 0.984 0.970 0.969
female 345.000 1870.000 0.750 0.731 0.866 0.858 0.837 0.750 0.834 0.711 0.719 0.846 0.825 0.832
hiphop 152.000 2063.000 0.930 0.928 0.958 0.956 0.951 0.940 0.931 0.900 0.902 0.930 0.928 0.942
bass 420.000 1795.000 0.369 0.529 0.765 0.690 0.779 0.629 0.782 0.602 0.607 0.809 0.724 0.731
instrumental 102.000 2113.000 0.946 0.898 0.950 0.953 0.951 0.937 0.903 0.874 0.877 0.954 0.916 0.917
guitar 849.000 1366.000 0.650 0.615 0.648 0.632 0.679 0.642 0.617 0.587 0.577 0.704 0.743 0.751
fast 109.000 2106.000 0.921 0.891 0.951 0.950 0.952 0.931 0.922 0.879 0.876 0.951 0.926 0.930
drummachine 89.000 2126.000 0.951 0.918 0.960 0.957 0.957 0.943 0.960 0.902 0.905 0.960 0.933 0.934
dance 324.000 1891.000 0.809 0.795 0.884 0.885 0.884 0.865 0.854 0.782 0.784 0.857 0.876 0.882

download these results as csv

Summary Positive Example Accuracy (Average Across All Folds)

Tag Positive Examples Negative Examples BP1 BP2 CC1 CC2 CC3 CC4 GP GT1 GT2 HBC LWW1 LWW2
80s 117.000 2098.000 0.090 0.226 0.007 0.042 0.020 0.204 0.021 0.259 0.273 0.000 0.157 0.169
horns 40.000 2175.000 0.028 0.125 0.000 0.000 0.000 0.000 0.058 0.050 0.078 0.000 0.087 0.078
voice 148.000 2067.000 0.025 0.136 0.007 0.008 0.007 0.077 0.361 0.167 0.192 0.000 0.081 0.072
trumpet 39.000 2176.000 0.259 0.261 0.000 0.037 0.000 0.037 0.000 0.386 0.406 0.000 0.196 0.298
rap 157.000 2058.000 0.756 0.759 0.627 0.686 0.510 0.727 0.000 0.836 0.851 0.000 0.617 0.628
strings 55.000 2160.000 0.207 0.225 0.000 0.000 0.000 0.000 0.000 0.244 0.172 0.000 0.088 0.090
pop 479.000 1736.000 0.879 0.720 0.383 0.666 0.179 0.679 0.000 0.690 0.691 0.002 0.452 0.465
rock 664.000 1551.000 0.917 0.909 0.790 0.892 0.603 0.875 0.013 0.928 0.928 0.564 0.693 0.698
house 65.000 2150.000 0.090 0.234 0.014 0.059 0.000 0.306 0.091 0.447 0.416 0.000 0.244 0.262
punk 51.000 2164.000 0.358 0.398 0.000 0.047 0.000 0.130 0.000 0.274 0.305 0.000 0.211 0.352
ambient 60.000 2155.000 0.200 0.156 0.094 0.044 0.041 0.122 0.000 0.280 0.362 0.057 0.259 0.268
rb 38.000 2177.000 0.059 0.206 0.000 0.000 0.000 0.000 0.000 0.262 0.308 0.000 0.061 0.111
techno 248.000 1967.000 0.643 0.665 0.429 0.617 0.299 0.659 0.000 0.720 0.730 0.004 0.529 0.525
distortion 60.000 2155.000 0.063 0.204 0.000 0.000 0.000 0.024 0.000 0.331 0.366 0.000 0.238 0.217
funk 37.000 2178.000 0.000 0.021 0.000 0.000 0.000 0.000 0.021 0.058 0.102 0.000 0.150 0.113
solo 53.000 2162.000 0.102 0.154 0.000 0.022 0.000 0.022 0.000 0.202 0.237 0.000 0.102 0.141
metal 35.000 2180.000 0.158 0.202 0.000 0.000 0.000 0.000 0.030 0.360 0.307 0.000 0.313 0.277
saxophone 68.000 2147.000 0.637 0.537 0.433 0.544 0.313 0.567 0.037 0.533 0.496 0.000 0.469 0.498
acoustic 38.000 2177.000 0.126 0.336 0.056 0.026 0.000 0.074 0.000 0.180 0.144 0.000 0.096 0.118
synth 480.000 1735.000 0.758 0.714 0.479 0.583 0.304 0.605 0.002 0.719 0.711 0.020 0.439 0.444
drum 927.000 1288.000 0.969 0.903 0.725 0.876 0.478 0.847 0.000 0.904 0.886 0.268 0.522 0.532
slow 159.000 2056.000 0.297 0.435 0.077 0.159 0.094 0.301 0.000 0.496 0.495 0.000 0.314 0.378
keyboard 42.000 2173.000 0.000 0.039 0.000 0.000 0.000 0.000 0.000 0.156 0.176 0.000 0.050 0.104
beat 137.000 2078.000 0.275 0.394 0.019 0.103 0.010 0.182 0.020 0.449 0.449 0.000 0.247 0.250
piano 185.000 2030.000 0.463 0.498 0.264 0.328 0.208 0.426 0.000 0.612 0.556 0.000 0.362 0.430
vocal 270.000 1945.000 0.501 0.386 0.048 0.159 0.030 0.296 0.000 0.386 0.397 0.004 0.200 0.215
british 82.000 2133.000 0.043 0.108 0.000 0.000 0.000 0.014 0.000 0.169 0.196 0.000 0.082 0.116
jazz 138.000 2077.000 0.676 0.564 0.586 0.573 0.502 0.614 0.000 0.718 0.707 0.000 0.592 0.582
male 738.000 1477.000 0.921 0.861 0.730 0.862 0.518 0.830 0.000 0.889 0.900 0.047 0.581 0.598
quiet 43.000 2172.000 0.238 0.219 0.094 0.024 0.048 0.094 0.024 0.377 0.403 0.000 0.211 0.208
electronica 166.000 2049.000 0.336 0.371 0.052 0.203 0.031 0.288 0.035 0.489 0.478 0.000 0.324 0.342
electronic 485.000 1730.000 0.804 0.781 0.679 0.720 0.503 0.741 0.000 0.788 0.796 0.091 0.583 0.608
noise 42.000 2173.000 0.144 0.387 0.000 0.000 0.000 0.063 0.000 0.226 0.207 0.000 0.179 0.275
country 74.000 2141.000 0.287 0.296 0.000 0.032 0.000 0.089 0.000 0.262 0.330 0.000 0.128 0.180
soft 62.000 2153.000 0.050 0.176 0.000 0.017 0.000 0.000 0.000 0.271 0.239 0.000 0.177 0.226
loud 37.000 2178.000 0.105 0.164 0.000 0.000 0.000 0.030 0.000 0.293 0.293 0.000 0.218 0.180
organ 35.000 2180.000 0.000 0.074 0.000 0.000 0.000 0.000 0.000 0.113 0.105 0.000 0.053 0.037
female 345.000 1870.000 0.684 0.632 0.474 0.537 0.264 0.523 0.021 0.600 0.630 0.008 0.454 0.474
hiphop 152.000 2063.000 0.702 0.719 0.563 0.650 0.422 0.728 0.000 0.778 0.793 0.000 0.504 0.597
bass 420.000 1795.000 0.797 0.500 0.100 0.296 0.061 0.390 0.082 0.453 0.470 0.000 0.273 0.291
instrumental 102.000 2113.000 0.000 0.077 0.010 0.000 0.000 0.039 0.039 0.137 0.167 0.000 0.100 0.098
guitar 849.000 1366.000 0.935 0.901 0.872 0.930 0.678 0.897 0.000 0.965 0.952 0.482 0.666 0.677
fast 109.000 2106.000 0.184 0.200 0.025 0.095 0.037 0.307 0.021 0.272 0.258 0.000 0.245 0.289
drummachine 89.000 2126.000 0.067 0.173 0.010 0.029 0.000 0.161 0.000 0.290 0.336 0.000 0.148 0.172
dance 324.000 1891.000 0.723 0.672 0.524 0.671 0.402 0.711 0.000 0.762 0.766 0.026 0.579 0.595

download these results as csv

Summary Negative Example Accuracy (Average Across All Folds)

Tag Positive Examples Negative Examples BP1 BP2 CC1 CC2 CC3 CC4 GP GT1 GT2 HBC LWW1 LWW2
80s 117.000 2098.000 0.960 0.911 0.998 0.997 0.998 0.969 0.964 0.904 0.904 1.000 0.953 0.953
horns 40.000 2175.000 0.988 0.987 1.000 1.000 1.000 1.000 0.931 0.964 0.964 1.000 0.983 0.983
voice 148.000 2067.000 0.984 0.898 0.996 0.995 0.999 0.955 0.673 0.869 0.871 1.000 0.934 0.934
trumpet 39.000 2176.000 0.995 0.982 1.000 1.000 1.000 0.999 0.995 0.971 0.971 1.000 0.985 0.987
rap 157.000 2058.000 0.956 0.949 0.991 0.983 0.992 0.958 1.000 0.911 0.912 0.999 0.969 0.969
strings 55.000 2160.000 0.979 0.971 0.997 1.000 0.997 0.995 1.000 0.955 0.953 1.000 0.977 0.977
pop 479.000 1736.000 0.362 0.586 0.864 0.727 0.905 0.664 1.000 0.637 0.637 0.999 0.847 0.853
rock 664.000 1551.000 0.671 0.581 0.753 0.642 0.855 0.647 0.991 0.538 0.538 0.907 0.869 0.871
house 65.000 2150.000 0.988 0.973 1.000 0.996 1.000 0.980 0.901 0.953 0.952 1.000 0.977 0.978
punk 51.000 2164.000 0.978 0.977 1.000 1.000 1.000 0.999 1.000 0.959 0.960 1.000 0.982 0.984
ambient 60.000 2155.000 0.990 0.974 0.992 0.999 0.991 0.985 1.000 0.952 0.954 1.000 0.979 0.979
rb 38.000 2177.000 0.989 0.980 1.000 1.000 1.000 0.998 1.000 0.969 0.970 1.000 0.984 0.984
techno 248.000 1967.000 0.875 0.858 0.971 0.948 0.979 0.919 1.000 0.838 0.839 1.000 0.939 0.939
distortion 60.000 2155.000 0.996 0.963 1.000 1.000 1.000 0.997 1.000 0.953 0.954 0.999 0.977 0.978
funk 37.000 2178.000 0.999 0.990 1.000 1.000 1.000 0.997 0.976 0.967 0.967 1.000 0.985 0.985
solo 53.000 2162.000 0.980 0.973 1.000 0.999 1.000 0.999 1.000 0.956 0.957 1.000 0.977 0.979
metal 35.000 2180.000 0.988 0.982 1.000 1.000 1.000 1.000 0.997 0.973 0.972 1.000 0.989 0.989
saxophone 68.000 2147.000 0.977 0.969 0.998 0.997 0.999 0.993 0.891 0.954 0.953 1.000 0.982 0.984
acoustic 38.000 2177.000 0.991 0.982 1.000 0.999 1.000 0.997 0.949 0.968 0.967 1.000 0.984 0.985
synth 480.000 1735.000 0.572 0.623 0.823 0.752 0.875 0.705 0.995 0.644 0.641 0.995 0.844 0.847
drum 927.000 1288.000 0.119 0.156 0.354 0.219 0.634 0.296 1.000 0.206 0.194 0.802 0.654 0.663
slow 159.000 2056.000 0.919 0.896 0.979 0.975 0.985 0.929 1.000 0.883 0.883 0.999 0.946 0.952
keyboard 42.000 2173.000 0.998 0.989 1.000 1.000 1.000 0.999 1.000 0.964 0.965 1.000 0.981 0.983
beat 137.000 2078.000 0.944 0.894 0.994 0.977 0.994 0.939 0.979 0.896 0.896 0.999 0.950 0.950
piano 185.000 2030.000 0.909 0.868 0.968 0.965 0.973 0.923 1.000 0.873 0.868 0.998 0.939 0.948
vocal 270.000 1945.000 0.648 0.736 0.971 0.923 0.983 0.850 0.995 0.776 0.778 0.999 0.889 0.891
british 82.000 2133.000 0.984 0.956 1.000 1.000 1.000 0.995 1.000 0.929 0.930 1.000 0.964 0.965
jazz 138.000 2077.000 0.946 0.952 0.985 0.991 0.986 0.962 1.000 0.916 0.915 1.000 0.972 0.972
male 738.000 1477.000 0.463 0.432 0.696 0.574 0.781 0.534 1.000 0.445 0.450 0.967 0.789 0.796
quiet 43.000 2172.000 0.988 0.979 0.997 1.000 0.999 0.999 0.990 0.968 0.968 1.000 0.984 0.984
electronica 166.000 2049.000 0.910 0.895 0.992 0.966 0.994 0.932 0.953 0.876 0.875 1.000 0.943 0.945
electronic 485.000 1730.000 0.721 0.713 0.850 0.827 0.889 0.786 1.000 0.658 0.660 0.989 0.882 0.889
noise 42.000 2173.000 0.989 0.975 1.000 1.000 0.998 0.995 0.987 0.965 0.965 1.000 0.984 0.986
country 74.000 2141.000 0.973 0.961 1.000 0.999 1.000 0.985 1.000 0.940 0.942 1.000 0.969 0.971
soft 62.000 2153.000 0.989 0.968 0.997 0.999 0.999 0.994 1.000 0.950 0.949 1.000 0.976 0.978
loud 37.000 2178.000 0.992 0.983 1.000 1.000 1.000 0.999 1.000 0.971 0.971 1.000 0.986 0.986
organ 35.000 2180.000 0.998 0.984 1.000 1.000 1.000 1.000 0.992 0.969 0.970 1.000 0.985 0.984
female 345.000 1870.000 0.767 0.754 0.943 0.920 0.946 0.797 0.985 0.737 0.742 1.000 0.897 0.901
hiphop 152.000 2063.000 0.947 0.945 0.988 0.979 0.990 0.956 1.000 0.910 0.911 0.999 0.961 0.969
bass 420.000 1795.000 0.268 0.538 0.920 0.783 0.946 0.686 0.943 0.637 0.640 0.998 0.829 0.834
instrumental 102.000 2113.000 0.991 0.938 0.996 1.000 0.997 0.981 0.945 0.910 0.911 1.000 0.956 0.957
guitar 849.000 1366.000 0.472 0.437 0.509 0.447 0.681 0.483 1.000 0.353 0.345 0.841 0.791 0.798
fast 109.000 2106.000 0.959 0.926 0.999 0.994 0.999 0.963 0.969 0.910 0.909 1.000 0.961 0.963
drummachine 89.000 2126.000 0.987 0.949 0.999 0.995 0.997 0.975 1.000 0.927 0.929 1.000 0.965 0.965
dance 324.000 1891.000 0.823 0.816 0.944 0.920 0.965 0.890 1.000 0.787 0.787 0.999 0.927 0.931

download these results as csv

Overall Summary Results (Affinity)

Measure BP1 BP2 CC1 CC2 CC3 CC4 GT1 GT2 HBC LWW1 LWW2
Average AUC-ROC Tag 0.742 0.761 0.762 0.791 0.721 0.749 0.784 0.786 0.736 0.782 0.807
Average AUC-ROC Clip 0.871 0.861 0.882 0.882 0.854 0.854 0.872 0.876 0.851 0.751 0.774
Precision at 3 0.534 0.476 0.539 0.539 0.511 0.511 0.507 0.511 0.454 0.298 0.324
Precision at 6 0.411 0.381 0.419 0.419 0.399 0.399 0.399 0.403 0.370 0.259 0.273
Precision at 9 0.325 0.313 0.332 0.331 0.316 0.316 0.321 0.323 0.306 0.225 0.239
Precision at 12 0.269 0.264 0.274 0.274 0.261 0.261 0.267 0.271 0.259 0.201 0.212
Precision at 15 0.226 0.227 0.233 0.233 0.221 0.221 0.229 0.232 0.224 0.182 0.189

download these results as csv

Summary AUC-ROC Tag (Average Across All Folds)

Tag BP1 BP2 CC1 CC2 CC3 CC4 GT1 GT2 HBC LWW1 LWW2
80s 0.679 0.674 0.707 0.766 0.706 0.738 0.702 0.711 0.673 0.713 0.757
horns 0.680 0.700 0.620 0.638 0.607 0.550 0.666 0.644 0.554 0.694 0.748
voice 0.580 0.563 0.550 0.592 0.566 0.612 0.625 0.622 0.560 0.522 0.560
trumpet 0.713 0.748 0.714 0.775 0.704 0.669 0.763 0.731 0.619 0.792 0.778
rap 0.913 0.921 0.941 0.945 0.915 0.920 0.934 0.940 0.835 0.938 0.937
strings 0.816 0.836 0.833 0.850 0.730 0.769 0.797 0.802 0.790 0.804 0.843
pop 0.736 0.709 0.734 0.775 0.691 0.732 0.715 0.724 0.705 0.759 0.778
rock 0.891 0.846 0.850 0.877 0.826 0.855 0.857 0.862 0.849 0.870 0.890
house 0.745 0.784 0.804 0.829 0.786 0.783 0.854 0.860 0.852 0.870 0.859
punk 0.880 0.840 0.838 0.895 0.752 0.846 0.843 0.864 0.823 0.849 0.904
ambient 0.702 0.803 0.806 0.798 0.753 0.751 0.793 0.800 0.667 0.746 0.809
rb 0.703 0.775 0.761 0.789 0.620 0.680 0.832 0.843 0.781 0.748 0.816
techno 0.825 0.850 0.900 0.908 0.877 0.879 0.859 0.863 0.821 0.894 0.877
distortion 0.793 0.755 0.689 0.779 0.678 0.753 0.833 0.833 0.810 0.821 0.819
funk 0.567 0.752 0.700 0.703 0.705 0.674 0.749 0.723 0.727 0.745 0.788
solo 0.779 0.751 0.697 0.812 0.556 0.689 0.771 0.789 0.653 0.729 0.765
metal 0.808 0.903 0.837 0.913 0.720 0.829 0.880 0.884 0.873 0.904 0.941
saxophone 0.892 0.867 0.917 0.920 0.885 0.902 0.870 0.876 0.748 0.891 0.899
acoustic 0.626 0.862 0.829 0.885 0.704 0.797 0.854 0.855 0.820 0.818 0.884
synth 0.725 0.711 0.718 0.717 0.684 0.683 0.739 0.733 0.713 0.733 0.747
drum 0.640 0.576 0.534 0.596 0.578 0.617 0.613 0.600 0.606 0.637 0.653
slow 0.729 0.778 0.776 0.785 0.746 0.762 0.806 0.797 0.794 0.814 0.847
keyboard 0.552 0.588 0.581 0.541 0.521 0.502 0.733 0.706 0.620 0.636 0.660
beat 0.744 0.788 0.770 0.773 0.739 0.716 0.815 0.816 0.830 0.809 0.828
piano 0.783 0.771 0.813 0.824 0.773 0.777 0.818 0.817 0.789 0.814 0.836
vocal 0.613 0.611 0.630 0.668 0.608 0.644 0.652 0.652 0.647 0.622 0.656
british 0.729 0.712 0.596 0.680 0.541 0.624 0.714 0.750 0.697 0.669 0.691
jazz 0.869 0.850 0.927 0.934 0.892 0.898 0.895 0.895 0.772 0.909 0.909
male 0.787 0.718 0.782 0.814 0.729 0.763 0.752 0.764 0.645 0.766 0.778
quiet 0.850 0.893 0.830 0.833 0.793 0.795 0.904 0.890 0.841 0.851 0.888
electronica 0.741 0.771 0.816 0.822 0.767 0.755 0.792 0.785 0.756 0.804 0.835
electronic 0.840 0.816 0.848 0.855 0.823 0.840 0.799 0.802 0.748 0.831 0.838
noise 0.756 0.820 0.844 0.902 0.809 0.860 0.800 0.829 0.765 0.807 0.867
country 0.804 0.869 0.772 0.827 0.727 0.774 0.781 0.817 0.745 0.796 0.823
soft 0.713 0.778 0.747 0.752 0.684 0.743 0.848 0.845 0.829 0.840 0.846
loud 0.811 0.816 0.807 0.884 0.765 0.856 0.916 0.905 0.882 0.890 0.917
organ 0.548 0.647 0.613 0.614 0.550 0.599 0.648 0.634 0.566 0.616 0.648
female 0.801 0.762 0.822 0.829 0.709 0.724 0.739 0.748 0.665 0.773 0.792
hiphop 0.900 0.913 0.930 0.932 0.910 0.914 0.916 0.924 0.848 0.918 0.934
bass 0.568 0.539 0.532 0.568 0.549 0.565 0.579 0.586 0.575 0.612 0.618
instrumental 0.609 0.600 0.719 0.722 0.672 0.675 0.647 0.676 0.507 0.695 0.730
guitar 0.833 0.748 0.741 0.803 0.743 0.786 0.791 0.798 0.765 0.811 0.838
fast 0.680 0.717 0.780 0.806 0.767 0.806 0.738 0.749 0.761 0.817 0.829
drummachine 0.582 0.694 0.729 0.749 0.720 0.704 0.789 0.775 0.774 0.744 0.783
dance 0.847 0.837 0.891 0.898 0.872 0.878 0.853 0.856 0.841 0.879 0.886

download these results as csv

Select Friedman's Test Results

Tag F-measure (Binary) Friedman Test

The following table and plot show the results of Friedman's ANOVA with Tukey-Kramer multiple comparisons computed over the F-measure for each tag in the test, averaged over all folds.

TeamID TeamID Lowerbound Mean Upperbound Significance
LWW2 GT2 -1.754 0.689 3.131 FALSE
LWW2 GT1 -1.509 0.933 3.376 FALSE
LWW2 BP2 -0.709 1.733 4.176 FALSE
LWW2 LWW1 -1.109 1.333 3.776 FALSE
LWW2 BP1 -0.843 1.600 4.043 FALSE
LWW2 CC4 -0.265 2.178 4.620 FALSE
LWW2 CC2 0.602 3.044 5.487 TRUE
LWW2 CC1 1.835 4.278 6.720 TRUE
LWW2 CC3 3.613 6.056 8.498 TRUE
LWW2 HBC 4.780 7.222 9.665 TRUE
LWW2 GP 4.357 6.800 9.242 TRUE
GT2 GT1 -2.198 0.244 2.687 FALSE
GT2 BP2 -1.398 1.044 3.487 FALSE
GT2 LWW1 -1.798 0.644 3.087 FALSE
GT2 BP1 -1.531 0.911 3.354 FALSE
GT2 CC4 -0.954 1.489 3.931 FALSE
GT2 CC2 -0.087 2.356 4.798 FALSE
GT2 CC1 1.146 3.589 6.031 TRUE
GT2 CC3 2.924 5.367 7.809 TRUE
GT2 HBC 4.091 6.533 8.976 TRUE
GT2 GP 3.669 6.111 8.554 TRUE
GT1 BP2 -1.643 0.800 3.243 FALSE
GT1 LWW1 -2.042 0.400 2.842 FALSE
GT1 BP1 -1.776 0.667 3.109 FALSE
GT1 CC4 -1.198 1.244 3.687 FALSE
GT1 CC2 -0.331 2.111 4.554 FALSE
GT1 CC1 0.902 3.344 5.787 TRUE
GT1 CC3 2.680 5.122 7.565 TRUE
GT1 HBC 3.846 6.289 8.731 TRUE
GT1 GP 3.424 5.867 8.309 TRUE
BP2 LWW1 -2.842 -0.400 2.042 FALSE
BP2 BP1 -2.576 -0.133 2.309 FALSE
BP2 CC4 -1.998 0.444 2.887 FALSE
BP2 CC2 -1.131 1.311 3.754 FALSE
BP2 CC1 0.102 2.544 4.987 TRUE
BP2 CC3 1.880 4.322 6.765 TRUE
BP2 HBC 3.046 5.489 7.931 TRUE
BP2 GP 2.624 5.067 7.509 TRUE
LWW1 BP1 -2.176 0.267 2.709 FALSE
LWW1 CC4 -1.598 0.844 3.287 FALSE
LWW1 CC2 -0.731 1.711 4.154 FALSE
LWW1 CC1 0.502 2.944 5.387 TRUE
LWW1 CC3 2.280 4.722 7.165 TRUE
LWW1 HBC 3.446 5.889 8.331 TRUE
LWW1 GP 3.024 5.467 7.909 TRUE
BP1 CC4 -1.865 0.578 3.020 FALSE
BP1 CC2 -0.998 1.444 3.887 FALSE
BP1 CC1 0.235 2.678 5.120 TRUE
BP1 CC3 2.013 4.456 6.898 TRUE
BP1 HBC 3.180 5.622 8.065 TRUE
BP1 GP 2.757 5.200 7.643 TRUE
CC4 CC2 -1.576 0.867 3.309 FALSE
CC4 CC1 -0.343 2.100 4.543 FALSE
CC4 CC3 1.435 3.878 6.320 TRUE
CC4 HBC 2.602 5.044 7.487 TRUE
CC4 GP 2.180 4.622 7.065 TRUE
CC2 CC1 -1.209 1.233 3.676 FALSE
CC2 CC3 0.569 3.011 5.454 TRUE
CC2 HBC 1.735 4.178 6.620 TRUE
CC2 GP 1.313 3.756 6.198 TRUE
CC1 CC3 -0.665 1.778 4.220 FALSE
CC1 HBC 0.502 2.944 5.387 TRUE
CC1 GP 0.080 2.522 4.965 TRUE
CC3 HBC -1.276 1.167 3.609 FALSE
CC3 GP -1.698 0.744 3.187 FALSE
HBC GP -2.865 -0.422 2.020 FALSE

download these results as csv


https://music-ir.org/mirex/results/2009/tag/MajorMiner/small.binary_FMeasure.friedman.tukeyKramerHSD.png
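
For readers who want to reproduce this style of analysis, the sketch below runs a Friedman test over per-tag F-measures with SciPy, using the LWW2, GT2, and CC3 columns for the tags 80s, horns, voice, and trumpet from the table above purely as illustration. The Tukey-Kramer post-hoc comparisons reported here are not reproduced; they would be computed separately on the resulting ranks.

```python
# Friedman test over per-tag F-measures: rows are tags, one argument per system.
# Values are taken from the summary table above for four tags only, so this is
# illustrative rather than a re-run of the full MIREX analysis.
from scipy.stats import friedmanchisquare

lww2 = [0.161, 0.075, 0.075, 0.282]   # 80s, horns, voice, trumpet
gt2 = [0.182, 0.050, 0.131, 0.256]
cc3 = [0.033, 0.000, 0.012, 0.000]

stat, p_value = friedmanchisquare(lww2, gt2, cc3)
print(stat, p_value)  # a small p-value indicates the systems' F-measures differ
```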

Per Track F-measure (Binary) Friedman Test

The following table and plot show the results of Friedman's ANOVA with Tukey-Kramer multiple comparisons computed over the F-measure for each track in the test, averaged over all folds.

TeamID TeamID Lowerbound Mean Upperbound Significance
CC2 CC1 0.102 0.442 0.782 TRUE
CC2 CC4 0.274 0.614 0.954 TRUE
CC2 BP1 0.260 0.601 0.941 TRUE
CC2 GT2 0.664 1.004 1.344 TRUE
CC2 GT1 0.709 1.048 1.389 TRUE
CC2 BP2 0.792 1.132 1.472 TRUE
CC2 LWW2 1.020 1.360 1.700 TRUE
CC2 LWW1 1.167 1.507 1.847 TRUE
CC2 CC3 1.494 1.834 2.174 TRUE
CC2 HBC 4.402 4.742 5.082 TRUE
CC2 GP 5.684 6.024 6.364 TRUE
CC1 CC4 -0.168 0.172 0.512 FALSE
CC1 BP1 -0.181 0.159 0.499 FALSE
CC1 GT2 0.222 0.562 0.903 TRUE
CC1 GT1 0.267 0.607 0.947 TRUE
CC1 BP2 0.350 0.690 1.030 TRUE
CC1 LWW2 0.578 0.918 1.259 TRUE
CC1 LWW1 0.725 1.065 1.405 TRUE
CC1 CC3 1.052 1.392 1.732 TRUE
CC1 HBC 3.961 4.301 4.641 TRUE
CC1 GP 5.243 5.583 5.923 TRUE
CC4 BP1 -0.353 -0.013 0.327 FALSE
CC4 GT2 0.050 0.390 0.730 TRUE
CC4 GT1 0.095 0.435 0.775 TRUE
CC4 BP2 0.178 0.518 0.858 TRUE
CC4 LWW2 0.406 0.746 1.086 TRUE
CC4 LWW1 0.553 0.893 1.233 TRUE
CC4 CC3 0.880 1.220 1.560 TRUE
CC4 HBC 3.789 4.129 4.469 TRUE
CC4 GP 5.071 5.411 5.751 TRUE
BP1 GT2 0.064 0.404 0.744 TRUE
BP1 GT1 0.108 0.448 0.788 TRUE
BP1 BP2 0.191 0.531 0.872 TRUE
BP1 LWW2 0.419 0.760 1.100 TRUE
BP1 LWW1 0.567 0.906 1.247 TRUE
BP1 CC3 0.893 1.233 1.573 TRUE
BP1 HBC 3.802 4.142 4.482 TRUE
BP1 GP 5.084 5.424 5.764 TRUE
GT2 GT1 -0.296 0.044 0.385 FALSE
GT2 BP2 -0.212 0.128 0.468 FALSE
GT2 LWW2 0.016 0.356 0.696 TRUE
GT2 LWW1 0.163 0.503 0.843 TRUE
GT2 CC3 0.490 0.830 1.170 TRUE
GT2 HBC 3.398 3.738 4.079 TRUE
GT2 GP 4.680 5.020 5.360 TRUE
GT1 BP2 -0.257 0.083 0.423 FALSE
GT1 LWW2 -0.029 0.311 0.652 FALSE
GT1 LWW1 0.118 0.459 0.798 TRUE
GT1 CC3 0.445 0.785 1.125 TRUE
GT1 HBC 3.354 3.694 4.034 TRUE
GT1 GP 4.636 4.976 5.316 TRUE
BP2 LWW2 -0.112 0.228 0.568 FALSE
BP2 LWW1 0.035 0.375 0.715 TRUE
BP2 CC3 0.362 0.702 1.042 TRUE
BP2 HBC 3.271 3.611 3.951 TRUE
BP2 GP 4.553 4.893 5.233 TRUE
LWW2 LWW1 -0.193 0.147 0.487 FALSE
LWW2 CC3 0.134 0.474 0.814 TRUE
LWW2 HBC 3.042 3.382 3.723 TRUE
LWW2 GP 4.324 4.664 5.004 TRUE
LWW1 CC3 -0.013 0.327 0.667 FALSE
LWW1 HBC 2.895 3.235 3.575 TRUE
LWW1 GP 4.177 4.517 4.857 TRUE
CC3 HBC 2.568 2.909 3.249 TRUE
CC3 GP 3.850 4.191 4.531 TRUE
HBC GP 0.942 1.282 1.622 TRUE

download these results as csv


https://music-ir.org/mirex/results/2009/tag/MajorMiner/small.binary_FMeasure_per_track.friedman.tukeyKramerHSD.png

Tag AUC-ROC (Affinity) Friedman Test

The following table and plot show the results of Friedman's ANOVA with Tukey-Kramer multiple comparisons computed over the Area Under the ROC curve (AUC-ROC) for each tag in the test, averaged over all folds.

TeamID TeamID Lowerbound Mean Upperbound Significance
LWW2 CC2 -1.050 1.200 3.450 FALSE
LWW2 GT2 -0.162 2.089 4.339 FALSE
LWW2 GT1 0.349 2.600 4.851 TRUE
LWW2 LWW1 0.016 2.267 4.517 TRUE
LWW2 CC1 1.594 3.844 6.095 TRUE
LWW2 BP2 2.372 4.622 6.873 TRUE
LWW2 CC4 2.861 5.111 7.362 TRUE
LWW2 BP1 2.572 4.822 7.073 TRUE
LWW2 HBC 3.372 5.622 7.873 TRUE
LWW2 CC3 4.438 6.689 8.939 TRUE
CC2 GT2 -1.362 0.889 3.139 FALSE
CC2 GT1 -0.851 1.400 3.651 FALSE
CC2 LWW1 -1.184 1.067 3.317 FALSE
CC2 CC1 0.394 2.644 4.895 TRUE
CC2 BP2 1.172 3.422 5.673 TRUE
CC2 CC4 1.661 3.911 6.162 TRUE
CC2 BP1 1.372 3.622 5.873 TRUE
CC2 HBC 2.172 4.422 6.673 TRUE
CC2 CC3 3.238 5.489 7.739 TRUE
GT2 GT1 -1.739 0.511 2.762 FALSE
GT2 LWW1 -2.073 0.178 2.428 FALSE
GT2 CC1 -0.495 1.756 4.006 FALSE
GT2 BP2 0.283 2.533 4.784 TRUE
GT2 CC4 0.772 3.022 5.273 TRUE
GT2 BP1 0.483 2.733 4.984 TRUE
GT2 HBC 1.283 3.533 5.784 TRUE
GT2 CC3 2.349 4.600 6.851 TRUE
GT1 LWW1 -2.584 -0.333 1.917 FALSE
GT1 CC1 -1.006 1.244 3.495 FALSE
GT1 BP2 -0.228 2.022 4.273 FALSE
GT1 CC4 0.261 2.511 4.762 TRUE
GT1 BP1 -0.028 2.222 4.473 FALSE
GT1 HBC 0.772 3.022 5.273 TRUE
GT1 CC3 1.838 4.089 6.339 TRUE
LWW1 CC1 -0.673 1.578 3.828 FALSE
LWW1 BP2 0.105 2.356 4.606 TRUE
LWW1 CC4 0.594 2.844 5.095 TRUE
LWW1 BP1 0.305 2.556 4.806 TRUE
LWW1 HBC 1.105 3.356 5.606 TRUE
LWW1 CC3 2.172 4.422 6.673 TRUE
CC1 BP2 -1.473 0.778 3.028 FALSE
CC1 CC4 -0.984 1.267 3.517 FALSE
CC1 BP1 -1.273 0.978 3.228 FALSE
CC1 HBC -0.473 1.778 4.028 FALSE
CC1 CC3 0.594 2.844 5.095 TRUE
BP2 CC4 -1.762 0.489 2.739 FALSE
BP2 BP1 -2.050 0.200 2.450 FALSE
BP2 HBC -1.250 1.000 3.251 FALSE
BP2 CC3 -0.184 2.067 4.317 FALSE
CC4 BP1 -2.539 -0.289 1.962 FALSE
CC4 HBC -1.739 0.511 2.762 FALSE
CC4 CC3 -0.673 1.578 3.828 FALSE
BP1 HBC -1.450 0.800 3.050 FALSE
BP1 CC3 -0.384 1.867 4.117 FALSE
HBC CC3 -1.184 1.067 3.317 FALSE

download these results as csv

https://music-ir.org/mirex/results/2009/tag/MajorMiner/small.affinity.AUC_ROC_TAG.friedman.tukeyKramerHSD.png

Per Track AUC-ROC (Affinity) Friedman Test

The following table and plot show the results of Friedman's ANOVA with Tukey-Kramer multiple comparisons computed over the Area Under the ROC curve (AUC-ROC) for each track/clip in the test, averaged over all folds.

TeamID TeamID Lowerbound Mean Upperbound Significance
CC1 CC2 -0.311 0.003 0.317 FALSE
CC1 GT2 -0.214 0.100 0.414 FALSE
CC1 GT1 0.224 0.539 0.853 TRUE
CC1 BP1 0.133 0.448 0.762 TRUE
CC1 BP2 0.726 1.040 1.355 TRUE
CC1 CC4 0.725 1.040 1.354 TRUE
CC1 CC3 0.729 1.044 1.358 TRUE
CC1 HBC 0.977 1.291 1.606 TRUE
CC1 LWW2 2.630 2.945 3.259 TRUE
CC1 LWW1 3.096 3.410 3.725 TRUE
CC2 GT2 -0.217 0.097 0.411 FALSE
CC2 GT1 0.221 0.536 0.850 TRUE
CC2 BP1 0.130 0.445 0.759 TRUE
CC2 BP2 0.723 1.038 1.352 TRUE
CC2 CC4 0.722 1.037 1.351 TRUE
CC2 CC3 0.726 1.041 1.355 TRUE
CC2 HBC 0.974 1.288 1.603 TRUE
CC2 LWW2 2.627 2.942 3.256 TRUE
CC2 LWW1 3.093 3.407 3.722 TRUE
GT2 GT1 0.124 0.439 0.753 TRUE
GT2 BP1 0.033 0.348 0.662 TRUE
GT2 BP2 0.626 0.940 1.255 TRUE
GT2 CC4 0.625 0.940 1.254 TRUE
GT2 CC3 0.629 0.944 1.258 TRUE
GT2 HBC 0.877 1.191 1.506 TRUE
GT2 LWW2 2.530 2.845 3.159 TRUE
GT2 LWW1 2.996 3.310 3.625 TRUE
GT1 BP1 -0.405 -0.091 0.224 FALSE
GT1 BP2 0.187 0.502 0.816 TRUE
GT1 CC4 0.186 0.501 0.815 TRUE
GT1 CC3 0.190 0.505 0.819 TRUE
GT1 HBC 0.438 0.753 1.067 TRUE
GT1 LWW2 2.091 2.406 2.720 TRUE
GT1 LWW1 2.557 2.871 3.186 TRUE
BP1 BP2 0.278 0.593 0.907 TRUE
BP1 CC4 0.277 0.592 0.906 TRUE
BP1 CC3 0.281 0.596 0.910 TRUE
BP1 HBC 0.529 0.844 1.158 TRUE
BP1 LWW2 2.182 2.497 2.811 TRUE
BP1 LWW1 2.648 2.962 3.277 TRUE
BP2 CC4 -0.315 -0.001 0.314 FALSE
BP2 CC3 -0.311 0.003 0.318 FALSE
BP2 HBC -0.064 0.251 0.566 FALSE
BP2 LWW2 1.590 1.904 2.219 TRUE
BP2 LWW1 2.055 2.370 2.684 TRUE
CC4 CC3 -0.310 0.004 0.319 FALSE
CC4 HBC -0.063 0.252 0.566 FALSE
CC4 LWW2 1.591 1.905 2.219 TRUE
CC4 LWW1 2.056 2.371 2.685 TRUE
CC3 HBC -0.067 0.248 0.562 FALSE
CC3 LWW2 1.586 1.901 2.215 TRUE
CC3 LWW1 2.052 2.367 2.681 TRUE
HBC LWW2 1.339 1.653 1.968 TRUE
HBC LWW1 1.804 2.119 2.433 TRUE
LWW2 LWW1 0.151 0.466 0.780 TRUE

download these results as csv

https://music-ir.org/mirex/results/2009/tag/MajorMiner/small.affinity.AUC_ROC_TRACK.friedman.tukeyKramerHSD.png

Assorted Results Files for Download

General Results

affinity_tag_fold_AUC_ROC.csv
affinity_clip_AUC_ROC.csv
binary_per_fold_Accuracy.csv
binary_per_fold_Fmeasure.csv
binary_per_fold_negative_example_Accuracy.csv
binary_per_fold_per_track_Accuracy.csv
binary_per_fold_per_track_Fmeasure.csv
binary_per_fold_per_track_negative_example_Accuracy.csv
binary_per_fold_per_track_positive_example_Accuracy.csv
binary_per_fold_positive_example_Accuracy.csv

Friedman's Tests Results

affinity.PrecisionAt3.friedman.tukeyKramerHSD.csv
affinity.PrecisionAt3.friedman.tukeyKramerHSD.png
affinity.PrecisionAt6.friedman.tukeyKramerHSD.csv
affinity.PrecisionAt6.friedman.tukeyKramerHSD.png
affinity.PrecisionAt9.friedman.tukeyKramerHSD.csv
affinity.PrecisionAt9.friedman.tukeyKramerHSD.png
affinity.PrecisionAt12.friedman.tukeyKramerHSD.csv
affinity.PrecisionAt12.friedman.tukeyKramerHSD.png
affinity.PrecisionAt15.friedman.tukeyKramerHSD.csv
affinity.PrecisionAt15.friedman.tukeyKramerHSD.png
binary_Accuracy.friedman.tukeyKramerHSD.csv
binary_Accuracy.friedman.tukeyKramerHSD.png

Results By Algorithm

(.tgz format)

BP1 = Juan José Burred, Geoffroy Peeters
BP2 = Juan José Burred, Geoffroy Peeters
CC1 = Chuan Cao, Ming Li
CC2 = Chuan Cao, Ming Li
CC3 = Chuan Cao, Ming Li
CC4 = Chuan Cao, Ming Li
GP = Geoffroy Peeters
GT1 = George Tzanetakis
GT2 = George Tzanetakis
LWW1 = Hung-Yi Lo, Ju-Chiang Wang, Hsin-Min Wang
LWW2 = Hung-Yi Lo, Ju-Chiang Wang, Hsin-Min Wang
HBC = Matthew D. Hoffman, David M. Blei, Perry R. Cook