2008:Audio Tag Classification Results
Revision as of 08:56, 11 September 2008
Contents
- 1 Introduction
- 2 Overall Summary Results
- 2.1 Summary Positive Example Accuracy (Average Across All Folds)
- 2.2 Summary Negative Example Accuracy (Average Across All Folds)
- 2.3 Summary Binary relevance F-Measure (Average Across All Folds)
- 2.4 Summary Binary Accuracy (Average Across All Folds)
- 2.5 Summary AUC-ROC Tag (Average Across All Folds)
- 3 Friedman test results
- 4 Assorted Results Files for Download
Introduction
These are the results for the 2008 running of the Audio Tag Classification task. For background information about this task, please refer to the Audio Tag Classification page.
General Legend
Team ID
LB = L. Barrington, D. Turnbull, G. Lanckriet
BBE 1 = T. Bertin-Mahieux, Y. Bengio, D. Eck (KNN)
BBE 2 = T. Bertin-Mahieux, Y. Bengio, D. Eck (NNet)
BBE 3 = T. Bertin-Mahieux, D. Eck, P. Lamere, Y. Bengio (Thierry/Lamere Boosting)
TB = T. Bertin-Mahieux (dumb/smurf)
ME1 = M. I. Mandel, D. P. W. Ellis 1
ME2 = M. I. Mandel, D. P. W. Ellis 2
ME3 = M. I. Mandel, D. P. W. Ellis 3
GP1 = G. Peeters 1
GP2 = G. Peeters 2
TTKV = K. Trohidis, G. Tsoumakas, G. Kalliris, I. Vlahavas
Overall Summary Results
[Table not rendered: /nema-raid/www/mirex/results/tag/tag.grand.summary.show.csv]
Summary Positive Example Accuracy (Average Across All Folds)
[Table not rendered: /nema-raid/www/mirex/results/tag/rounded/tag.binary_avg_positive_example_Accuracy.csv]
Summary Negative Example Accuracy (Average Across All Folds)
[Table not rendered: /nema-raid/www/mirex/results/tag/rounded/tag.binary_avg_negative_example_Accuracy.csv]
Summary Binary relevance F-Measure (Average Across All Folds)
[Table not rendered: /nema-raid/www/mirex/results/tag/rounded/tag.binary_avg_Fmeasure.csv]
Summary Binary Accuracy (Average Across All Folds)
[Table not rendered: /nema-raid/www/mirex/results/tag/rounded/tag.binary_avg_Accuracy.csv]
Summary AUC-ROC Tag (Average Across All Folds)
[Table not rendered: /nema-raid/www/mirex/results/tag/rounded/tag.affinity_tag_AUC_ROC.csv]
Friedman test results
AUC-ROC Tag Friedman test
The following table and plot show the results of Friedman's ANOVA with Tukey-Kramer multiple comparisons computed over the Area Under the ROC curve (AUC-ROC) for each tag in the test, averaged over all folds.
[Table not rendered: /nema-raid/www/mirex/results/tag/friedmansTables/tag.affinity.AUC_ROC_TAG.friedman.tukeyKramerHSD.csv]
[Figure: Affinity.AUC_ROC_TAG.friedman.tukeyKramerHSD.png]
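As a rough illustration of the procedure described above (not the actual MIREX evaluation code, which additionally applies Tukey-Kramer HSD follow-up comparisons), Friedman's test can be run on per-tag scores, one list per system. The team IDs below come from the legend above, but the AUC-ROC values are invented:

```python
from scipy.stats import friedmanchisquare

# Hypothetical per-tag AUC-ROC scores for three systems.
# Each list holds one system's score on the same five tags (invented values);
# the real evaluation uses many more tags and all submitted systems.
scores = {
    "LB":  [0.81, 0.76, 0.88, 0.79, 0.84],
    "ME1": [0.78, 0.74, 0.85, 0.77, 0.80],
    "TB":  [0.55, 0.52, 0.58, 0.50, 0.54],
}

# Friedman's test ranks the systems within each tag and asks whether
# the mean ranks differ by more than chance would allow.
stat, p = friedmanchisquare(*scores.values())
print(f"Friedman chi-square = {stat:.3f}, p = {p:.4f}")
```

A significant p-value indicates that at least two systems differ in mean rank; the Tukey-Kramer HSD step then identifies which pairs differ, as reported in the table.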
AUC-ROC Track Friedman test
The following table and plot show the results of Friedman's ANOVA with Tukey-Kramer multiple comparisons computed over the Area Under the ROC curve (AUC-ROC) for each track in the test. Each track appears exactly once over the three folds of the test. However, we are uncertain whether these measurements are truly independent, as multiple tracks from each artist are used.
[Table not rendered: /nema-raid/www/mirex/results/tag/friedmansTables/tag.affinity.AUC_ROC_TRACK.friedman.tukeyKramerHSD.csv]
[Figure: Affinity.AUC_ROC_TRACK.friedman.tukeyKramerHSD.png]
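The track-level metric itself can be sketched in plain Python: for one track, AUC-ROC is the probability that a tag truly applied to the track receives a higher affinity score than a tag that does not apply. The tags and scores below are invented for illustration:

```python
def auc_roc(labels, scores):
    """AUC-ROC as the probability that a positive example outranks a negative one,
    counting ties as half a win."""
    pos = [s for l, s in zip(labels, scores) if l]
    neg = [s for l, s in zip(labels, scores) if not l]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical affinities one system assigns to six tags for a single track,
# and which of those tags are actually applied to the track (invented data).
affinities = [0.9, 0.2, 0.7, 0.4, 0.1, 0.3]
true_tags  = [1,   0,   1,   0,   0,   1]
print(auc_roc(true_tags, affinities))  # 8 of 9 positive/negative pairs ordered correctly
```

Averaging this quantity over every track in a fold gives the per-track AUC-ROC that the Friedman test above compares across systems.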
Tag Classification Accuracy Friedman test
The following table and plot show the results of Friedman's ANOVA with Tukey-Kramer multiple comparisons computed over the classification accuracy for each tag in the test, averaged over all folds.
[Table not rendered: /nema-raid/www/mirex/results/tag/friedmansTables/tag.binary_Accuracy.friedman.tukeyKramerHSD.csv]
[Figure: Binary_Accuracy.friedman.tukeyKramerHSD.png]
Tag F-measure Friedman test
The following table and plot show the results of Friedman's ANOVA with Tukey-Kramer multiple comparisons computed over the F-measure for each tag in the test, averaged over all folds.
[Table not rendered: /nema-raid/www/mirex/results/tag/friedmansTables/tag.binary_FMeasure.friedman.tukeyKramerHSD.csv]
[Figure: Binary_FMeasure.friedman.tukeyKramerHSD.png]
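For reference, the binary-relevance F-measure summarized above is the harmonic mean of precision and recall over one tag's binary decisions. A minimal sketch, using invented labels rather than any real MIREX data:

```python
def f_measure(true_labels, predicted_labels):
    """Harmonic mean of precision and recall for one tag's binary decisions."""
    tp = sum(1 for t, p in zip(true_labels, predicted_labels) if t and p)
    fp = sum(1 for t, p in zip(true_labels, predicted_labels) if not t and p)
    fn = sum(1 for t, p in zip(true_labels, predicted_labels) if t and not p)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Invented ground truth and predictions for one tag over six clips.
truth = [1, 1, 0, 0, 1, 0]
preds = [1, 0, 0, 1, 1, 0]
print(f_measure(truth, preds))  # precision 2/3, recall 2/3 -> F = 2/3
```

The per-tag F-measures, averaged over folds, are the observations that feed the Friedman test in this subsection.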
Assorted Results Files for Download
AUC-ROC Clip Data
(Too large for easy Wiki viewing)
tag.affinity_clip_AUC_ROC.csv
CSV Files Without Rounding
tag.affinity.clip.auc.roc.csv
tag.binary.accuracy.csv
tag.binary.fmeasure.csv
tag.binary.negative.example.accuracy.csv
tag.binary.positive.example.accuracy.csv
Results By Algorithm
(.tar.gz)
LB = L. Barrington, D. Turnbull, G. Lanckriet
BBE 1 = T. Bertin-Mahieux, Y. Bengio, D. Eck (KNN)
BBE 2 = T. Bertin-Mahieux, Y. Bengio, D. Eck (NNet)
BBE 3 = T. Bertin-Mahieux, D. Eck, P. Lamere, Y. Bengio (Thierry/Lamere Boosting)
TB = T. Bertin-Mahieux (dumb/smurf)
ME1 = M. I. Mandel, D. P. W. Ellis 1
ME2 = M. I. Mandel, D. P. W. Ellis 2
ME3 = M. I. Mandel, D. P. W. Ellis 3
GP1 = G. Peeters 1
GP2 = G. Peeters 2
TTKV = K. Trohidis, G. Tsoumakas, G. Kalliris, I. Vlahavas