2008:Audio Tag Classification Results
Revision as of 08:48, 11 September 2008

==Introduction==

These are the results for the 2008 running of the Audio Tag Classification task. For background information about this task set, please refer to the Audio Tag Classification page.

==General Legend==

===Team ID===

LB = L. Barrington, D. Turnbull, G. Lanckriet
BBE 1 = T. Bertin-Mahieux, Y. Bengio, D. Eck (KNN)
BBE 2 = T. Bertin-Mahieux, Y. Bengio, D. Eck (NNet)
BBE 3 = T. Bertin-Mahieux, D. Eck, P. Lamere, Y. Bengio (Thierry/Lamere Boosting)
TB = T. Bertin-Mahieux (dumb/smurf)
ME1 = M. I. Mandel, D. P. W. Ellis 1
ME2 = M. I. Mandel, D. P. W. Ellis 2
ME3 = M. I. Mandel, D. P. W. Ellis 3
GP1 = G. Peeters 1
GP2 = G. Peeters 2
TTKV = K. Trohidis, G. Tsoumakas, G. Kalliris, I. Vlahavas

==Overall Summary Results==

file /nema-raid/www/mirex/results/tag/tag.grand.summary.show.csv not found

===Summary Positive Example Accuracy (Average Across All Folds)===

file /nema-raid/www/mirex/results/tag/rounded/tag.binary_avg_positive_example_Accuracy.csv not found

===Summary Negative Example Accuracy (Average Across All Folds)===

file /nema-raid/www/mirex/results/tag/rounded/tag.binary_avg_negative_example_Accuracy.csv not found

===Summary Binary Relevance F-Measure (Average Across All Folds)===

file /nema-raid/www/mirex/results/tag/rounded/tag.binary_avg_Fmeasure.csv not found

===Summary Binary Accuracy (Average Across All Folds)===

file /nema-raid/www/mirex/results/tag/rounded/tag.binary_avg_Accuracy.csv not found
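The binary summary metrics above (accuracy, positive example accuracy, negative example accuracy, and F-measure) are standard per-tag statistics over 0/1 predictions. A minimal sketch of how they can be computed is below; the clip labels and predictions are invented for illustration and are not taken from any MIREX 2008 submission or evaluator code.

```python
def binary_metrics(truth, pred):
    """Per-tag binary metrics from parallel 0/1 lists of
    ground-truth labels and system predictions."""
    tp = sum(1 for t, p in zip(truth, pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(truth, pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(truth, pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(truth, pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(truth)
    # "positive example accuracy" = recall on the positive examples
    pos_acc = tp / (tp + fn) if tp + fn else 0.0
    # "negative example accuracy" = recall on the negative examples
    neg_acc = tn / (tn + fp) if tn + fp else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    # F-measure: harmonic mean of precision and positive recall
    fmeasure = (2 * precision * pos_acc / (precision + pos_acc)
                if precision + pos_acc else 0.0)
    return accuracy, pos_acc, neg_acc, fmeasure

truth = [1, 1, 0, 0, 1, 0]   # made-up ground truth for one tag, six clips
pred  = [1, 0, 0, 1, 1, 0]   # made-up binary predictions from one system
acc, pos, neg, f = binary_metrics(truth, pred)
```

In the reported summaries these per-tag values are then averaged across tags and folds.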

===Summary AUC-ROC Tag (Average Across All Folds)===

file /nema-raid/www/mirex/results/tag/rounded/tag.affinity_tag_AUC_ROC.csv not found
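The AUC-ROC figures come from the affinity (score) submissions rather than the binary ones. A hedged sketch using the rank-sum (Mann-Whitney) formulation of AUC is below; the scores and labels are invented for illustration:

```python
def auc_roc(truth, scores):
    """AUC-ROC = probability that a randomly chosen positive example
    receives a higher affinity score than a randomly chosen negative,
    counting ties as half a win."""
    pos = [s for t, s in zip(truth, scores) if t == 1]
    neg = [s for t, s in zip(truth, scores) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

truth  = [1, 0, 1, 0, 0]              # made-up ground truth for one tag
scores = [0.9, 0.3, 0.6, 0.7, 0.1]    # made-up per-clip tag affinities
auc = auc_roc(truth, scores)
```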

==Friedman Test Results==

==Assorted Results Files for Download==

===AUC-ROC Clip Data===

(Too large for easy Wiki viewing)
tag.affinity_clip_AUC_ROC.csv

===CSV Files Without Rounding===

[https://www.music-ir.org/mirex/2008/results/tag/csv_raw/tag.affinity.tag.auc.roc.csv tag.affinity.clip.auc.roc.csv]<br />
[https://www.music-ir.org/mirex/2008/results/tag/csv_raw/tag.binary.accuracy.csv tag.binary.accuracy.csv]<br />
[https://www.music-ir.org/mirex/2008/results/tag/csv_raw/tag.binary.fmeasure.csv tag.binary.fmeasure.csv]<br />
[https://www.music-ir.org/mirex/2008/results/tag/csv_raw/tag.binary.negative.example.accuracy.csv tag.binary.negative.example.accuracy.csv]<br />
[https://www.music-ir.org/mirex/2008/results/tag/csv_raw/tag.binary.positive.example.accuracy.csv tag.binary.positive.example.accuracy.csv]<br />

===Results By Algorithm===

(.tar.gz)
LB = L. Barrington, D. Turnbull, G. Lanckriet
BBE 1 = T. Bertin-Mahieux, Y. Bengio, D. Eck (KNN)
BBE 2 = T. Bertin-Mahieux, Y. Bengio, D. Eck (NNet)
BBE 3 = T. Bertin-Mahieux, D. Eck, P. Lamere, Y. Bengio (Thierry/Lamere Boosting)
TB = T. Bertin-Mahieux (dumb/smurf)
ME1 = M. I. Mandel, D. P. W. Ellis 1
ME2 = M. I. Mandel, D. P. W. Ellis 2
ME3 = M. I. Mandel, D. P. W. Ellis 3
GP1 = G. Peeters 1
GP2 = G. Peeters 2
TTKV = K. Trohidis, G. Tsoumakas, G. Kalliris, I. Vlahavas