2008:Audio Tag Classification Results

From MIREX Wiki
Revision as of 08:31, 11 September 2008

Introduction

These are the results for the 2008 running of the Audio Tag Classification task. For background information about this task set please refer to the Audio Tag Classification page.

General Legend

Team ID

LB = L. Barrington, D. Turnbull, G. Lanckriet
BBE 1 = T. Bertin-Mahieux, Y. Bengio, D. Eck (KNN)
BBE 2 = T. Bertin-Mahieux, Y. Bengio, D. Eck (NNet)
BBE 3 = T. Bertin-Mahieux, D. Eck, P. Lamere, Y. Bengio (Thierry/Lamere Boosting)
TB = T. Bertin-Mahieux (dumb/smurf)
ME1 = M. I. Mandel, D. P. W. Ellis 1
ME2 = M. I. Mandel, D. P. W. Ellis 2
ME3 = M. I. Mandel, D. P. W. Ellis 3
GP1 = G. Peeters 1
GP2 = G. Peeters 2
TTKV = K. Trohidis, G. Tsoumakas, G. Kalliris, I. Vlahavas

Overall Summary Results

(Results table not rendered in this revision: tag.grand.summary.show.csv)

Summary Positive Example Accuracy (Average Across All Folds)

(Results table not rendered in this revision: tag.binary_avg_positive_example_Accuracy.csv)

Summary Negative Example Accuracy (Average Across All Folds)

(Results table not rendered in this revision: tag.binary_avg_negative_example_Accuracy.csv)

Summary Binary relevance F-Measure (Average Across All Folds)

(Results table not rendered in this revision: tag.binary_avg_Fmeasure.csv)

Summary Binary Accuracy (Average Across All Folds)

(Results table not rendered in this revision: tag.binary_avg_Accuracy.csv)

Summary AUC-ROC Tag (Average Across All Folds)

(Results table not rendered in this revision: tag.affinity_tag_AUC_ROC.csv)
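The tables above report five per-tag metrics. As a minimal illustrative sketch (not the official MIREX evaluation code), they can be computed per tag as below; the function names, the exact tie handling in the AUC, and the convention of treating positive example accuracy as recall on positive clips and negative example accuracy as accuracy on negative clips are assumptions for illustration:

```python
# Illustrative per-tag metric definitions. y_true and y_pred are 0/1 lists
# (one entry per clip) for a single tag; scores are real-valued affinities.

def positive_example_accuracy(y_true, y_pred):
    # Fraction of clips that truly carry the tag and were predicted to carry it.
    pos = [p for t, p in zip(y_true, y_pred) if t == 1]
    return sum(pos) / len(pos)

def negative_example_accuracy(y_true, y_pred):
    # Fraction of clips without the tag that were correctly left untagged.
    neg = [1 - p for t, p in zip(y_true, y_pred) if t == 0]
    return sum(neg) / len(neg)

def f_measure(y_true, y_pred):
    # Harmonic mean of precision and recall for the binary tag decisions.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

def auc_roc(y_true, scores):
    # Rank-based AUC: probability that a positive clip's affinity exceeds
    # a negative clip's, counting ties as half.
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

The "Average Across All Folds" figures in the tables would then be plain means of these per-tag values over the three cross-validation folds.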

Assorted Results Files for Download

AUC-ROC Clip Data

(Too large for easy Wiki viewing)
tag.affinity.clip.auc.roc.csv

CSV Files Without Rounding

tag.affinity.clip.auc.roc.csv
tag.binary.accuracy.csv
tag.binary.fmeasure.csv
tag.binary.negative.example.accuracy.csv
tag.binary.positive.example.accuracy.csv

Results By Algorithm

(.tar.gz)
LB = L. Barrington, D. Turnbull, G. Lanckriet
BBE 1 = T. Bertin-Mahieux, Y. Bengio, D. Eck (KNN)
BBE 2 = T. Bertin-Mahieux, Y. Bengio, D. Eck (NNet)
BBE 3 = T. Bertin-Mahieux, D. Eck, P. Lamere, Y. Bengio (Thierry/Lamere Boosting)
TB = T. Bertin-Mahieux (dumb/smurf)
ME1 = M. I. Mandel, D. P. W. Ellis 1
ME2 = M. I. Mandel, D. P. W. Ellis 2
ME3 = M. I. Mandel, D. P. W. Ellis 3
GP1 = G. Peeters 1
GP2 = G. Peeters 2
TTKV = K. Trohidis, G. Tsoumakas, G. Kalliris, I. Vlahavas