2008:Audio Cover Song Identification Results

Revision as of 18:11, 12 September 2008

Still missing runtimes. (JSD, Sept. 11, 2008)

Introduction

These are the results for the 2008 running of the Audio Cover Song Identification task. For background information about this task set please refer to the Audio Cover Song Identification page.

Each system was given a collection of 1000 songs that included 30 different classes (sets) of cover songs, each class/set represented by 11 different versions of a particular song. Each of the 330 cover songs was used as a query, and the systems were required to return 10 results per query. Systems were evaluated on the number of songs from the same class/set as the query that were returned in each 10-result list. Average precision, which considers the entire per-query rank-ordered list of all songs in the collection, was the metric introduced the previous year.
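The evaluation scripts themselves are not reproduced on this page; as a sketch, per-query average precision over a full ranked list can be computed as below (function and variable names are illustrative, not MIREX's):

```python
def average_precision(ranked_ids, relevant_ids):
    """Average precision for one query: the mean of precision@k taken at
    each rank k where a relevant item (here, a true cover) appears."""
    relevant = set(relevant_ids)
    hits = 0
    precision_sum = 0.0
    for k, item in enumerate(ranked_ids, start=1):
        if item in relevant:
            hits += 1
            precision_sum += hits / k
    return precision_sum / len(relevant) if relevant else 0.0

# Toy example: 3 true covers retrieved at ranks 1, 3, and 5.
# (In the actual task each query song has 10 true covers in the collection.)
ranking = ["c1", "x", "c2", "y", "c3", "z"]
covers = {"c1", "c2", "c3"}
print(average_precision(ranking, covers))
```

Here the score is (1/1 + 2/3 + 3/5) / 3, i.e. each retrieved cover contributes the precision at its own rank.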


General Legend

Team ID

CL1 = C. Cao, M. Li
CL2 = C. Cao, M. Li
EL1 = A. Egorov, G. Linetsky
EL2 = A. Egorov, G. Linetsky
EL3 = A. Egorov, G. Linetsky
JCJ = J. H. Jensen, M. G. Christensen, S. H. Jensen
SGH1 = J. Serrà, E. Gómez, P. Herrera
SGH2 = J. Serrà, E. Gómez, P. Herrera

Overall Summary Results

[Table not available: file /nema-raid/www/mirex/results//cover/grand.summary.v2.csv not found]


Number of Correct Covers at Rank X Returned in Top Ten

[Table not available: file /nema-raid/www/mirex/results/cover/cover.toptendist.transposed.csv not found]

Run Times

[Table not available: file /nema-raid/www/mirex/results/cover/coversong_runtimes.csv not found]

CL1 and CL2 ran on the FAST2 and FAST3 machines; all other systems ran on the ALE nodes.

Friedman's Test for Significant Differences

The Friedman test was run in MATLAB against the Average Precision summary data over the 30 song groups.
Command: [c,m,h,gnames] = multcompare(stats, 'ctype', 'tukey-kramer','estimate', 'friedman', 'alpha', 0.05);
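For readers without MATLAB, the Friedman stage can be approximated with SciPy (the Tukey-Kramer multiple comparison performed by multcompare would need a separate routine). The sketch below uses randomly generated stand-in scores purely for illustration; the real per-group average precisions are in cover.mapquerygroup.v2.csv:

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Hypothetical stand-in data: rows = 30 query groups, columns = 8 systems.
# These are NOT the real MIREX scores.
rng = np.random.default_rng(2008)
scores = rng.uniform(0.0, 1.0, size=(30, 8))

# friedmanchisquare takes one sample per system (one per column),
# with the 30 query groups acting as the blocks.
stat, p = friedmanchisquare(*scores.T)
print(f"Friedman chi-square = {stat:.3f}, p = {p:.4f}")

# Mirrors the alpha = 0.05 threshold in the MATLAB call above
if p < 0.05:
    print("At least one system's median AP differs significantly.")
```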

[Table not available: file /nema-raid/www/mirex/results/cover/coversong.friedman.anova.csv not found]

[Table not available: file /nema-raid/www/mirex/results/cover/coversong.friedman.csv not found]

[Figure not available: Coversong.friedman.png]

Average Performance per Query Group

These are the arithmetic means of the average precisions within each of the 30 query groups.

[Table not available: file /nema-raid/www/mirex/results/cover/cover.mapquerygroup.v2.csv not found]
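These per-group means are simple to reproduce from the per-query scores; a minimal sketch, with made-up group labels and values standing in for the real data:

```python
from collections import defaultdict

# Hypothetical per-query average-precision scores tagged with their
# query group (the real values live in the per-query results files below).
per_query_ap = [
    ("group01", 0.70),
    ("group01", 0.50),
    ("group02", 0.20),
]

by_group = defaultdict(list)
for group, ap in per_query_ap:
    by_group[group].append(ap)

# Arithmetic mean of the APs within each group
group_means = {g: sum(aps) / len(aps) for g, aps in by_group.items()}
print(group_means)
```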

Individual Results Files

Average Precision Scores for Each Query

CL1 = C. Cao, M. Li
CL2 = C. Cao, M. Li
EL1 = A. Egorov, G. Linetsky
EL2 = A. Egorov, G. Linetsky
EL3 = A. Egorov, G. Linetsky
JCJ = J. H. Jensen, M. G. Christensen, S. H. Jensen
SGH1 = J. Serrà, E. Gómez, P. Herrera
SGH2 = J. Serrà, E. Gómez, P. Herrera

Ranks of the Ten Cover Songs Returned for Each Query

CL1 = C. Cao, M. Li
CL2 = C. Cao, M. Li
EL1 = A. Egorov, G. Linetsky
EL2 = A. Egorov, G. Linetsky
EL3 = A. Egorov, G. Linetsky
JCJ = J. H. Jensen, M. G. Christensen, S. H. Jensen
SGH1 = J. Serrà, E. Gómez, P. Herrera
SGH2 = J. Serrà, E. Gómez, P. Herrera

Runtimes

Where algorithms have been multi-threaded, the longest runtime is reported.

Where runtimes were not properly reported, file timestamps have been used to approximate the runtime.