2010:Audio Music Similarity and Retrieval Results

Introduction

These are the results for the 2010 running of the Audio Music Similarity and Retrieval task set. For background information about this task set, please refer to the Audio Music Similarity and Retrieval page.

Each system was given 7000 songs chosen from IMIRSEL's "uspop", "uscrap", "american", "classical" and "sundry" collections. Each system then returned a 7000x7000 distance matrix. 100 songs were randomly selected from the 10 genre groups (10 per genre) as queries, and the 5 most highly ranked songs out of the 7000 were extracted for each query (after filtering out the query itself; results by the same artist as the query were also omitted). Then, for each query, the returned results (candidates) from all participants were grouped and evaluated by human graders using the Evalutron 6000 grading system. Each individual query/candidate set was evaluated by a single grader. For each query/candidate pair, graders provided two scores: one categorical BROAD score with three categories (NS, SS and VS, as explained below) and one FINE score in the range from 0 to 100. A description and analysis are provided below.
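
The candidate-selection step can be illustrated with a short sketch. This is a minimal reconstruction, not the actual MIREX harness; distMat, artist and queryIdx are hypothetical names for a system's 7000x7000 distance matrix, the per-song artist labels and a query index.

  % Minimal sketch of candidate extraction (hypothetical variable names).
  % distMat : 7000x7000 distance matrix returned by a system
  % artist  : 7000x1 cell array of artist labels for the collection
  % queryIdx: index of the query song
  [~, order] = sort(distMat(queryIdx, :), 'ascend'); % rank all songs by distance
  order = order(order ~= queryIdx);                  % filter out the query itself
  keep = ~strcmp(artist(order), artist{queryIdx});   % omit same-artist results
  candidates = order(keep);
  top5 = candidates(1:5);                            % the 5 candidates sent to graders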

The systems read in 30-second audio clips as their raw data. The same 30-second clips were used in the grading stage.


General Legend

Team ID

Sub code Submission name Abstract Contributors
BWL1 MTG-AMS PDF Dmitry Bogdanov, Joan Serrà, Nicolas Wack, Perfecto Herrera
PS1 PS09 PDF Tim Pohle, Dominik Schnitzer
PSS1 PSS10 PDF Tim Pohle, Klaus Seyerlehner, Dominik Schnitzer
RZ1 RND PDF Rainer Zufall
SSPK2 cbmr_sim PDF Klaus Seyerlehner, Markus Schedl, Tim Pohle, Peter Knees
TLN1 MarsyasSimilarity PDF George Tzanetakis, Steven Ness, Mathieu Lagrange
TLN2 Post-Processing 1 of Marsyas similarity results PDF George Tzanetakis, Mathieu Lagrange, Steven Ness
TLN3 Post-Processing 2 of Marsyas similarity results PDF George Tzanetakis, Mathieu Lagrange, Steven Ness

Broad Categories

NS = Not Similar
SS = Somewhat Similar
VS = Very Similar

Understanding Summary Measures

Fine = Ranges from 0 (failure) to 100 (perfection).
Broad = Ranges from 0 (failure) to 2 (perfection); each query/candidate pair is scored NS=0, SS=1 or VS=2.
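
As a concrete illustration of how the two summary measures are computed, the following sketch maps the BROAD categories to numbers and averages the scores for one system. The variable names (fine, broad) are hypothetical: fine holds one FINE score per query/candidate pair, broad the corresponding categories.

  % fine  : vector of FINE scores (0-100), one per query/candidate pair
  % broad : cell array of BROAD categories ('NS', 'SS' or 'VS')
  catScore = zeros(size(broad));
  catScore(strcmp(broad, 'SS')) = 1;  % NS=0, SS=1, VS=2
  catScore(strcmp(broad, 'VS')) = 2;
  avgFine = mean(fine);               % "Average Fine Score" in the table below
  avgCat  = mean(catScore);           % "Average Cat Score" in the table below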

Human Evaluation

Overall Summary Results

Measure BWL1 PS1 PSS1 RZ1 SSPK2 TLN1 TLN2 TLN3
Average Fine Score 49.704 55.080 54.984 16.668 56.642 45.842 46.544 46.604
Average Cat Score 1.078 1.228 1.212 0.240 1.248 0.940 0.970 0.968
Artist-Filtered Genre Neighbourhood Clustering
Top 5 0.532 0.590 0.619 0.083 0.591 0.466 0.480 0.481
Top 10 0.516 0.570 0.600 0.087 0.576 0.448 0.464 0.466
Top 20 0.499 0.547 0.579 0.087 0.558 0.429 0.445 0.447
Top 50 0.469 0.510 0.545 0.088 0.530 0.400 0.416 0.418
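
The Top-N rows above are, roughly, the average fraction of each track's N nearest neighbours that share its genre once the track itself and same-artist results are filtered out. A minimal sketch under that assumed definition (the exact report code may differ), reusing the hypothetical distMat and artist from the earlier sketch plus a genre label array:

  % Sketch of artist-filtered genre clustering at N (assumed definition).
  % genre, artist : 7000x1 cell arrays of labels; distMat as before
  N = 5;
  hits = zeros(7000, 1);
  for q = 1:7000
      [~, order] = sort(distMat(q, :), 'ascend');
      order = order(order ~= q);                           % drop the query
      order = order(~strcmp(artist(order), artist{q}));    % artist filter
      hits(q) = mean(strcmp(genre(order(1:N)), genre{q})); % same-genre fraction
  end
  clusterScore = mean(hits);  % compare against the Top 5 row above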

Note: RZ1 is a random baseline included for comparison purposes.

Friedman's Tests

Friedman's Test (FINE Scores)

The Friedman test was run in MATLAB against the Fine summary data over the 100 queries.
Command: [c,m,h,gnames] = multcompare(stats, 'ctype', 'tukey-kramer','estimate', 'friedman', 'alpha', 0.05);
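
For context, multcompare operates on the stats structure produced by MATLAB's friedman function. A plausible reconstruction of the full pipeline follows, assuming fineScores is a 100x8 matrix of per-query summary scores (one row per query, one column per system); the same call was applied to the BROAD summary data further below.

  % fineScores: 100x8 matrix of per-query summary scores (queries x systems)
  [p, tbl, stats] = friedman(fineScores, 1);  % Friedman test, 1 observation per cell
  [c, m, h, gnames] = multcompare(stats, 'ctype', 'tukey-kramer', ...
      'estimate', 'friedman', 'alpha', 0.05);
  % Each row of c is one pairwise comparison: [i j lower estimate upper];
  % an interval that excludes zero corresponds to TRUE in the table below.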

TeamID A TeamID B Lower bound Mean difference Upper bound Significance
SSPK2 PS1 -0.881 0.160 1.202 FALSE
SSPK2 PSS1 -0.791 0.250 1.292 FALSE
SSPK2 BWL1 -0.032 1.010 2.051 FALSE
SSPK2 TLN3 0.454 1.495 2.537 TRUE
SSPK2 TLN2 0.478 1.520 2.562 TRUE
SSPK2 TLN1 0.488 1.530 2.571 TRUE
SSPK2 RZ1 3.514 4.555 5.596 TRUE
PS1 PSS1 -0.952 0.090 1.131 FALSE
PS1 BWL1 -0.192 0.850 1.891 FALSE
PS1 TLN3 0.293 1.335 2.377 TRUE
PS1 TLN2 0.319 1.360 2.401 TRUE
PS1 TLN1 0.329 1.370 2.412 TRUE
PS1 RZ1 3.353 4.395 5.436 TRUE
PSS1 BWL1 -0.281 0.760 1.802 FALSE
PSS1 TLN3 0.203 1.245 2.287 TRUE
PSS1 TLN2 0.229 1.270 2.312 TRUE
PSS1 TLN1 0.238 1.280 2.321 TRUE
PSS1 RZ1 3.264 4.305 5.346 TRUE
BWL1 TLN3 -0.556 0.485 1.526 FALSE
BWL1 TLN2 -0.531 0.510 1.552 FALSE
BWL1 TLN1 -0.521 0.520 1.562 FALSE
BWL1 RZ1 2.503 3.545 4.587 TRUE
TLN3 TLN2 -1.016 0.025 1.067 FALSE
TLN3 TLN1 -1.006 0.035 1.077 FALSE
TLN3 RZ1 2.018 3.060 4.101 TRUE
TLN2 TLN1 -1.032 0.010 1.052 FALSE
TLN2 RZ1 1.994 3.035 4.077 TRUE
TLN1 RZ1 1.984 3.025 4.066 TRUE


(Figure: Tukey-Kramer HSD multiple comparison of FINE-score Friedman ranks; file 2010AMS.evalutron.fine.friedman.tukeyKramerHSD.png)

Friedman's Test (BROAD Scores)

The Friedman test was run in MATLAB against the BROAD summary data over the 100 queries.
Command: [c,m,h,gnames] = multcompare(stats, 'ctype', 'tukey-kramer','estimate', 'friedman', 'alpha', 0.05);

TeamID A TeamID B Lower bound Mean difference Upper bound Significance
SSPK2 PS1 -1.022 -0.060 0.902 FALSE
SSPK2 PSS1 -0.792 0.170 1.132 FALSE
SSPK2 BWL1 -0.137 0.825 1.787 FALSE
SSPK2 TLN2 0.368 1.330 2.292 TRUE
SSPK2 TLN3 0.438 1.400 2.362 TRUE
SSPK2 TLN1 0.568 1.530 2.492 TRUE
SSPK2 RZ1 3.243 4.205 5.167 TRUE
PS1 PSS1 -0.732 0.230 1.192 FALSE
PS1 BWL1 -0.077 0.885 1.847 FALSE
PS1 TLN2 0.428 1.390 2.352 TRUE
PS1 TLN3 0.498 1.460 2.422 TRUE
PS1 TLN1 0.628 1.590 2.552 TRUE
PS1 RZ1 3.303 4.265 5.227 TRUE
PSS1 BWL1 -0.307 0.655 1.617 FALSE
PSS1 TLN2 0.198 1.160 2.122 TRUE
PSS1 TLN3 0.268 1.230 2.192 TRUE
PSS1 TLN1 0.398 1.360 2.322 TRUE
PSS1 RZ1 3.073 4.035 4.997 TRUE
BWL1 TLN2 -0.457 0.505 1.467 FALSE
BWL1 TLN3 -0.387 0.575 1.537 FALSE
BWL1 TLN1 -0.257 0.705 1.667 FALSE
BWL1 RZ1 2.418 3.380 4.342 TRUE
TLN2 TLN3 -0.892 0.070 1.032 FALSE
TLN2 TLN1 -0.762 0.200 1.162 FALSE
TLN2 RZ1 1.913 2.875 3.837 TRUE
TLN3 TLN1 -0.832 0.130 1.092 FALSE
TLN3 RZ1 1.843 2.805 3.767 TRUE
TLN1 RZ1 1.713 2.675 3.637 TRUE


(Figure: Tukey-Kramer HSD multiple comparison of BROAD-score Friedman ranks; file 2010AMS.evalutron.cat.friedman.tukeyKramerHSD.png)


Summary Results by Query

FINE Scores

These are the mean FINE scores per query assigned by Evalutron graders. The FINE scores for the 5 candidates returned per algorithm, per query, have been averaged. Values are bounded between 0 and 100. A perfect score would be 100. Genre labels have been included for reference.

Genre Query BWL1 PS1 PSS1 RZ1 SSPK2 TLN1 TLN2 TLN3
BAROQUE d004083 24.0 81.0 76.0 12.4 79.0 61.0 68.0 68.0
BAROQUE d007329 42.0 32.0 35.0 18.6 38.6 40.6 37.6 37.6
BAROQUE d009136 100.0 97.0 94.0 23.8 96.8 98.2 96.2 98.2
BAROQUE d010069 67.0 77.2 79.0 23.2 56.8 44.4 49.8 44.4
BAROQUE d011911 48.6 62.4 55.6 9.6 48.4 33.4 50.4 35.4
BAROQUE d012251 63.0 66.0 66.0 13.6 35.0 68.0 51.0 51.0
BAROQUE d013041 35.0 38.0 30.0 14.0 27.0 39.0 39.0 39.0
BAROQUE d015413 66.0 78.0 85.0 25.0 84.0 65.0 68.0 68.0
BAROQUE d016516 24.8 12.2 12.8 5.4 12.4 33.8 28.6 28.6
BAROQUE d017625 54.2 63.0 71.2 9.4 65.2 47.2 32.0 32.0
BLUES e001418 28.0 33.0 27.0 6.4 53.0 17.0 22.0 22.0
BLUES e002753 44.4 47.6 35.4 31.6 46.2 53.8 53.0 53.0
BLUES e004984 86.4 90.0 88.2 20.4 86.4 84.0 86.0 86.0
BLUES e005445 58.2 63.2 59.2 34.4 94.2 39.6 52.0 52.0
BLUES e005819 70.4 79.6 75.6 13.8 79.0 72.4 75.2 75.2
BLUES e007623 17.2 27.0 22.0 6.2 46.0 17.0 20.0 20.0
BLUES e008581 35.2 32.8 32.6 24.6 58.4 27.4 24.0 31.6
BLUES e009749 26.2 62.4 39.0 12.2 51.2 54.8 40.0 40.0
BLUES e011570 69.6 71.4 51.4 4.6 61.0 62.2 55.6 55.6
BLUES e012625 100.0 99.2 100.0 22.8 100.0 94.6 94.6 94.6
CLASSICAL d001301 54.0 58.0 56.8 19.6 65.6 49.2 45.0 50.4
CLASSICAL d003863 75.2 85.4 78.6 7.0 84.0 70.6 51.6 62.0
CLASSICAL d004683 36.6 58.6 61.8 9.4 60.4 54.0 64.6 64.6
CLASSICAL d009122 44.4 54.2 62.2 17.6 69.0 42.4 32.4 32.4
CLASSICAL d009750 100.0 94.4 90.0 13.0 88.8 86.6 71.2 71.2
CLASSICAL d010693 60.4 55.4 61.4 14.6 47.0 32.4 25.8 25.8
CLASSICAL d012461 42.4 19.2 20.0 3.2 52.8 32.8 23.4 25.0
CLASSICAL d013654 67.0 79.2 73.8 10.6 77.0 53.6 58.2 58.2
CLASSICAL d016051 86.8 89.0 84.8 1.8 89.8 90.8 90.8 90.8
CLASSICAL d016303 56.6 52.8 48.4 15.8 48.4 38.4 42.4 42.4
COUNTRY e003228 43.2 66.0 72.0 34.0 61.2 39.0 55.0 52.0
COUNTRY e003847 38.2 54.8 44.8 12.0 34.2 27.8 27.6 27.6
COUNTRY e005737 43.2 32.8 33.4 22.6 60.0 37.6 41.0 37.2
COUNTRY e006980 68.6 63.0 74.4 33.8 54.8 55.2 57.4 57.4
COUNTRY e009948 67.4 78.0 84.0 2.4 71.0 25.0 25.0 25.0
COUNTRY e010366 50.0 64.8 64.8 13.2 77.2 47.4 61.4 60.4
COUNTRY e011607 41.8 68.8 60.4 12.2 36.0 42.6 46.2 44.8
COUNTRY e017889 52.8 59.2 71.6 13.8 69.2 21.6 21.6 21.6
COUNTRY e019097 30.0 33.0 39.0 8.6 31.0 43.0 43.0 43.0
COUNTRY e019474 28.0 55.0 39.0 6.4 39.0 45.0 50.0 50.0
EDANCE a000643 1.4 1.8 0.6 3.0 8.0 6.8 6.0 6.0
EDANCE a001494 15.8 32.8 23.0 6.6 50.2 7.4 32.6 32.6
EDANCE a003006 58.8 44.0 48.2 39.4 32.0 59.6 59.6 59.6
EDANCE a008244 59.0 50.0 56.8 7.4 66.4 62.8 64.2 64.2
EDANCE b008057 3.2 1.4 3.4 0.8 2.4 0.6 4.6 4.6
EDANCE b014261 49.2 52.4 53.6 28.0 57.8 30.2 35.4 36.0
EDANCE b019365 52.4 42.0 44.0 4.6 52.4 21.6 40.6 39.4
EDANCE f001490 52.6 63.6 81.8 24.6 71.0 41.0 39.2 39.2
EDANCE f004897 42.8 39.0 20.4 0.0 25.8 4.6 7.2 7.2
EDANCE f018057 82.8 84.0 82.6 35.2 74.8 83.0 77.0 77.0
JAZZ a000470 56.0 54.0 41.0 22.0 57.0 73.0 62.0 61.0
JAZZ e002021 80.6 91.8 92.2 45.8 94.6 92.0 92.0 91.0
JAZZ e003659 47.8 54.0 27.8 4.0 49.8 6.0 23.6 23.6
JAZZ e007349 32.0 43.0 51.0 9.0 47.0 16.0 19.0 19.0
JAZZ e009585 10.6 14.6 20.0 0.0 11.8 21.8 12.6 12.6
JAZZ e010110 59.0 76.0 60.0 0.0 75.0 17.0 17.0 20.0
JAZZ e014019 46.0 28.0 23.0 11.0 64.0 44.0 59.0 57.0
JAZZ e015948 75.6 69.6 74.6 18.0 66.0 66.0 67.8 67.8
JAZZ e019177 34.0 30.2 41.2 9.0 45.8 7.8 4.4 4.4
JAZZ e019705 74.6 67.8 64.2 31.4 72.6 37.0 48.6 48.6
METAL b004052 5.0 29.2 31.6 46.0 9.8 55.0 51.0 51.0
METAL b008274 42.6 58.6 66.8 33.4 52.6 54.6 60.6 60.6
METAL b008631 32.6 66.8 78.0 24.4 69.0 57.4 57.0 56.6
METAL b010864 31.2 40.2 32.8 6.4 72.2 34.6 46.0 46.0
METAL b011136 64.2 74.6 72.6 10.8 70.0 71.2 77.4 77.4
METAL b011316 26.0 30.8 20.6 11.0 22.8 26.2 26.2 13.8
METAL b012933 10.4 20.2 28.4 0.0 32.8 11.0 19.4 19.4
METAL b013451 23.6 13.8 30.2 10.6 10.0 39.8 51.0 51.0
METAL b013972 60.8 47.8 53.8 11.4 73.6 52.2 42.2 42.2
METAL b018129 60.8 77.6 79.6 40.6 81.8 76.2 76.6 76.6
RAPHIPHOP a004379 63.0 74.4 62.4 18.2 78.4 59.8 57.0 62.2
RAPHIPHOP a004607 44.4 40.2 41.6 17.0 30.6 50.8 53.2 52.8
RAPHIPHOP a005261 16.0 14.6 48.2 30.0 48.2 35.4 31.4 31.4
RAPHIPHOP a005341 71.0 72.6 69.2 22.4 66.6 66.6 58.8 65.4
RAPHIPHOP a005918 48.4 38.2 32.2 2.2 51.6 42.4 37.2 37.2
RAPHIPHOP a008296 81.0 77.8 81.4 33.6 87.6 69.4 70.8 70.8
RAPHIPHOP b001476 63.0 63.0 48.0 16.0 59.2 58.4 58.4 58.4
RAPHIPHOP b005622 76.0 78.0 83.0 16.4 75.0 56.0 56.0 56.0
RAPHIPHOP b009667 37.2 31.0 34.0 4.8 34.6 42.0 37.0 39.6
RAPHIPHOP b010090 67.6 71.0 67.4 40.8 68.2 67.8 70.4 70.4
ROCKROLL b003741 72.2 68.2 82.8 20.2 70.8 12.8 3.6 3.6
ROCKROLL b008148 78.4 83.0 83.4 18.6 85.8 81.6 81.6 81.6
ROCKROLL b008265 60.8 50.8 53.2 36.8 57.2 60.0 58.4 58.4
ROCKROLL b008704 48.0 62.0 67.4 26.4 62.4 64.2 56.0 56.0
ROCKROLL b009829 57.0 69.0 54.0 12.2 30.0 31.0 25.0 25.0
ROCKROLL b011257 55.2 63.2 61.0 26.4 58.6 51.0 51.0 51.0
ROCKROLL b015298 68.0 69.4 66.8 16.0 78.4 61.4 61.4 61.4
ROCKROLL b017838 50.2 53.0 50.0 38.4 63.6 53.4 48.8 49.2
ROCKROLL b019038 70.0 67.6 78.8 12.4 74.2 67.2 63.2 67.2
ROCKROLL e003526 18.0 59.0 68.0 11.0 46.0 22.0 26.0 26.0
ROMANTIC d000312 92.8 92.0 90.8 29.0 90.4 81.0 79.2 79.2
ROMANTIC d002366 58.6 56.0 69.2 11.8 51.8 61.0 50.6 56.6
ROMANTIC d004649 40.6 33.8 38.4 19.0 43.6 36.8 36.4 34.0
ROMANTIC d008538 50.0 78.0 89.0 18.0 76.0 34.0 45.0 45.0
ROMANTIC d009847 5.8 5.4 9.2 26.8 8.2 17.4 10.8 10.8
ROMANTIC d014039 6.2 12.8 8.4 5.0 18.4 25.8 24.8 25.8
ROMANTIC d014488 34.4 42.0 42.8 3.6 49.0 29.0 26.6 26.6
ROMANTIC d017256 36.0 70.0 80.4 7.8 64.8 19.4 43.6 43.6
ROMANTIC d017550 40.4 37.0 35.6 13.6 37.6 21.6 35.6 35.6
ROMANTIC d019896 22.0 40.8 41.8 14.4 43.4 40.2 40.2 40.2


BROAD Scores

These are the mean BROAD scores per query assigned by Evalutron graders. The BROAD scores for the 5 candidates returned per algorithm, per query, have been averaged. Values are bounded between 0 (not similar) and 2 (very similar). A perfect score would be 2. Genre labels have been included for reference.

Genre Query BWL1 PS1 PSS1 RZ1 SSPK2 TLN1 TLN2 TLN3
BAROQUE d004083 0.6 2.0 1.8 0.2 1.8 1.4 1.6 1.6
BAROQUE d007329 1.0 1.0 1.0 0.2 1.0 1.0 1.0 1.0
BAROQUE d009136 2.0 2.0 2.0 0.4 2.0 2.0 2.0 2.0
BAROQUE d010069 1.4 2.0 2.0 0.2 1.4 1.2 1.2 1.2
BAROQUE d011911 0.8 1.2 1.0 0.2 1.0 0.6 1.0 0.6
BAROQUE d012251 1.6 1.6 1.6 0.2 0.8 1.6 1.0 1.0
BAROQUE d013041 0.6 0.6 0.6 0.2 0.4 0.8 0.8 0.8
BAROQUE d015413 1.8 1.8 2.0 0.6 2.0 1.4 1.4 1.4
BAROQUE d016516 1.0 0.8 0.6 0.4 0.6 1.0 0.8 0.8
BAROQUE d017625 1.0 1.2 1.4 0.2 1.4 1.2 0.6 0.6
BLUES e001418 0.4 0.4 0.4 0.0 1.0 0.0 0.0 0.0
BLUES e002753 1.0 1.0 0.8 0.4 1.0 1.2 1.2 1.2
BLUES e004984 2.0 2.0 2.0 0.4 2.0 1.8 2.0 2.0
BLUES e005445 1.2 1.2 1.2 0.4 2.0 0.6 0.8 0.8
BLUES e005819 1.6 1.8 1.8 0.0 2.0 1.8 1.8 1.8
BLUES e007623 0.2 0.2 0.2 0.0 0.8 0.0 0.0 0.0
BLUES e008581 0.6 0.6 0.6 0.2 1.2 0.4 0.2 0.6
BLUES e009749 0.6 1.4 0.8 0.2 1.2 1.2 0.8 0.8
BLUES e011570 1.6 1.8 1.2 0.0 1.4 1.4 1.2 1.2
BLUES e012625 2.0 2.0 2.0 0.4 2.0 2.0 2.0 2.0
CLASSICAL d001301 1.0 1.4 1.4 0.4 1.4 1.0 1.0 1.0
CLASSICAL d003863 1.8 2.0 2.0 0.2 1.8 1.2 1.0 1.2
CLASSICAL d004683 1.0 1.4 1.2 0.2 1.2 1.0 1.4 1.4
CLASSICAL d009122 0.8 1.4 1.6 0.0 1.6 0.8 0.4 0.4
CLASSICAL d009750 2.0 2.0 2.0 0.2 2.0 2.0 1.4 1.4
CLASSICAL d010693 1.4 1.0 1.2 0.4 1.0 0.6 0.6 0.6
CLASSICAL d012461 1.0 0.2 0.2 0.0 1.2 0.6 0.4 0.4
CLASSICAL d013654 1.6 1.8 1.6 0.0 1.8 0.8 1.0 1.0
CLASSICAL d016051 2.0 2.0 2.0 0.0 2.0 2.0 2.0 2.0
CLASSICAL d016303 1.2 1.0 1.0 0.4 0.8 0.6 0.8 0.8
COUNTRY e003228 1.2 1.4 1.6 0.8 1.4 1.0 1.4 1.2
COUNTRY e003847 0.4 1.0 0.8 0.0 0.4 0.2 0.2 0.2
COUNTRY e005737 1.0 0.8 0.8 0.6 1.2 0.6 0.8 0.6
COUNTRY e006980 1.6 1.4 1.8 0.6 1.2 1.0 1.2 1.2
COUNTRY e009948 1.6 1.6 1.8 0.0 1.6 0.4 0.4 0.4
COUNTRY e010366 1.4 1.6 1.4 0.2 1.8 1.2 1.4 1.4
COUNTRY e011607 0.6 1.6 1.2 0.0 0.6 0.6 0.6 0.6
COUNTRY e017889 1.2 1.4 1.6 0.2 1.4 0.4 0.4 0.4
COUNTRY e019097 0.6 0.6 0.8 0.2 0.6 1.0 1.0 1.0
COUNTRY e019474 0.6 1.2 0.8 0.0 0.8 1.0 1.2 1.2
EDANCE a000643 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
EDANCE a001494 0.2 0.6 0.4 0.0 1.0 0.0 0.8 0.8
EDANCE a003006 1.2 0.6 1.0 0.8 0.4 1.2 1.2 1.2
EDANCE a008244 1.2 1.0 1.2 0.2 1.6 1.2 1.4 1.4
EDANCE b008057 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
EDANCE b014261 1.0 1.0 1.0 0.4 1.2 0.8 1.0 1.0
EDANCE b019365 1.2 0.8 1.0 0.0 1.0 0.0 0.8 0.8
EDANCE f001490 1.0 1.4 2.0 0.6 1.6 0.6 0.6 0.6
EDANCE f004897 0.8 1.0 0.4 0.0 0.4 0.0 0.0 0.0
EDANCE f018057 2.0 2.0 2.0 0.6 1.8 2.0 1.8 1.8
JAZZ a000470 1.2 1.4 0.8 0.2 1.4 2.0 1.4 1.4
JAZZ e002021 1.8 2.0 2.0 0.6 2.0 2.0 2.0 2.0
JAZZ e003659 1.2 1.2 0.4 0.0 1.0 0.0 0.4 0.4
JAZZ e007349 0.8 1.0 1.2 0.0 1.0 0.2 0.2 0.2
JAZZ e009585 0.0 0.2 0.2 0.0 0.0 0.2 0.0 0.0
JAZZ e010110 1.2 1.8 1.4 0.0 1.6 0.2 0.2 0.2
JAZZ e014019 1.0 0.4 0.2 0.0 1.6 1.0 1.6 1.4
JAZZ e015948 2.0 1.8 2.0 0.0 1.6 1.6 1.6 1.6
JAZZ e019177 0.6 0.8 1.0 0.2 1.0 0.0 0.0 0.0
JAZZ e019705 2.0 1.6 1.4 0.4 1.8 0.4 1.0 1.0
METAL b004052 0.0 0.6 0.8 1.2 0.0 1.2 1.2 1.2
METAL b008274 1.0 1.2 1.6 0.2 1.0 1.2 1.4 1.4
METAL b008631 0.6 1.6 2.0 0.0 1.4 1.2 1.2 1.2
METAL b010864 0.6 0.8 0.2 0.0 1.8 0.6 0.8 0.8
METAL b011136 1.4 2.0 1.8 0.0 1.8 1.8 2.0 2.0
METAL b011316 0.4 0.4 0.2 0.2 0.2 0.4 0.4 0.0
METAL b012933 0.6 0.6 0.8 0.0 1.0 0.6 0.8 0.8
METAL b013451 0.2 0.0 0.6 0.0 0.0 0.8 1.2 1.2
METAL b013972 1.2 1.0 1.0 0.0 1.4 1.2 1.0 1.0
METAL b018129 1.2 1.8 2.0 1.0 2.0 1.6 1.6 1.6
RAPHIPHOP a004379 1.4 1.6 1.0 0.2 1.8 1.2 1.2 1.4
RAPHIPHOP a004607 1.0 1.0 1.0 0.2 0.8 1.0 1.0 1.0
RAPHIPHOP a005261 0.0 0.0 1.2 0.4 1.0 0.4 0.4 0.4
RAPHIPHOP a005341 1.8 2.0 1.8 0.2 1.6 1.8 1.2 1.6
RAPHIPHOP a005918 1.4 1.2 0.8 0.0 1.6 1.2 1.2 1.2
RAPHIPHOP a008296 2.0 1.8 1.8 0.6 2.0 1.6 1.6 1.6
RAPHIPHOP b001476 1.2 1.4 0.8 0.2 1.4 1.2 1.2 1.2
RAPHIPHOP b005622 1.6 1.6 2.0 0.2 1.6 1.0 1.0 1.0
RAPHIPHOP b009667 1.0 0.8 1.0 0.0 1.0 1.0 1.0 1.0
RAPHIPHOP b010090 2.0 2.0 2.0 0.6 1.8 2.0 2.0 2.0
ROCKROLL b003741 1.8 1.6 2.0 0.4 1.6 0.0 0.0 0.0
ROCKROLL b008148 1.6 1.8 1.6 0.0 2.0 1.6 1.6 1.6
ROCKROLL b008265 1.4 1.0 1.0 0.8 1.0 1.2 1.2 1.2
ROCKROLL b008704 1.0 1.4 1.8 0.6 1.4 1.4 1.2 1.2
ROCKROLL b009829 1.2 1.6 1.4 0.2 0.8 0.6 0.4 0.4
ROCKROLL b011257 1.0 1.4 1.4 0.4 1.2 0.8 0.8 0.8
ROCKROLL b015298 1.2 1.6 1.0 0.2 1.8 1.2 1.2 1.2
ROCKROLL b017838 1.0 1.2 1.0 0.4 1.8 1.2 1.0 1.0
ROCKROLL b019038 1.4 1.6 1.8 0.2 1.4 1.4 1.4 1.4
ROCKROLL e003526 0.4 1.6 1.6 0.2 1.2 0.2 0.4 0.4
ROMANTIC d000312 1.6 2.0 2.0 0.6 2.0 1.8 1.8 1.8
ROMANTIC d002366 1.2 1.0 1.2 0.2 1.0 1.0 1.0 1.0
ROMANTIC d004649 1.0 1.0 1.0 0.4 1.0 1.0 1.0 1.0
ROMANTIC d008538 1.2 1.8 2.0 0.2 1.8 0.6 1.0 1.0
ROMANTIC d009847 0.0 0.0 0.0 0.4 0.0 0.2 0.0 0.0
ROMANTIC d014039 0.0 0.2 0.0 0.0 0.2 0.6 0.6 0.6
ROMANTIC d014488 1.0 1.0 1.4 0.0 1.6 0.6 0.4 0.4
ROMANTIC d017256 0.4 1.8 1.8 0.0 1.4 0.2 1.0 1.0
ROMANTIC d017550 1.2 1.0 0.8 0.4 1.0 0.6 0.8 0.8
ROMANTIC d019896 0.2 0.8 0.6 0.0 0.6 0.8 0.8 0.8


Raw Scores

The raw data derived from the Evalutron 6000 human evaluations are located on the 2010:Audio Music Similarity and Retrieval Raw Data page.

Metadata and Distance Space Evaluation

The following reports provide evaluation statistics based on an analysis of the distance space and of metadata matches. They include:

  • Neighbourhood clustering by artist, album and genre
  • Artist-filtered genre clustering
  • How often the triangular inequality holds
  • Statistics on 'hubs' (tracks that are similar to many tracks) and 'orphans' (tracks that are not similar to any other track at N results); the last two checks are sketched below.
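
As a rough illustration of the last two checks, here is a sketch under assumed definitions (not the code used to generate the reports), again using the hypothetical distMat:

  % 1) How often the triangular inequality d(i,k) <= d(i,j) + d(j,k) holds,
  %    estimated over random triples of tracks.
  n = size(distMat, 1);
  ijk = randi(n, 100000, 3);                        % 100000 sampled triples
  holds = distMat(sub2ind([n n], ijk(:,1), ijk(:,3))) <= ...
          distMat(sub2ind([n n], ijk(:,1), ijk(:,2))) + ...
          distMat(sub2ind([n n], ijk(:,2), ijk(:,3)));
  fracHolds = mean(holds);
  % 2) Hubs and orphans: count how often each track appears in the
  %    top-N lists of all other tracks.
  N = 5;
  counts = zeros(n, 1);
  for q = 1:n
      [~, order] = sort(distMat(q, :), 'ascend');
      order = order(order ~= q);
      counts(order(1:N)) = counts(order(1:N)) + 1;
  end
  numOrphans = sum(counts == 0);                    % never retrieved at N results
  % tracks with very large counts are the 'hubs'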

Reports

BWL1 = Dmitry Bogdanov, Nicolas Wack, Cyril Laurier
PS1 = Tim Pohle, Dominik Schnitzer
PSS1 = Tim Pohle, Klaus Seyerlehner, Dominik Schnitzer
RZ1 = Rainer Zufall
SSPK2 = Klaus Seyerlehner, Markus Schedl, Tim Pohle, Peter Knees
TLN1 = George Tzanetakis, Mathieu Lagrange, Steven Ness
TLN2 = George Tzanetakis, Mathieu Lagrange, Steven Ness
TLN3 = George Tzanetakis, Mathieu Lagrange, Steven Ness

Run Times

The run time data file (/nema-raid/www/mirex/results/2010/ams/audiosim.runtime.csv) was not found; run times are therefore not available for this revision.