2008:Real-time Audio to Score Alignment (a.k.a. Score Following) Results

From MIREX Wiki
==Introduction==
 
These are the results for the 2008 running of the Real-time Audio to Score Alignment (a.k.a. Score Following) task. For background information about this task, please refer to the [[2008:Real-time Audio to Score Alignment (a.k.a Score Following)]] page.
  
 
===General Legend===
 
 
====Team ID====
 
  
'''MO1''' = [https://www.music-ir.org/mirex/abstracts/2008/XXX.pdf N. Montecchio & Orio 1]<br />
'''MO2''' = [https://www.music-ir.org/mirex/abstracts/2008/XXX.pdf N. Montecchio & Orio 2]<br />
'''RM1''' = [https://www.music-ir.org/mirex/abstracts/2008/Scofo.pdf R. Macrae]<br />
'''RM2''' = [https://www.music-ir.org/mirex/abstracts/2008/Scofo.pdf R. Macrae]<br />
  
 
[[Category: Results]]
 
 
  
 
===Summary Results===
 
<csv>2008/scofo/scofo_summary_results.csv</csv>
  
 
===Individual Results===
 
'''MO''' = [https://www.music-ir.org/mirex/results/2008/scofo/MOResults.zip N. Montecchio & Orio]<br />
'''RM''' = [https://www.music-ir.org/mirex/results/2008/scofo/RMResults.zip R. Macrae ]<br />
  
 
===Summary Results w.r.t. R. Macrae's Evaluation Script===
 
<csv>2008/scofo/scofo_summary_results_withRobsEvalScript.csv</csv>
  
 
===Individual Results w.r.t. R. Macrae's Evaluation Script===
 
'''MO''' = [https://www.music-ir.org/mirex/results/2008/scofo/MOresults_withRobsEvalScript.zip N. Montecchio & Orio]<br />
'''RM''' = [https://www.music-ir.org/mirex/results/2008/scofo/RMresults_withRobsEvalScript.zip R. Macrae ]<br />
  
  
 
The systems are evaluated against ground truth prepared by parsing the score files with each system's own MIDI parser (MO GT, RM GT).
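The summary results report "Piecewise Precision" for each submission. As a rough illustration only, a tolerance-based alignment precision of this general kind can be sketched as the fraction of reference note onsets whose reported alignment time falls within a fixed tolerance window; the function name, data layout, and 250 ms tolerance below are illustrative assumptions, not the official MIREX evaluation code.

```python
def piecewise_precision(ref_onsets, est_onsets, tol=0.25):
    """Fraction of notes whose estimated alignment time lies within
    `tol` seconds of the reference onset time. Assumes the two lists
    are the same length and ordered by note (an illustrative layout)."""
    assert len(ref_onsets) == len(est_onsets)
    hits = sum(1 for r, e in zip(ref_onsets, est_onsets) if abs(r - e) <= tol)
    return hits / len(ref_onsets)

# Hypothetical example: four notes, one aligned 400 ms off.
ref = [0.0, 0.5, 1.0, 1.5]       # ground-truth onset times (seconds)
est = [0.02, 0.48, 1.40, 1.52]   # times reported by the score follower
print(piecewise_precision(ref, est))  # 0.75: three of four within 250 ms
```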
 
''Latest revision as of 13:54, 7 June 2010''


{| class="wikitable"
|+ Summary Results
! !! MO1 !! MO2 !! RM1 !! RM2
|-
! Piecewise Precision (MO GT)
| 84.45% || 68.84% || 17.10% || 19.50%
|-
! Piecewise Precision (RM GT)
| 48.55% || 41.67% || 25.34% || 26.19%
|-
! Ave. Piecewise Precision
| 66.50% || 55.26% || 21.22% || 22.85%
|}


{| class="wikitable"
|+ Summary Results w.r.t. R. Macrae's Evaluation Script
! !! MO1 !! MO2 !! RM1 !! RM2
|-
! Piecewise Precision (MO GT)
| 81.59% || 65.63% || 24.27% || 17.03%
|-
! Piecewise Precision (RM GT)
| 25.93% || 25.36% || 44.77% || 28.80%
|-
! Ave. Piecewise Precision
| 53.76% || 45.50% || 34.52% || 22.92%
|}



=== Issues with ground-truth ===