2013:Discovery of Repeated Themes & Sections Results

Introduction

The task: algorithms take a piece of music as input, and output a list of patterns repeated within that piece. A pattern is defined as a set of ontime-pitch pairs that occurs at least twice (i.e., is repeated at least once) in a piece of music. The second, third, etc. occurrences of the pattern will likely be shifted in time and/or transposed relative to the first occurrence. Ideally an algorithm will be able to discover all exact and inexact occurrences of a pattern within a piece, so in evaluating this task we are interested in both:

  • (1) to what extent an algorithm can discover one occurrence, up to time shift and transposition; and
  • (2) to what extent it can find all occurrences.

The metrics establishment recall, establishment precision and establishment F1 address (1), and the metrics occurrence recall, occurrence precision, and occurrence F1 address (2).
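
Concretely, an exact occurrence is a translate of the pattern in time and/or pitch. The following minimal sketch (Python, with hypothetical data and function names) checks whether one set of ontime-pitch pairs is an exact occurrence of another; inexact occurrences would additionally allow some of the note content to differ.

def is_exact_occurrence(pattern, candidate):
    """Return True if `candidate` equals `pattern` shifted by a constant
    (time, pitch) offset, i.e. it is an exact occurrence of `pattern`."""
    if len(pattern) != len(candidate):
        return False
    p, c = sorted(pattern), sorted(candidate)
    dt = c[0][0] - p[0][0]  # common time shift
    dp = c[0][1] - p[0][1]  # common transposition
    return all((t + dt, k + dp) == q for (t, k), q in zip(p, c))

# A two-note motif, and the same motif four crotchets later, a minor third higher:
motif = {(0.0, 60), (1.0, 64)}
later = {(4.0, 63), (5.0, 67)}
print(is_exact_occurrence(motif, later))  # True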

Contribution

Existing approaches to music structure analysis in MIR tend to focus on segmentation (e.g., Weiss & Bello, 2010). The contribution of this task is to afford access to the note content itself (please see the example in Fig. 1A), requiring algorithms to do more than label time windows (e.g., the segmentations in Figs. 1B-D). For instance, a discovery algorithm applied to the piece in Fig. 1A should return patterns corresponding to the note content of the repeated sections marked there, as well as a pattern corresponding to the note content of the marked theme, because that theme occurs again independently of the accompaniment in bars 19-22 (not shown here). The ground truth also contains nested patterns, such as the theme in Fig. 1A being a subset of the sectional repetition, reflecting the often-hierarchical nature of musical repetition. While we recognise the appealing simplicity of linear segmentation, in the Discovery of Repeated Themes & Sections task we are demanding analysis at a greater level of detail, and have built a ground truth that contains overlapping and nested patterns.


MozartK282Mvt2.png

Figure 1. Pattern discovery v segmentation. (A) Bars 1-12 of Mozart’s Piano Sonata in E-flat major K282 mvt.2, showing some ground-truth themes and repeated sections; (B-D) Three linear segmentations. Numbers below the staff in Fig. 1A and below the segmentation in Fig. 1D indicate crotchet beats, from zero for bar 1 beat 1.


For a more detailed introduction to the task, please see 2013:Discovery_of_Repeated_Themes_&_Sections.

Ground Truth and Algorithms

The ground truth, called the Johannes Kepler University Patterns Test Database (JKUPTD-Aug2013), is based on motifs and themes in Barlow and Morgenstern (1953), Schoenberg (1967), and Bruhn (1993). Repeated sections are based on those marked by the composer. These annotations are supplemented with some of our own where necessary. A Development Database (JKUPDD-Aug2013) released in March enabled participants to try out their algorithms. For each piece in the Development and Test Databases, symbolic and synthesised audio versions are crossed with monophonic and polyphonic versions, giving four versions of the task in total: symPoly, symMono, audPoly, and audMono. Algorithms submitted to the task are shown in Table 1.


General Legend

Sub code Submission name Abstract Contributors
Task Version symPoly
NF2 motives_poly PDF Oriol Nieto, Morwaread Farbood
DM10 SIATECSegment PDF David Meredith
DM9 SIATECCompressRaw PDF David Meredith
DM8 SIATECCompressBB PDF David Meredith
DM7 COSIATECSegment PDF David Meredith
DM6 COSIATECRaw PDF David Meredith
DM5 COSIATECBB PDF David Meredith
Task Version symMono
NF1 motives_mono PDF Oriol Nieto, Morwaread Farbood
DM10 SIATECSegment PDF David Meredith
DM9 SIATECCompressRaw PDF David Meredith
DM8 SIATECCompressBB PDF David Meredith
DM7 COSIATECSegment PDF David Meredith
DM6 COSIATECRaw PDF David Meredith
DM5 COSIATECBB PDF David Meredith


Code Researcher(s) Algorithm
Task Version: symPoly
NF2 Nieto and Farbood (2013) motives_poly
DM10 Meredith (2013) SIATECCompressSegment
DM9 Meredith (2013) SIATECCompressRaw
DM8 Meredith (2013) SIATECCompressBB
DM7 Meredith (2013) COSIATECSegment
DM6 Meredith (2013) COSIATECRaw
DM5 Meredith (2013) COSIATECBB
Task Version: symMono
NF1 Nieto and Farbood (2013) motives_mono
DM10 Meredith (2013) SIATECSegment
DM9 Meredith (2013) SIATECCompressRaw
DM8 Meredith (2013) SIATECCompressBB
DM7 Meredith (2013) COSIATECSegment
DM6 Meredith (2013) COSIATECRaw
DM5 Meredith (2013) COSIATECBB
Task Version: audPoly
NF4 Nieto and Farbood (2013) motives_audio_poly
Task Version: audMono
NF3 Nieto and Farbood (2013) motives_audio_mono

Table 1. Algorithms submitted to the Discovery of Repeated Themes & Sections (DRTS) task.

Results

For mathematical definitions of the various metrics, please see 2013:Discovery_of_Repeated_Themes_&_Sections#Evaluation_Procedure.
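
As a rough illustration only (the page linked above gives the official definitions), establishment recall can be thought of as scoring each ground-truth pattern by its best match among the algorithm-output patterns and then averaging. The sketch below assumes patterns are given as sets of (ontime, pitch) pairs and uses the simple cardinality score |P ∩ Q| / max(|P|, |Q|); it ignores the translation-invariant matching and occurrence-set handling of the full metric, and all names and data are hypothetical.

def cardinality_score(p, q):
    """Set-overlap similarity of two patterns given as sets of
    (ontime, pitch) pairs: |P intersect Q| / max(|P|, |Q|)."""
    return len(p & q) / max(len(p), len(q))

def establishment_recall(ground_truth, output):
    """Average, over ground-truth patterns, of the best cardinality score
    achieved by any algorithm-output pattern (simplified sketch)."""
    return sum(max(cardinality_score(gt, out) for out in output)
               for gt in ground_truth) / len(ground_truth)

# Hypothetical toy data: two ground-truth patterns, one output pattern.
gt = [{(0, 60), (1, 62), (2, 64)}, {(8, 67), (9, 65)}]
out = [{(0, 60), (1, 62), (2, 64), (3, 65)}]
print(round(establishment_recall(gt, out), 3))  # 0.375 = (0.75 + 0.0) / 2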

In Brief

To avoid a bias toward the more numerous submissions of Meredith (2013), DM10 was preselected for comparison with Nieto and Farbood's (2013) submissions, based on reported performance for the Development Database. Figure 2 shows establishment recall results on a per-pattern basis for the symbolic-polyphonic version of the task. DM10 outperforms NF2 according to Friedman's test, suggesting the former is preferable for discovering at least one occurrence of each ground-truth pattern. This result addresses point (1) from the introduction.

Figure 3 shows occurrence recall results on a per-pattern basis for the symbolic-polyphonic version of the task. Again, DM10 outperforms NF2 according to Friedman's test, suggesting the former is preferable for retrieving all occurrences of a discovered ground-truth pattern. This result addresses point (2) from the introduction.

It should be noted, however, that according to Friedman's test, algorithm NF2 is significantly faster than DM10, suggesting the former is preferable for rapid summarisation.

The results are closer for the symbolic-monophonic version of the task. According to Friedman's test (ns), NF1 and DM10 show no significant difference for establishment recall on a per-pattern basis (please see Figure 14). This result relates to point (1) from the introduction. Figure 15 shows occurrence recall results on a per-pattern basis for the symbolic-monophonic version of the task. DM10 outperforms NF1 according to Friedman's test, suggesting the former is preferable for retrieving all occurrences of a discovered ground-truth pattern. This result relates to point (2) from the introduction.
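
For readers who want to reproduce this style of significance testing, a minimal sketch follows, using scipy's Friedman test over per-pattern scores (one list per algorithm, aligned pattern-by-pattern). The values and variable names are hypothetical rather than the figures reported above, and scipy's implementation requires at least three related samples, so three submissions are included in the sketch.

from scipy.stats import friedmanchisquare

# Per-pattern establishment recall for three submissions on the same set of
# five ground-truth patterns (hypothetical values, aligned pattern-by-pattern).
dm10_scores = [0.94, 0.62, 0.29, 0.22, 0.61]
nf2_scores = [0.27, 0.42, 0.11, 0.07, 0.24]
dm7_scores = [0.94, 0.58, 0.27, 0.20, 0.61]

stat, p = friedmanchisquare(dm10_scores, nf2_scores, dm7_scores)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")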

For the audio versions of the task, algorithms NF3 (audMono) and NF4 (audPoly) were the only submissions. Nieto and Farbood (2013) are credited with being the first to define an algorithm for discovering repeated note content in a synthesised audio file that is time-locked to a symbolic representation. We include the results for these versions of the task, with a view to comparison in future years.

symPoly

01symPolyEstRecPerPatt.png

Figure 2. Establishment recall on a per-pattern basis. Establishment recall answers the following question. On average, how similar is the most similar algorithm-output pattern to a ground-truth pattern prototype?


04symPolyOccRecPerPatt.png

Figure 3. Occurrence recall on a per-pattern basis. Occurrence recall answers the following question. On average, how similar is the most similar set of algorithm-output pattern occurrences to a discovered ground-truth occurrence set?


01symPolyEstRec.png

Figure 4. Establishment recall averaged over each piece/movement. Establishment recall answers the following question. On average, how similar is the most similar algorithm-output pattern to a ground-truth pattern prototype?


02symPolyEstPrec.png

Figure 5. Establishment precision averaged over each piece/movement. Establishment precision answers the following question. On average, how similar is the most similar ground-truth pattern prototype to an algorithm-output pattern?


03symPolyEstF1.png

Figure 6. Establishment F1 averaged over each piece/movement. Establishment F1 is an average of establishment precision and establishment recall.


04symPolyOccRecP75.png

Figure 7. Occurrence recall (c = .75) averaged over each piece/movement. Occurrence recall answers the following question. On average, how similar is the most similar set of algorithm-output pattern occurrences to a discovered ground-truth occurrence set?


05symPolyOccPrecP75.png

Figure 8. Occurrence precision (c = .75) averaged over each piece/movement. Occurrence precision answers the following question. On average, how similar is the most similar discovered ground-truth occurrence set to a set of algorithm-output pattern occurrences?


06symPolyOccF1P75.png

Figure 9. Occurrence F1 (c = .75) averaged over each piece/movement. Occurrence F1 is an average of occurrence precision and occurrence recall.


07symPolyR3.png

Figure 10. Three-layer recall averaged over each piece/movement. Rather than the similarity measure used by default for establishment recall, three-layer recall uses a similarity measure that is itself a kind of F1 measure.


08symPolyP3.png

Figure 11. Three-layer precision averaged over each piece/movement. Rather than the similarity measure used by default for establishment precision, three-layer precision uses a similarity measure that is itself a kind of F1 measure.


09symPolyTLF.png

Figure 12. Three-layer F1 (TLF) averaged over each piece/movement. TLF is an average of three-layer precision and three-layer recall.


10symPolyRuntime.png

Figure 13. Log runtime of the algorithm for each piece/movement.

symMono

(Poor performance here for algorithm NF1 on piece 2 is likely due to rounding errors in the discovery phase rather than anything musically interesting. The task captain tried in vain to identify a workaround.)

11symMonoEstRecPerPatt.png

Figure 14. Establishment recall on a per-pattern basis. Establishment recall answers the following question. On average, how similar is the most similar algorithm-output pattern to a ground-truth pattern prototype?


14symMonoOccRecPerPatt.png

Figure 15. Occurrence recall on a per-pattern basis. Occurrence recall answers the following question. On average, how similar is the most similar set of algorithm-output pattern occurrences to a discovered ground-truth occurrence set?


11symMonoEstRec.png

Figure 16. Establishment recall averaged over each piece/movement. Establishment recall answers the following question. On average, how similar is the most similar algorithm-output pattern to a ground-truth pattern prototype?


12symMonoEstPrec.png

Figure 17. Establishment precision averaged over each piece/movement. Establishment precision answers the following question. On average, how similar is the most similar ground-truth pattern prototype to an algorithm-output pattern?


13symMonoEstF1.png

Figure 18. Establishment F1 averaged over each piece/movement. Establishment F1 is an average of establishment precision and establishment recall.


14symMonoOccRecP75.png

Figure 19. Occurrence recall (c = .75) averaged over each piece/movement. Occurrence recall answers the following question. On average, how similar is the most similar set of algorithm-output pattern occurrences to a discovered ground-truth occurrence set?


15symMonoOccPrecP75.png

Figure 20. Occurrence precision (c = .75) averaged over each piece/movement. Occurrence precision answers the following question. On average, how similar is the most similar discovered ground-truth occurrence set to a set of algorithm-output pattern occurrences?


16symMonoOccF1P75.png

Figure 21. Occurrence F1 (c = .75) averaged over each piece/movement. Occurrence F1 is an average of occurrence precision and occurrence recall.


17symMonoR3.png

Figure 22. Three-layer recall averaged over each piece/movement. Rather than the similarity measure used by default for establishment recall, three-layer recall uses a similarity measure that is itself a kind of F1 measure.


18symMonoP3.png

Figure 23. Three-layer precision averaged over each piece/movement. Rather than the similarity measure used by default for establishment precision, three-layer precision uses a similarity measure that is itself a kind of F1 measure.


19symMonoTLF.png

Figure 24. Three-layer F1 (TLF) averaged over each piece/movement. TLF is an average of three-layer precision and three-layer recall.


20symMonoRuntime.png

Figure 25. Log runtime of the algorithm for each piece/movement.

audPoly

21audPolyEstRecPerPatt.png

Figure 26. Establishment recall on a per-pattern basis. Establishment recall answers the following question. On average, how similar is the most similar algorithm-output pattern to a ground-truth pattern prototype?


24audPolyOccRecPerPatt.png

Figure 27. Occurrence recall on a per-pattern basis. Occurrence recall answers the following question. On average, how similar is the most similar set of algorithm-output pattern occurrences to a discovered ground-truth occurrence set?


21audPolyEstRec.png

Figure 28. Establishment recall averaged over each piece/movement. Establishment recall answers the following question. On average, how similar is the most similar algorithm-output pattern to a ground-truth pattern prototype?


22audPolyEstPrec.png

Figure 29. Establishment precision averaged over each piece/movement. Establishment precision answers the following question. On average, how similar is the most similar ground-truth pattern prototype to an algorithm-output pattern?


23audPolyEstF1.png

Figure 30. Establishment F1 averaged over each piece/movement. Establishment F1 is an average of establishment precision and establishment recall.


24audPolyOccRecP75.png

Figure 31. Occurrence recall (c = .75) averaged over each piece/movement. Occurrence recall answers the following question. On average, how similar is the most similar set of algorithm-output pattern occurrences to a discovered ground-truth occurrence set?


25audPolyOccPrecP75.png

Figure 32. Occurrence precision (c = .75) averaged over each piece/movement. Occurrence precision answers the following question. On average, how similar is the most similar discovered ground-truth occurrence set to a set of algorithm-output pattern occurrences?


26audPolyOccF1P75.png

Figure 33. Occurrence F1 (c = .75) averaged over each piece/movement. Occurrence F1 is an average of occurrence precision and occurrence recall.


27audPolyR3.png

Figure 34. Three-layer recall averaged over each piece/movement. Rather than the similarity measure used by default for establishment recall, three-layer recall uses a similarity measure that is itself a kind of F1 measure.


28audPolyP3.png

Figure 35. Three-layer precision averaged over each piece/movement. Rather than the similarity measure used by default for establishment precision, three-layer precision uses a similarity measure that is itself a kind of F1 measure.


29audPolyTLF.png

Figure 36. Three-layer F1 (TLF) averaged over each piece/movement. TLF is an average of three-layer precision and three-layer recall.


30audPolyRuntime.png

Figure 37. Log runtime of the algorithm for each piece/movement.

audMono

31audMonoEstRecPerPatt.png

Figure 38. Establishment recall on a per-pattern basis. Establishment recall answers the following question. On average, how similar is the most similar algorithm-output pattern to a ground-truth pattern prototype?


34audMonoOccRecPerPatt.png

Figure 39. Occurrence recall on a per-pattern basis. Occurrence recall answers the following question. On average, how similar is the most similar set of algorithm-output pattern occurrences to a discovered ground-truth occurrence set?


31audMonoEstRec.png

Figure 40. Establishment recall averaged over each piece/movement. Establishment recall answers the following question. On average, how similar is the most similar algorithm-output pattern to a ground-truth pattern prototype?


32audMonoEstPrec.png

Figure 41. Establishment precision averaged over each piece/movement. Establishment precision answers the following question. On average, how similar is the most similar ground-truth pattern prototype to an algorithm-output pattern?


33audMonoEstF1.png

Figure 42. Establishment F1 averaged over each piece/movement. Establishment F1 is an average of establishment precision and establishment recall.


34audMonoOccRecP75.png

Figure 43. Occurrence recall (c = .75) averaged over each piece/movement. Occurrence recall answers the following question. On average, how similar is the most similar set of algorithm-output pattern occurrences to a discovered ground-truth occurrence set?


35audMonoOccPrecP75.png

Figure 44. Occurrence precision (c = .75) averaged over each piece/movement. Occurrence precision answers the following question. On average, how similar is the most similar discovered ground-truth occurrence set to a set of algorithm-output pattern occurrences?


36audMonoOccF1P75.png

Figure 45. Occurrence F1 (c = .75) averaged over each piece/movement. Occurrence F1 is an average of occurrence precision and occurrence recall.


37audMonoR3.png

Figure 46. Three-layer recall averaged over each piece/movement. Rather than the similarity measure used by default for establishment recall, three-layer recall uses a similarity measure that is itself a kind of F1 measure.


38audMonoP3.png

Figure 47. Three-layer precision averaged over each piece/movement. Rather than the similarity measure used by default for establishment precision, three-layer precision uses a similarity measure that is itself a kind of F1 measure.


39audMonoTLF.png

Figure 48. Three-layer F1 (TLF) averaged over each piece/movement. TLF is an average of three-layer precision and three-layer recall.


40audMonoRuntime.png

Figure 49. Log runtime of the algorithm for each piece/movement.

Discussion

If an occurrence of a ground-truth pattern contains forty or more notes then, according to Fig. 2, it is likely that SIATECSegment (DM10, Meredith, 2013) and motives_poly (NF2, Nieto & Farbood, 2013) will return a pattern rated as at least 75% similar. When we restrict attention to these successful discoveries and ask to what extent the algorithms can retrieve all exact and inexact occurrences, we find that SIATECSegment performs relatively well (see nonzero entries for the black line in Fig. 3), with the exception of the first patterns in pieces 1 and 2. We conclude, therefore, that the discovery of repeated sections has been addressed well by the current submissions, but that the discovery of themes and motifs requires more attention in future iterations of this task.

When assembling the ground truth, we found that a motif most often occurs as a subset of a theme or repeated section, which is not surprising given Drabkin’s (2001) definition of a motif as ‘the shortest subdivision of a theme or phrase that still maintains its identity as an idea’. One suggestion for future work is to apply a discovery algorithm to find repeated sections, and then apply the algorithm again to the output sections only, in order to retrieve these nested and musically important motifs.
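
A minimal sketch of that two-pass suggestion follows, assuming a hypothetical discover(notes) callable that returns each discovered pattern as a list of occurrences, with each occurrence a set of (ontime, pitch) pairs.

def discover_nested(notes, discover):
    """Two-pass idea: run a (hypothetical) discovery algorithm on the whole
    piece, then run it again on the note content of each discovered occurrence,
    so that motifs nested inside themes or repeated sections are also returned.
    `discover` maps a set of (ontime, pitch) pairs to a list of patterns, each
    pattern being a list of occurrences (sets of (ontime, pitch) pairs)."""
    top_level = discover(notes)
    nested = []
    for pattern in top_level:
        for occurrence in pattern:
            nested.extend(discover(occurrence))
    return top_level + nested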

Tabular Versions of Plots

symPoly

AlgId TaskVersion Piece n_P n_Q P_est R_est F1_est P_occ(c=.75) R_occ(c=.75) F_1occ(c=.75) P_3 R_3 TLF_1 runtime FRT FFTP_est FFP P_occ(c=.5) R_occ(c=.5) F_1occ(c=.5) P R F_1
NF2 symPoly piece1 5 5 0.240 0.222 0.231 0.000 0.000 0.000 0.143 0.142 0.142 15.000 0.000 0.222 0.143 0.000 0.000 0.000 0.000 0.000 0.000
NF2 symPoly piece2 5 27.000 0.277 0.446 0.342 0.793 0.793 0.793 0.078 0.253 0.120 1221.000 0.000 0.399 0.235 0.409 0.411 0.410 0.000 0.000 0.000
NF2 symPoly piece3 10.000 20.000 0.695 0.584 0.635 0.703 0.320 0.440 0.473 0.439 0.455 34.000 0.000 0.355 0.473 0.603 0.373 0.461 0.000 0.000 0.000
NF2 symPoly piece4 5 2 0.667 0.272 0.386 0.885 0.885 0.885 0.609 0.240 0.344 3.000 0.000 0.272 0.609 0.885 0.885 0.885 0.000 0.000 0.000
NF2 symPoly piece5 13.000 18.000 0.564 0.334 0.419 0.690 0.393 0.501 0.417 0.345 0.377 153.000 0.000 0.245 0.549 0.601 0.488 0.539 0.000 0.000 0.000
DM5 symPoly piece1 5 23.000 0.332 0.545 0.412 0.743 0.804 0.772 0.228 0.441 0.301 474.000 0.000 0.463 0.531 0.460 0.593 0.518 0.000 0.000 0.000
DM5 symPoly piece2 5 38.000 0.287 0.447 0.349 0.385 0.770 0.513 0.235 0.339 0.278 19896.000 0.000 0.274 0.238 0.328 0.770 0.460 0.000 0.000 0.000
DM5 symPoly piece3 10.000 10.000 0.330 0.423 0.371 0.000 0.000 0.000 0.285 0.365 0.320 762.000 0.000 0.288 0.370 0.318 0.471 0.379 0.000 0.000 0.000
DM5 symPoly piece4 5 4 0.400 0.349 0.372 0.000 0.000 0.000 0.243 0.187 0.211 14.000 0.000 0.349 0.243 0.238 0.195 0.214 0.000 0.000 0.000
DM5 symPoly piece5 13.000 33.000 0.305 0.300 0.303 0.693 0.805 0.745 0.301 0.328 0.314 36299.000 0.000 0.160 0.328 0.620 0.765 0.685 0.000 0.000 0.000
DM6 symPoly piece1 5 23.000 0.215 0.381 0.275 0.000 0.000 0.000 0.163 0.346 0.221 500.000 0.000 0.315 0.365 0.454 0.207 0.284 0.000 0.000 0.000
DM6 symPoly piece2 5 38.000 0.090 0.283 0.136 0.000 0.000 0.000 0.046 0.175 0.073 23294.000 0.000 0.207 0.102 0.100 0.033 0.050 0.000 0.000 0.000
DM6 symPoly piece3 10.000 10.000 0.145 0.164 0.154 0.000 0.000 0.000 0.152 0.204 0.174 771.000 0.000 0.143 0.209 0.000 0.000 0.000 0.000 0.000 0.000
DM6 symPoly piece4 5 4 0.321 0.234 0.271 0.000 0.000 0.000 0.165 0.127 0.143 13.000 0.000 0.234 0.165 0.250 0.125 0.167 0.000 0.000 0.000
DM6 symPoly piece5 13.000 33.000 0.119 0.189 0.146 0.000 0.000 0.000 0.084 0.193 0.117 37646.000 0.000 0.135 0.175 0.000 0.000 0.000 0.000 0.000 0.000
DM7 symPoly piece1 5 23.000 0.316 0.522 0.393 0.406 0.439 0.422 0.203 0.377 0.264 532.000 0.000 0.467 0.409 0.405 0.447 0.425 0.000 0.000 0.000
DM7 symPoly piece2 5 38.000 0.673 0.614 0.642 0.686 0.965 0.802 0.598 0.465 0.523 19926.000 0.000 0.368 0.562 0.633 0.935 0.755 0.000 0.000 0.000
DM7 symPoly piece3 10.000 10.000 0.742 0.612 0.671 0.429 0.632 0.511 0.608 0.497 0.547 783.000 0.000 0.400 0.715 0.458 0.627 0.529 0.000 0.000 0.000
DM7 symPoly piece4 5 4 0.555 0.311 0.398 0.000 0.000 0.000 0.457 0.247 0.321 14.000 0.000 0.311 0.457 0.370 0.486 0.420 0.000 0.000 0.000
DM7 symPoly piece5 13.000 33.000 0.656 0.401 0.498 0.794 0.934 0.859 0.656 0.405 0.501 35325.000 0.000 0.154 0.533 0.757 0.894 0.820 0.000 0.000 0.000
DM8 symPoly piece1 5 37.000 0.457 0.733 0.563 0.479 0.454 0.466 0.299 0.514 0.378 55.000 0.000 0.535 0.648 0.348 0.639 0.451 0.000 0.000 0.000
DM8 symPoly piece2 5 67.000 0.379 0.749 0.503 0.512 0.851 0.640 0.326 0.591 0.420 1319.000 0.000 0.223 0.211 0.401 0.826 0.540 0.000 0.000 0.000
DM8 symPoly piece3 10.000 20.000 0.425 0.488 0.454 0.547 0.834 0.661 0.385 0.417 0.401 77.000 0.000 0.324 0.337 0.475 0.693 0.563 0.000 0.000 0.000
DM8 symPoly piece4 5 21.000 0.399 0.636 0.491 0.358 0.632 0.457 0.276 0.370 0.316 3.000 0.000 0.426 0.307 0.265 0.431 0.328 0.000 0.000 0.000
DM8 symPoly piece5 13.000 69.000 0.463 0.461 0.462 0.648 0.838 0.731 0.417 0.416 0.417 3002.000 0.000 0.208 0.525 0.567 0.819 0.670 0.000 0.000 0.000
DM9 symPoly piece1 5 37.000 0.240 0.375 0.293 0.000 0.000 0.000 0.197 0.385 0.261 49.000 0.000 0.316 0.436 0.500 0.307 0.381 0.000 0.000 0.000
DM9 symPoly piece2 5 67.000 0.149 0.408 0.218 0.656 0.492 0.562 0.078 0.314 0.124 1335.000 0.000 0.199 0.129 0.656 0.329 0.438 0.000 0.000 0.000
DM9 symPoly piece3 10.000 20.000 0.177 0.225 0.198 0.000 0.000 0.000 0.125 0.201 0.154 91.000 0.000 0.152 0.146 0.406 0.317 0.356 0.000 0.000 0.000
DM9 symPoly piece4 5 21.000 0.321 0.446 0.373 0.444 0.375 0.407 0.180 0.256 0.212 2.000 0.000 0.293 0.284 0.243 0.356 0.289 0.000 0.000 0.000
DM9 symPoly piece5 13.000 69.000 0.138 0.257 0.179 0.000 0.000 0.000 0.095 0.293 0.143 2961.000 0.000 0.162 0.234 0.000 0.000 0.000 0.000 0.000 0.000
DM10 symPoly piece1 5 37.000 0.395 0.535 0.454 0.406 0.439 0.422 0.281 0.422 0.337 53.000 0.000 0.498 0.485 0.384 0.515 0.440 0.000 0.000 0.000
DM10 symPoly piece2 5 67.000 0.621 0.785 0.693 0.556 0.948 0.701 0.531 0.609 0.568 1287.000 0.000 0.313 0.313 0.512 0.917 0.657 0.000 0.000 0.000
DM10 symPoly piece3 10.000 20.000 0.670 0.557 0.608 0.592 0.831 0.691 0.641 0.474 0.545 89.000 0.000 0.330 0.751 0.509 0.782 0.617 0.000 0.000 0.000
DM10 symPoly piece4 5 21.000 0.503 0.508 0.505 0.472 0.941 0.628 0.368 0.326 0.346 3.000 0.000 0.415 0.556 0.306 0.726 0.430 0.000 0.000 0.000
DM10 symPoly piece5 13.000 69.000 0.678 0.530 0.595 0.643 0.897 0.749 0.631 0.448 0.524 3108.000 0.000 0.214 0.652 0.565 0.887 0.690 0.000 0.000 0.000

download these results as csv

Table 2. Tabular version of Figures 4-13.
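
To work with these results programmatically, something like the following sketch can be used. It assumes the CSV obtained from the link above keeps the column headers shown in Table 2; the file name here is hypothetical.

import pandas as pd

# Hypothetical file name; the real file comes from the "download these results
# as csv" link above and keeps the column headers shown in Table 2.
results = pd.read_csv("symPoly_results.csv")

# Mean establishment F1 and runtime per algorithm, across the five pieces.
summary = (results.groupby("AlgId")[["F1_est", "runtime"]]
           .mean()
           .sort_values("F1_est", ascending=False))
print(summary)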


AlgId TaskVersion Piece n_P (each entry below is followed by three rows of per-pattern values: R_est, then R_occ(c=.75), then R_occ(c=.5))
NF2 symPoly piece1 5
0.266 0.422 0.109 0.074 0.238
0.00000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000
NF2 symPoly piece2 5
0.444 0.319 0.571 0.795 0.099
0.00000 0.000 0.000 0.793 0.000
0.00000 0.000 0.000 0.793 0.000
NF2 symPoly piece3 10.000
0.951 0.628 0.987 0.486 0.407 0.787 0.235 0.641 0.446 0.271
0.170 0.000 0.305 0.000 0.000 0.390 0.000 0.000 0.000 0.000
0.170 0.000 0.305 0.000 0.000 0.390 0.000 0.000 0.000 0.000
NF2 symPoly piece4 5
0.276 0.000 0.000 0.182 0.902
0.00000 0.000 0.000 0.000 0.885
0.00000 0.000 0.000 0.000 0.885
NF2 symPoly piece5 13.000
0.372 0.173 0.759 0.634 0.072 0.071 0.020 0.026 0.025 0.805 0.321 0.578 0.481
0.00000 0.000 0.380 0.000 0.000 0.000 0.000 0.000 0.000 0.396 0.000 0.000 0.000
0.00000 0.000 0.380 0.000 0.000 0.000 0.000 0.000 0.000 0.396 0.000 0.000 0.000
DM5 symPoly piece1 5
0.824 0.737 0.385 0.111 0.667
0.804 0.000 0.000 0.000 0.000
0.804 0.000 0.000 0.000 0.000
DM5 symPoly piece2 5
0.357 0.340 0.357 0.788 0.390
0.00000 0.000 0.000 0.770 0.000
0.00000 0.000 0.000 0.770 0.000
DM5 symPoly piece3 10.000
0.480 0.190 0.696 0.327 0.527 0.394 0.111 0.419 0.560 0.529
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
DM5 symPoly piece4 5
0.545 0.138 0.222 0.444 0.394
0.00000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000
DM5 symPoly piece5 13.000
0.133 0.069 0.603 0.384 0.143 0.210 0.076 0.140 0.066 0.335 0.087 0.831 0.829
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.826 0.765
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.826 0.765
DM6 symPoly piece1 5
0.412 0.526 0.429 0.111 0.429
0.00000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000
DM6 symPoly piece2 5
0.455 0.213 0.667 0.057 0.024
0.00000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000
DM6 symPoly piece3 10.000
0.415 0.061 0.269 0.143 0.110 0.227 0.062 0.043 0.120 0.188
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
DM6 symPoly piece4 5
0.250 0.143 0.200 0.500 0.078
0.00000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000
DM6 symPoly piece5 13.000
0.172 0.231 0.073 0.047 0.114 0.328 0.250 0.091 0.400 0.462 0.095 0.054 0.145
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
DM7 symPoly piece1 5
0.941 0.583 0.273 0.200 0.611
0.439 0.000 0.000 0.000 0.000
0.439 0.000 0.000 0.000 0.000
DM7 symPoly piece2 5
0.600 0.151 0.357 0.998 0.961
0.00000 0.000 0.000 0.980 0.957
0.00000 0.000 0.000 0.980 0.957
DM7 symPoly piece3 10.000
0.529 0.763 0.953 0.351 0.735 0.468 0.093 0.882 0.833 0.516
0.00000 0.675 0.578 0.000 0.000 0.000 0.000 0.772 0.278 0.000
0.00000 0.675 0.578 0.000 0.000 0.000 0.000 0.772 0.278 0.000
DM7 symPoly piece4 5
0.522 0.158 0.111 0.125 0.637
0.00000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000
DM7 symPoly piece5 13.000
0.113 0.065 0.903 0.972 0.138 0.205 0.067 0.133 0.050 0.724 0.077 0.956 0.812
0.00000 0.000 0.872 0.970 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.942 0.757
0.00000 0.000 0.872 0.970 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.942 0.757
DM8 symPoly piece1 5
0.824 0.929 0.625 0.556 0.733
0.804 0.337 0.000 0.000 0.000
0.804 0.337 0.000 0.000 0.000
DM8 symPoly piece2 5
0.900 0.518 0.556 0.927 0.847
0.569 0.000 0.000 0.908 0.847
0.569 0.000 0.000 0.908 0.847
DM8 symPoly piece3 10.000
0.854 0.620 0.716 0.541 0.391 0.362 0.097 0.701 0.373 0.224
0.834 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.834 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
DM8 symPoly piece4 5
0.750 0.500 0.667 0.500 0.765
0.500 0.000 0.000 0.000 0.765
0.500 0.000 0.000 0.000 0.765
DM8 symPoly piece5 13.000
0.286 0.308 0.882 0.692 0.223 0.232 0.107 0.182 0.071 0.918 0.301 0.933 0.852
0.00000 0.000 0.863 0.000 0.000 0.000 0.000 0.000 0.000 0.902 0.000 0.928 0.769
0.00000 0.000 0.863 0.000 0.000 0.000 0.000 0.000 0.000 0.902 0.000 0.928 0.769
DM9 symPoly piece1 5
0.412 0.368 0.375 0.222 0.500
0.00000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000
DM9 symPoly piece2 5
0.800 0.426 0.625 0.132 0.058
0.492 0.000 0.000 0.000 0.000
0.492 0.000 0.000 0.000 0.000
DM9 symPoly piece3 10.000
0.415 0.052 0.256 0.550 0.088 0.091 0.200 0.145 0.231 0.222
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
DM9 symPoly piece4 5
0.333 0.429 0.600 0.750 0.118
0.00000 0.000 0.000 0.375 0.000
0.00000 0.000 0.000 0.375 0.000
DM9 symPoly piece5 13.000
0.322 0.308 0.112 0.084 0.402 0.211 0.250 0.375 0.297 0.163 0.313 0.275 0.222
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
DM10 symPoly piece1 5
0.941 0.619 0.286 0.217 0.611
0.439 0.000 0.000 0.000 0.000
0.439 0.000 0.000 0.000 0.000
DM10 symPoly piece2 5
0.900 0.516 0.556 1.000 0.952
0.413 0.000 0.000 0.982 0.944
0.413 0.000 0.000 0.982 0.944
DM10 symPoly piece3 10.000
0.851 0.964 0.852 0.500 0.601 0.318 0.070 0.821 0.366 0.225
0.847 0.804 0.843 0.000 0.000 0.000 0.000 0.821 0.000 0.000
0.847 0.804 0.843 0.000 0.000 0.000 0.000 0.821 0.000 0.000
DM10 symPoly piece4 5
0.556 0.429 0.364 0.250 0.941
0.00000 0.000 0.000 0.000 0.941
0.00000 0.000 0.000 0.000 0.941
DM10 symPoly piece5 13.000
0.619 0.368 0.973 0.964 0.213 0.209 0.069 0.431 0.047 0.918 0.300 0.965 0.808
0.00000 0.000 0.953 0.956 0.000 0.000 0.000 0.000 0.000 0.912 0.000 0.953 0.750
0.00000 0.000 0.953 0.956 0.000 0.000 0.000 0.000 0.000 0.912 0.000 0.953 0.750

download these results as csv

Table 3. Tabular version of Figures 2 and 3.

symMono

(Poor performance here for algorithm NF1 on piece 2 is likely due to rounding errors in the discovery phase rather than anything musically interesting. The task captain tried in vain to identify a workaround.)

AlgId TaskVersion Piece n_P n_Q P_est R_est F1_est P_occ(c=.75) R_occ(c=.75) F_1occ(c=.75) P_3 R_3 TLF_1 runtime FRT FFTP_est FFP P_occ(c=.5) R_occ(c=.5) F_1occ(c=.5) P R F_1
NF1 symMono piece1 5 16.000 0.608 0.430 0.504 0.528 0.154 0.238 0.200 0.205 0.203 92.000 0.000 0.420 0.207 0.521 0.154 0.237 0.000 0.000 0.000
NF1 symMono piece2 5 8 0.029 0.023 0.026 0.000 0.000 0.000 0.015 0.014 0.015 326.000 0.000 0.023 0.022 0.000 0.000 0.000 0.000 0.000 0.000
NF1 symMono piece3 10.000 12.000 0.618 0.454 0.524 0.754 0.408 0.530 0.455 0.374 0.411 19.000 0.000 0.344 0.453 0.576 0.335 0.424 0.000 0.000 0.000
NF1 symMono piece4 8 26.000 0.602 0.781 0.680 0.693 0.498 0.580 0.401 0.598 0.480 20.000 0.000 0.429 0.444 0.601 0.449 0.514 0.038 0.125 0.059
NF1 symMono piece5 13.000 14.000 0.505 0.423 0.460 0.969 0.969 0.969 0.448 0.381 0.412 78.000 0.000 0.191 0.421 0.681 0.439 0.534 0.000 0.000 0.000
DM5 symMono piece1 5 16.000 0.324 0.522 0.400 0.706 0.565 0.628 0.248 0.441 0.317 723.000 0.000 0.307 0.302 0.638 0.568 0.601 0.000 0.000 0.000
DM5 symMono piece2 5 19.000 0.195 0.365 0.254 0.000 0.000 0.000 0.205 0.362 0.262 2083.000 0.000 0.207 0.230 0.600 0.300 0.400 0.000 0.000 0.000
DM5 symMono piece3 10.000 7 0.548 0.522 0.534 0.929 0.929 0.929 0.545 0.515 0.530 26.000 0.000 0.522 0.638 0.760 0.609 0.676 0.143 0.100 0.118
DM5 symMono piece4 8 5 0.495 0.466 0.480 0.889 0.667 0.762 0.408 0.302 0.347 22.000 0.000 0.466 0.408 0.446 0.491 0.467 0.200 0.125 0.154
DM5 symMono piece5 13.000 23.000 0.318 0.306 0.312 0.773 0.781 0.777 0.306 0.329 0.317 5089.000 0.000 0.268 0.474 0.645 0.736 0.688 0.000 0.000 0.000
DM6 symMono piece1 5 16.000 0.235 0.481 0.315 0.815 0.544 0.652 0.194 0.428 0.267 725.000 0.000 0.236 0.254 0.558 0.505 0.530 0.000 0.000 0.000
DM6 symMono piece2 5 19.000 0.135 0.266 0.179 0.000 0.000 0.000 0.099 0.251 0.142 2171.000 0.000 0.185 0.153 0.600 0.300 0.400 0.000 0.000 0.000
DM6 symMono piece3 10.000 7 0.544 0.476 0.508 0.929 0.929 0.929 0.515 0.495 0.505 27.000 0.000 0.476 0.638 0.701 0.537 0.608 0.143 0.100 0.118
DM6 symMono piece4 8 5 0.557 0.385 0.455 0.694 0.528 0.600 0.392 0.263 0.315 22.000 0.000 0.385 0.392 0.694 0.528 0.600 0.000 0.000 0.000
DM6 symMono piece5 13.000 23.000 0.131 0.232 0.167 0.000 0.000 0.000 0.100 0.274 0.147 4868.000 0.000 0.205 0.290 0.588 0.588 0.588 0.000 0.000 0.000
DM7 symMono piece1 5 16.000 0.306 0.522 0.385 0.733 0.590 0.653 0.241 0.437 0.311 720.000 0.000 0.301 0.297 0.658 0.587 0.620 0.000 0.000 0.000
DM7 symMono piece2 5 19.000 0.571 0.571 0.571 0.666 0.904 0.767 0.532 0.504 0.518 2184.000 0.000 0.352 0.453 0.645 0.819 0.721 0.000 0.000 0.000
DM7 symMono piece3 10.000 7 0.725 0.620 0.668 0.807 0.757 0.781 0.683 0.586 0.631 28.000 0.000 0.594 0.793 0.748 0.623 0.680 0.143 0.100 0.118
DM7 symMono piece4 8 5 0.620 0.587 0.603 0.376 0.657 0.478 0.410 0.351 0.378 23.000 0.000 0.587 0.410 0.311 0.546 0.396 0.200 0.125 0.154
DM7 symMono piece5 13.000 23.000 0.631 0.335 0.437 0.810 0.894 0.850 0.628 0.369 0.465 5129.000 0.000 0.308 0.692 0.785 0.856 0.819 0.000 0.000 0.000
DM8 symMono piece1 5 35.000 0.401 0.609 0.484 0.572 0.669 0.617 0.279 0.513 0.362 86.000 0.000 0.409 0.601 0.438 0.666 0.529 0.000 0.000 0.000
DM8 symMono piece2 5 37.000 0.292 0.634 0.400 0.480 0.331 0.392 0.254 0.486 0.334 261.000 0.000 0.429 0.293 0.449 0.484 0.466 0.000 0.000 0.000
DM8 symMono piece3 10.000 12.000 0.545 0.631 0.585 0.778 0.429 0.553 0.455 0.485 0.469 6.000 0.000 0.404 0.452 0.531 0.435 0.478 0.083 0.100 0.091
DM8 symMono piece4 8 20.000 0.345 0.532 0.418 0.889 0.667 0.762 0.244 0.331 0.281 3.000 0.000 0.532 0.454 0.335 0.508 0.404 0.050 0.125 0.071
DM8 symMono piece5 13.000 54.000 0.503 0.379 0.433 0.599 0.772 0.674 0.456 0.347 0.394 617.000 0.000 0.275 0.693 0.497 0.748 0.597 0.000 0.000 0.000
DM9 symMono piece1 5 35.000 0.250 0.529 0.340 0.778 0.424 0.549 0.208 0.454 0.286 87.000 0.000 0.313 0.512 0.481 0.446 0.463 0.000 0.000 0.000
DM9 symMono piece2 5 37.000 0.222 0.510 0.309 0.539 0.385 0.450 0.129 0.380 0.192 192.000 0.000 0.462 0.385 0.545 0.342 0.421 0.000 0.000 0.000
DM9 symMono piece3 10.000 12.000 0.478 0.544 0.509 0.917 0.465 0.617 0.357 0.460 0.402 6.000 0.000 0.324 0.443 0.645 0.493 0.559 0.000 0.000 0.000
DM9 symMono piece4 8 20.000 0.341 0.421 0.377 0.611 0.528 0.566 0.203 0.313 0.246 2.000 0.000 0.403 0.437 0.498 0.527 0.512 0.000 0.000 0.000
DM9 symMono piece5 13.000 54.000 0.158 0.254 0.195 0.000 0.000 0.000 0.111 0.298 0.162 593.000 0.000 0.192 0.346 0.588 0.588 0.588 0.000 0.000 0.000
DM10 symMono piece1 5 35.000 0.423 0.642 0.510 0.507 0.717 0.594 0.295 0.530 0.379 87.000 0.000 0.436 0.617 0.453 0.749 0.564 0.000 0.000 0.000
DM10 symMono piece2 5 37.000 0.545 0.799 0.648 0.574 0.817 0.674 0.454 0.569 0.505 249.000 0.000 0.592 0.480 0.528 0.807 0.638 0.000 0.000 0.000
DM10 symMono piece3 10.000 12.000 0.724 0.696 0.710 0.615 0.577 0.596 0.590 0.516 0.550 6.000 0.000 0.451 0.628 0.560 0.592 0.576 0.083 0.100 0.091
DM10 symMono piece4 8 20.000 0.397 0.671 0.499 0.398 0.804 0.532 0.269 0.390 0.319 4.000 0.000 0.671 0.505 0.309 0.713 0.431 0.050 0.125 0.071
DM10 symMono piece5 13.000 54.000 0.637 0.432 0.515 0.589 0.919 0.718 0.612 0.389 0.476 461.000 0.000 0.305 0.804 0.535 0.870 0.663 0.000 0.000 0.000

download these results as csv

Table 4. Tabular version of Figures 16-25.


AlgId TaskVersion Piece n_P (each entry below is followed by three rows of per-pattern values: R_est, then R_occ(c=.75), then R_occ(c=.5))
NF1 symMono piece1 5
0.235 0.900 0.600 0.000 0.417
0.00000 0.154 0.000 0.000 0.000
0.00000 0.154 0.000 0.000 0.000
NF1 symMono piece2 5
0.111 0.000 0.000 0.004 0.002
0.00000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000
NF1 symMono piece3 10.000
0.533 0.512 0.778 0.500 0.500 0.526 0.200 0.667 0.226 0.103
0.00000 0.000 0.408 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.00000 0.000 0.408 0.000 0.000 0.000 0.000 0.000 0.000 0.000
NF1 symMono piece4 8
0.857 0.556 0.500 1.000 0.867 0.571 0.900 1.000
0.321 0.000 0.000 0.219 0.433 0.000 0.771 1.000
0.321 0.000 0.000 0.219 0.433 0.000 0.771 1.000
NF1 symMono piece5 13.000
0.318 0.256 0.375 0.969 0.316 0.426 0.170 0.044 0.054 0.637 0.565 0.706 0.665
0.00000 0.000 0.000 0.969 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.969 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
DM5 symMono piece1 5
0.588 0.895 1.000 0.049 0.080
0.00000 0.576 0.544 0.000 0.000
0.00000 0.576 0.544 0.000 0.000
DM5 symMono piece2 5
0.600 0.438 0.167 0.324 0.295
0.00000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000
DM5 symMono piece3 10.000
0.600 0.186 0.577 0.600 0.474 0.421 0.308 0.194 0.857 1.000
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.857 1.000
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.857 1.000
DM5 symMono piece4 8
1.00000 0.333 0.267 0.500 0.467 0.286 0.333 0.538
0.667 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.667 0.000 0.000 0.000 0.000 0.000 0.000 0.000
DM5 symMono piece5 13.000
0.087 0.100 0.498 0.560 0.233 0.158 0.273 0.056 0.050 0.272 0.096 0.824 0.765
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.824 0.738
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.824 0.738
DM6 symMono piece1 5
0.588 0.526 1.000 0.222 0.067
0.00000 0.000 0.544 0.000 0.000
0.00000 0.000 0.544 0.000 0.000
DM6 symMono piece2 5
0.600 0.375 0.250 0.072 0.033
0.00000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000
DM6 symMono piece3 10.000
0.600 0.116 0.577 0.600 0.105 0.211 0.500 0.194 0.857 1.000
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.857 1.000
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.857 1.000
DM6 symMono piece4 8
0.833 0.429 0.200 0.750 0.200 0.214 0.222 0.231
0.625 0.000 0.000 0.479 0.000 0.000 0.000 0.000
0.625 0.000 0.000 0.479 0.000 0.000 0.000 0.000
DM6 symMono piece5 13.000
0.275 0.182 0.099 0.064 0.588 0.250 0.250 0.167 0.200 0.086 0.370 0.273 0.208
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
DM7 symMono piece1 5
0.588 0.947 1.000 0.027 0.046
0.00000 0.613 0.544 0.000 0.000
0.00000 0.613 0.544 0.000 0.000
DM7 symMono piece2 5
0.600 0.438 0.167 0.722 0.927
0.00000 0.000 0.000 0.000 0.904
0.00000 0.000 0.000 0.000 0.904
DM7 symMono piece3 10.000
0.600 0.728 0.833 0.600 0.864 0.432 0.091 0.194 0.857 1.000
0.00000 0.000 0.500 0.000 0.670 0.000 0.000 0.000 0.857 1.000
0.00000 0.000 0.500 0.000 0.670 0.000 0.000 0.000 0.857 1.000
DM7 symMono piece4 8
1.00000 0.259 0.200 0.286 0.875 0.619 0.600 0.857
0.750 0.000 0.000 0.000 0.625 0.000 0.000 0.643
0.750 0.000 0.000 0.000 0.625 0.000 0.000 0.643
DM7 symMono piece5 13.000
0.086 0.081 0.789 0.986 0.227 0.144 0.045 0.032 0.039 0.279 0.088 0.840 0.713
0.00000 0.000 0.529 0.981 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.840 0.000
0.00000 0.000 0.529 0.981 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.840 0.000
DM8 symMono piece1 5
0.824 0.800 0.455 0.300 0.667
0.808 0.576 0.000 0.000 0.000
0.808 0.576 0.000 0.000 0.000
DM8 symMono piece2 5
0.900 0.438 0.800 0.700 0.330
0.569 0.000 0.093 0.000 0.000
0.569 0.000 0.093 0.000 0.000
DM8 symMono piece3 10.000
0.600 0.504 0.577 1.000 0.615 0.524 0.308 0.516 1.000 0.667
0.00000 0.000 0.000 0.190 0.000 0.000 0.000 0.000 0.667 0.000
0.00000 0.000 0.000 0.190 0.000 0.000 0.000 0.000 0.667 0.000
DM8 symMono piece4 8
1.00000 0.571 0.267 0.500 0.533 0.438 0.333 0.615
0.667 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.667 0.000 0.000 0.000 0.000 0.000 0.000 0.000
DM8 symMono piece5 13.000
0.095 0.154 0.674 0.519 0.299 0.200 0.084 0.084 0.093 0.795 0.284 0.834 0.818
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.476 0.000 0.834 0.780
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.476 0.000 0.834 0.780
DM9 symMono piece1 5
0.588 0.778 0.500 0.444 0.333
0.00000 0.424 0.000 0.000 0.000
0.00000 0.424 0.000 0.000 0.000
DM9 symMono piece2 5
0.800 0.762 0.800 0.128 0.059
0.492 0.571 0.093 0.000 0.000
0.492 0.571 0.093 0.000 0.000
DM9 symMono piece3 10.000
0.600 0.202 0.846 1.000 0.289 0.526 0.500 0.161 0.714 0.600
0.00000 0.000 0.716 0.214 0.000 0.000 0.000 0.000 0.000 0.000
0.00000 0.000 0.716 0.214 0.000 0.000 0.000 0.000 0.000 0.000
DM9 symMono piece4 8
0.833 0.571 0.200 0.750 0.267 0.214 0.222 0.308
0.625 0.000 0.000 0.479 0.000 0.000 0.000 0.000
0.625 0.000 0.000 0.479 0.000 0.000 0.000 0.000
DM9 symMono piece5 13.000
0.333 0.158 0.099 0.064 0.588 0.273 0.250 0.222 0.176 0.286 0.370 0.273 0.208
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
DM10 symMono piece1 5
0.882 0.857 0.333 0.273 0.867
0.866 0.613 0.000 0.000 0.778
0.866 0.613 0.000 0.000 0.778
DM10 symMono piece2 5
0.900 0.438 0.800 0.928 0.929
0.569 0.000 0.093 0.900 0.908
0.569 0.000 0.093 0.900 0.908
DM10 symMono piece3 10.000
0.600 0.860 0.821 1.000 0.864 0.475 0.100 0.575 1.000 0.667
0.00000 0.767 0.498 0.190 0.670 0.000 0.000 0.000 0.667 0.000
0.00000 0.767 0.498 0.190 0.670 0.000 0.000 0.000 0.667 0.000
DM10 symMono piece4 8
1.00000 0.857 0.200 0.364 0.875 0.619 0.600 0.857
0.750 0.643 0.000 0.000 0.867 0.000 0.000 0.846
0.750 0.643 0.000 0.000 0.867 0.000 0.000 0.846
DM10 symMono piece5 13.000
0.086 0.080 0.945 0.985 0.283 0.206 0.082 0.075 0.083 0.835 0.292 0.914 0.749
0.00000 0.000 0.940 0.983 0.000 0.000 0.000 0.000 0.000 0.504 0.000 0.914 0.000
0.00000 0.000 0.940 0.983 0.000 0.000 0.000 0.000 0.000 0.504 0.000 0.914 0.000

download these results as csv

Table 5. Tabular version of Figures 14 and 15.

audPoly

AlgId TaskVersion Piece n_P n_Q P_est R_est F1_est P_occ(c=.75) R_occ(c=.75) F_1occ(c=.75) P_3 R_3 TLF_1 runtime FRT FFTP_est FFP P_occ(c=.5) R_occ(c=.5) F_1occ(c=.5) P R F_1
NF4 audPoly piece1 5 1 0.323 0.126 0.181 0.000 0.000 0.000 0.100 0.032 0.048 7.000 0.000 0.126 0.100 0.000 0.000 0.000 0.000 0.000 0.000
NF4 audPoly piece2 5 105.000 0.241 0.304 0.269 0.333 0.050 0.087 0.058 0.103 0.074 29.000 0.000 0.222 0.090 0.337 0.082 0.132 0.000 0.000 0.000
NF4 audPoly piece3 10.000 23.000 0.277 0.262 0.269 0.000 0.000 0.000 0.148 0.172 0.159 10.000 0.000 0.203 0.218 0.000 0.000 0.000 0.000 0.000 0.000
NF4 audPoly piece4 5 1 0.294 0.112 0.162 0.000 0.000 0.000 0.442 0.127 0.197 2.000 0.000 0.112 0.442 0.000 0.000 0.000 0.000 0.000 0.000
NF4 audPoly piece5 13.000 24.000 0.288 0.304 0.296 0.000 0.000 0.000 0.130 0.159 0.143 135.000 0.000 0.227 0.163 0.000 0.000 0.000 0.000 0.000 0.000

download these results as csv

Table 6. Tabular version of Figures 28-37.


AlgId TaskVersion Piece n_P (each entry below is followed by three rows of per-pattern values: R_est, then R_occ(c=.75), then R_occ(c=.5))
NF4 audPoly piece1 5
0.323 0.000 0.308 0.000 0.000
0.00000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000
NF4 audPoly piece2 5
0.800 0.277 0.333 0.084 0.026
0.050 0.000 0.000 0.000 0.000
0.050 0.000 0.000 0.000 0.000
NF4 audPoly piece3 10.000
0.341 0.205 0.333 0.273 0.242 0.250 0.200 0.230 0.292 0.250
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
NF4 audPoly piece4 5
0.00000 0.000 0.000 0.267 0.294
0.00000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000
NF4 audPoly piece5 13.000
0.379 0.242 0.271 0.280 0.314 0.429 0.121 0.188 0.333 0.375 0.381 0.351 0.290
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000

download these results as csv

Table 7. Tabular version of Figures 26 and 27.

audMono

AlgId TaskVersion Piece n_P n_Q P_est R_est F1_est P_occ(c=.75) R_occ(c=.75) F_1occ(c=.75) P_3 R_3 TLF_1 runtime FRT FFTP_est FFP P_occ(c=.5) R_occ(c=.5) F_1occ(c=.5) P R F_1
NF3 audMono piece1 5 41.000 0.506 0.586 0.543 0.384 0.119 0.182 0.124 0.219 0.158 135.000 0.000 0.428 0.154 0.362 0.122 0.182 0.000 0.000 0.000
NF3 audMono piece2 5 19.000 0.344 0.422 0.379 0.406 0.109 0.172 0.097 0.110 0.103 21.000 0.000 0.274 0.104 0.392 0.088 0.143 0.000 0.000 0.000
NF3 audMono piece3 10.000 7 0.637 0.437 0.519 0.482 0.167 0.248 0.275 0.220 0.244 7.000 0.000 0.314 0.263 0.440 0.138 0.210 0.000 0.000 0.000
NF3 audMono piece4 8 10.000 0.523 0.375 0.437 0.767 0.167 0.274 0.579 0.386 0.463 20.000 0.000 0.238 0.594 0.601 0.360 0.451 0.000 0.000 0.000
NF3 audMono piece5 13.000 1 0.432 0.102 0.165 0.000 0.000 0.000 0.148 0.045 0.069 122.000 0.000 0.102 0.148 0.000 0.000 0.000 0.000 0.000 0.000

download these results as csv

Table 8. Tabular version of Figures 40-49.


AlgId TaskVersion Piece n_P (each entry below is followed by three rows of per-pattern values: R_est, then R_occ(c=.75), then R_occ(c=.5))
NF3 audMono piece1 5
0.850 0.862 0.312 0.500 0.405
0.097 0.133 0.000 0.000 0.000
0.097 0.133 0.000 0.000 0.000
NF3 audMono piece2 5
0.727 0.875 0.333 0.124 0.049
0.00000 0.109 0.000 0.000 0.000
0.00000 0.109 0.000 0.000 0.000
NF3 audMono piece3 10.000
0.929 0.109 0.519 0.545 0.289 0.579 0.250 0.484 0.467 0.200
0.167 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.167 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
NF3 audMono piece4 8
0.00000 0.000 0.000 0.800 0.533 0.571 0.556 0.538
0.00000 0.000 0.000 0.167 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.167 0.000 0.000 0.000 0.000
NF3 audMono piece5 13.000
0.00000 0.000 0.077 0.000 0.381 0.432 0.108 0.000 0.000 0.000 0.000 0.225 0.100
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
0.00000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000

download these results as csv

Table 9. Tabular version of Figures 38 and 39.

References