2008:Multiple Fundamental Frequency Estimation & Tracking Results

From MIREX Wiki

Introduction

These are the results for the 2008 running of the Multiple Fundamental Frequency Estimation and Tracking task. For background information about this task, please refer to the 2008:Multiple Fundamental Frequency Estimation & Tracking page.

General Legend

Team ID

CL1 = C. Cao, M. Li 1
CL2 = C. Cao, M. Li 2
DRD = J-L. Durrieu, G. Richard, B. David
EOS = K. Egashira, N. Ono, S. Sagayama
EBD1 = V. Emiya, R. Badeau, B. David 1
EBD2 = V. Emiya, R. Badeau, B. David 2
MG = M. Groble
PI1 = A. Pertusa, J. M. Iñesta 1
PI2 = A. Pertusa, J. M. Iñesta 2
RFF1 = G. Reis, F. Fernandez, A. Ferreira 1
RFF2 = G. Reis, F. Fernandez, A. Ferreira 2
RK = M. Ryynänen, A. Klapuri
VBB = E. Vincent, N. Bertin, R. Badeau
YRC1 = C. Yeh, A. Roebel, W-C. Chang 1
YRC2 = C. Yeh, A. Roebel, W-C. Chang 2
ZR1 = R. Zhou, J. D. Reiss 1
ZR2 = R. Zhou, J. D. Reiss 2
ZR3 = R. Zhou, J. D. Reiss 3

Overall Summary Results Task 1

Below are the average scores across 36 test files. The files were organized into 9 groups of 4 files each, with polyphony ranging from 2 to 5; 28 were real recordings and 8 were synthesized from RWC samples.

CL1 CL2 DRD EBD1 EBD2 EOS MG PI1 PI2 RFF1 RFF2 RK VBB YRC1 YRC2
Accuracy 0.358 0.487 0.495 0.447 0.452 0.467 0.427 0.596 0.618 0.211 0.183 0.613 0.54 0.619 0.665
Accuracy Chroma 0.395 0.519 0.557 0.504 0.501 0.55 0.501 0.639 0.657 0.269 0.228 0.658 0.569 0.655 0.687


Detailed Results

Precision Recall Accuracy Etot Esubs Emiss Efa
CL1 0.358 0.763 0.358 1.68 0.236 0.001 1.443
CL2 0.671 0.56 0.487 0.598 0.148 0.292 0.158
DRD 0.541 0.66 0.495 0.731 0.245 0.096 0.391
EBD1 0.674 0.498 0.447 0.629 0.161 0.341 0.127
EBD2 0.713 0.493 0.452 0.599 0.146 0.362 0.092
EOS 0.591 0.546 0.467 0.649 0.21 0.244 0.194
MG 0.481 0.57 0.427 0.816 0.298 0.133 0.385
PI1 0.824 0.625 0.596 0.429 0.101 0.275 0.053
PI2 0.832 0.647 0.618 0.406 0.096 0.257 0.053
RFF1 0.506 0.226 0.211 0.854 0.183 0.601 0.071
RFF2 0.509 0.191 0.183 0.857 0.155 0.656 0.047
RK 0.698 0.719 0.613 0.464 0.151 0.13 0.183
VBB 0.714 0.615 0.54 0.544 0.118 0.267 0.159
YRC1 0.698 0.741 0.619 0.477 0.129 0.129 0.218
YRC2 0.741 0.78 0.665 0.426 0.108 0.127 0.19


Detailed Chroma Results

Here, accuracy is assessed on chroma results (i.e., all F0s are mapped to a single octave before evaluation).

Precision Recall Accuracy Etot Esubs Emiss Efa
CL1 0.395 0.837 0.395 1.606 0.162 0.001 1.443
CL2 0.716 0.596 0.519 0.562 0.112 0.292 0.158
DRD 0.608 0.744 0.557 0.647 0.16 0.096 0.391
EBD1 0.76 0.564 0.504 0.564 0.096 0.341 0.127
EBD2 0.788 0.548 0.501 0.544 0.09 0.362 0.092
EOS 0.697 0.644 0.55 0.55 0.112 0.244 0.194
MG 0.563 0.673 0.501 0.713 0.195 0.133 0.385
PI1 0.881 0.671 0.639 0.383 0.055 0.275 0.053
PI2 0.884 0.689 0.657 0.364 0.054 0.257 0.053
RFF1 0.639 0.289 0.269 0.79 0.119 0.601 0.071
RFF2 0.634 0.239 0.228 0.808 0.106 0.656 0.047
RK 0.75 0.771 0.658 0.412 0.098 0.13 0.183
VBB 0.754 0.648 0.569 0.511 0.085 0.267 0.159
YRC1 0.741 0.786 0.655 0.432 0.085 0.129 0.218
YRC2 0.768 0.808 0.687 0.398 0.08 0.127 0.19
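The octave folding used for the chroma scores above can be sketched in a few lines (an illustrative snippet, not the official evaluation code; `f0_to_chroma` is a made-up name):

```python
import math

def f0_to_chroma(f0_hz: float) -> int:
    """Map an F0 in Hz to a chroma class 0-11 (C = 0), folding octaves together."""
    midi = 69 + 12 * math.log2(f0_hz / 440.0)  # Hz -> fractional MIDI note number
    return int(round(midi)) % 12               # collapse all octaves into one

# A4 (440 Hz) and A5 (880 Hz) land in the same chroma class,
# so an octave error no longer counts against the system.
print(f0_to_chroma(440.0), f0_to_chroma(880.0))  # -> 9 9
```

This is why every system's chroma accuracy is at least as high as its plain accuracy: octave errors become substitution-free matches.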


Individual Results Files for Task 1: Scores per Query

CL1 = C. Cao, M. Li 1
CL2 = C. Cao, M. Li 2
DRD = J-L. Durrieu, G. Richard, B. David
EBD1 = V. Emiya, R. Badeau, B. David 1
EBD2 = V. Emiya, R. Badeau, B. David 2
EOS = K. Egashira, N. Ono, S. Sagayama
MG = M. Groble
PI1 = A. Pertusa, J. M. Iñesta 1
PI2 = A. Pertusa, J. M. Iñesta 2
RFF1 = G. Reis, F. Fernandez, A. Ferreira 1
RFF2 = G. Reis, F. Fernandez, A. Ferreira 2
RK = M. Ryynänen, A. Klapuri
VBB = E. Vincent, N. Bertin, R. Badeau
YRC1 = C. Yeh, A. Roebel, W-C. Chang 1
YRC2 = C. Yeh, A. Roebel, W-C. Chang 2

Info About Filenames

The filenames starting with part* come from acoustic woodwind recordings; the ones starting with RWC are synthesized. The legend for the instrument abbreviations is:

bs = bassoon, cl = clarinet, fl = flute, hn = horn, ob = oboe, vl = violin, cel = cello, gtr = guitar, sax = saxophone, bass = electric bass guitar

Run Times

CL1 CL2 DRD EBD1 EBD2 EOS MG PI1 PI2 RFF1 RFF2 RK VBB YRC1 YRC2
Run Time (sec) 2430 2475 14502 18180 22270 9328 99 955 792 73784 70041 5058 2081 57483 57483


MG ran on a Mac; all other systems ran on ALE Nodes.

Overall Summary Results Task 2

This subtask is evaluated in two different ways. In the first setup, a returned note is counted as correct if its onset is within ±50 ms of a reference note's onset and its F0 is within ± a quarter tone of the corresponding reference F0; the returned offset values are ignored. In the second setup, in addition to the above requirements, a correct returned note must also have an offset within 20% of the reference note's duration around the reference offset, or within 50 ms of it, whichever is larger.
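Under these tolerances, the note-matching decision can be sketched as follows (a minimal illustration of the stated criteria, not the official scoring code; `note_is_correct` and its argument names are made up for this sketch):

```python
import math

def note_is_correct(ret_onset, ret_f0, ref_onset, ref_f0,
                    ret_offset=None, ref_offset=None, check_offset=False):
    """Check one returned note against one reference note (times in s, F0 in Hz)."""
    # First setup: onset within +-50 ms of the reference onset...
    if abs(ret_onset - ref_onset) > 0.05:
        return False
    # ...and F0 within a quarter tone (50 cents) of the reference F0.
    if abs(1200 * math.log2(ret_f0 / ref_f0)) > 50:
        return False
    # Second setup additionally checks the offset: tolerance is 20% of the
    # reference note's duration or 50 ms, whichever is larger.
    if check_offset:
        tol = max(0.2 * (ref_offset - ref_onset), 0.05)
        if abs(ret_offset - ref_offset) > tol:
            return False
    return True

print(note_is_correct(1.02, 440.0, 1.0, 442.0))  # -> True (about 8 cents, 20 ms off)
print(note_is_correct(1.10, 440.0, 1.0, 440.0))  # -> False (onset 100 ms off)
```

The second setup is strictly harder to satisfy, which is why the Onset-Offset F-measures below are consistently lower than the Onset Only ones.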

A total of 30 files were used in this task: 16 real recordings, 8 synthesized from RWC samples, and 6 piano. The results below are the average of these 30 files.

EBD1 EBD2 EOS PI1 PI2 RFF1 RFF2 RK VBB YRC ZR1 ZR2 ZR3
Ave. F-measure (Onset-Offset) 0.176 0.158 0.236 0.247 0.192 0.028 0.032 0.337 0.197 0.355 0.261 0.263 0.278
Ave. F-measure (Onset-Offset Chroma) 0.189 0.169 0.268 0.251 0.195 0.038 0.042 0.352 0.208 0.362 0.297 0.3 0.313
Ave. F-measure (Onset Only) 0.417 0.384 0.503 0.47 0.396 0.14 0.132 0.614 0.521 0.552 0.518 0.52 0.53
Ave. F-measure (Onset Only Chroma) 0.47 0.429 0.561 0.52 0.446 0.177 0.168 0.655 0.547 0.576 0.575 0.577 0.586


Detailed Results

Precision Recall Ave. F-measure Ave. Overlap
EBD1 0.165 0.200 0.176 0.865
EBD2 0.153 0.178 0.158 0.845
EOS 0.228 0.255 0.236 0.856
PI1 0.201 0.333 0.247 0.862
PI2 0.145 0.301 0.192 0.854
RFF1 0.034 0.025 0.028 0.683
RFF2 0.037 0.030 0.032 0.645
RK 0.312 0.382 0.337 0.884
VBB 0.162 0.268 0.197 0.829
YRC 0.307 0.442 0.355 0.890
ZR1 0.233 0.303 0.261 0.875
ZR2 0.236 0.306 0.263 0.874
ZR3 0.256 0.314 0.278 0.874


Detailed Chroma Results

Here, accuracy is assessed on chroma results (i.e., all F0s are mapped to a single octave before evaluation).

Precision Recall Ave. F-measure Ave. Overlap
EBD1 0.178 0.215 0.189 0.862
EBD2 0.163 0.192 0.169 0.848
EOS 0.259 0.292 0.268 0.855
PI1 0.204 0.338 0.251 0.856
PI2 0.147 0.308 0.195 0.848
RFF1 0.046 0.033 0.038 0.794
RFF2 0.050 0.039 0.042 0.757
RK 0.324 0.398 0.352 0.884
VBB 0.171 0.283 0.208 0.850
YRC 0.314 0.451 0.362 0.889
ZR1 0.266 0.344 0.297 0.874
ZR2 0.269 0.349 0.300 0.874
ZR3 0.287 0.352 0.313 0.874


Results Based on Onset Only

Precision Recall Ave. F-measure Ave. Overlap
EBD1 0.379 0.494 0.417 0.559
EBD2 0.356 0.455 0.384 0.561
EOS 0.482 0.553 0.503 0.688
PI1 0.385 0.626 0.470 0.661
PI2 0.297 0.624 0.396 0.612
RFF1 0.170 0.123 0.140 0.428
RFF2 0.158 0.121 0.132 0.474
RK 0.578 0.678 0.614 0.699
VBB 0.439 0.679 0.521 0.622
YRC 0.471 0.698 0.552 0.734
ZR1 0.466 0.602 0.518 0.696
ZR2 0.467 0.604 0.520 0.697
ZR3 0.486 0.600 0.530 0.701


Chroma Results Based on Onset Only

Precision Recall Ave. F-measure Ave. Overlap
EBD1 0.426 0.559 0.470 0.539
EBD2 0.396 0.513 0.429 0.530
EOS 0.536 0.617 0.561 0.676
PI1 0.426 0.694 0.520 0.605
PI2 0.334 0.705 0.446 0.519
RFF1 0.217 0.155 0.177 0.435
RFF2 0.203 0.155 0.168 0.478
RK 0.616 0.724 0.655 0.683
VBB 0.462 0.710 0.547 0.607
YRC 0.493 0.729 0.576 0.707
ZR1 0.516 0.667 0.575 0.694
ZR2 0.519 0.671 0.577 0.696
ZR3 0.537 0.664 0.586 0.702


Piano Subset Results Based on Onset Only

Precision Recall Ave. F-measure Ave. Overlap
EBD1 0.649 0.639 0.631 0.606
EBD2 0.622 0.541 0.569 0.610
EOS 0.541 0.539 0.527 0.550
PI1 0.390 0.654 0.487 0.544
PI2 0.279 0.704 0.398 0.486
RFF1 0.305 0.239 0.268 0.401
RFF2 0.256 0.235 0.245 0.452
RK 0.720 0.669 0.692 0.606
VBB 0.535 0.733 0.609 0.531
YRC 0.389 0.709 0.499 0.543
ZR1 0.738 0.777 0.757 0.577
ZR2 0.734 0.775 0.754 0.583
ZR3 0.743 0.744 0.743 0.605


Individual Results Files for Task 2

EBD1 = V. Emiya, R. Badeau, B. David 1
EBD2 = V. Emiya, R. Badeau, B. David 2
EOS = K. Egashira, N. Ono, S. Sagayama
PI1 = A. Pertusa, J. M. Iñesta 1
PI2 = A. Pertusa, J. M. Iñesta 2
RFF1 = G. Reis, F. Fernandez, A. Ferreira 1
RFF2 = G. Reis, F. Fernandez, A. Ferreira 2
RK = M. Ryynänen, A. Klapuri
VBB = E. Vincent, N. Bertin, R. Badeau
YRC = C. Yeh, A. Roebel, W-C. Chang
ZR1 = R. Zhou, J. D. Reiss 1
ZR2 = R. Zhou, J. D. Reiss 2
ZR3 = R. Zhou, J. D. Reiss 3

Info About Filenames

The filenames starting with part* come from acoustic woodwind recordings; the ones starting with RWC are synthesized. The piano files are: RA_C030_align.wav, bach_847TESTp.wav, beet_pathetique_3TESTp.wav, mz_333_1TESTp.wav, scn_4TESTp.wav.note, ty_januarTESTp.wav.note

Run Times

EBD1 EBD2 EOS PI1 PI2 RFF1 RFF2 RK VBB YRC ZR1 ZR2 ZR3
Run Time (sec) 18180 22270 9328 950 790 73718 71360 5044 2058 57483 1415 1415 871


ZR1, ZR2, and ZR3 ran on BLACK. All other systems ran on ALE Nodes.

Friedman's Test for Significant Differences

Task 1

The Friedman test was run in MATLAB to test for significant differences among systems with respect to performance (accuracy) on individual files.

Friedman's Anova Table
Source SS df MS Chi-sq Prob>Chi-sq
Columns 7516.72 14 536.909 375.84 0
Error 2563.28 490 5.231
Total 10080 539
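For readers without MATLAB, the Friedman statistic itself is straightforward to reproduce; below is a pure-Python sketch (simplified, with no correction for tied scores; `friedman_statistic` is an illustrative name, not the routine used for these results):

```python
def friedman_statistic(scores):
    """Friedman chi-square for an n-files x k-systems score matrix (no tie handling)."""
    n, k = len(scores), len(scores[0])
    # Rank the k systems within each file (rank 1 = lowest score).
    rank_sums = [0.0] * k
    for row in scores:
        for rank, j in enumerate(sorted(range(k), key=lambda j: row[j]), start=1):
            rank_sums[j] += rank
    mean_ranks = [s / n for s in rank_sums]
    # Chi-square statistic: 12n/(k(k+1)) times the sum of squared deviations
    # of the mean ranks from their expected value (k+1)/2.
    return 12 * n / (k * (k + 1)) * sum((r - (k + 1) / 2) ** 2 for r in mean_ranks)

# Toy example: three files, three systems, same ordering in every file.
print(friedman_statistic([[0.2, 0.5, 0.7],
                          [0.1, 0.4, 0.6],
                          [0.3, 0.5, 0.8]]))  # -> 6.0
```

With the Task 1 data (36 files, 15 systems) the statistic is compared against a chi-square distribution with 14 degrees of freedom, matching the Columns row in the ANOVA table above.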


Tukey-Kramer HSD Multi-Comparison
TeamID TeamID Lowerbound Mean Upperbound Significance
CL1 CL2 -6.9913 -3.4167 0.1580 FALSE
CL1 DRD -7.3802 -3.8056 -0.2309 TRUE
CL1 EBD1 -5.3802 -1.8056 1.7691 FALSE
CL1 EBD2 -5.7691 -2.1944 1.3802 FALSE
CL1 EOS -6.0191 -2.4444 1.1302 FALSE
CL1 MG -5.1302 -1.5556 2.0191 FALSE
CL1 PI1 -10.2136 -6.6389 -3.0642 TRUE
CL1 PI2 -11.6580 -8.0833 -4.5087 TRUE
CL1 RFF1 -1.1580 2.4167 5.9913 FALSE
CL1 RFF2 -0.5747 3.0000 6.5747 FALSE
CL1 RK -11.8524 -8.2778 -4.7031 TRUE
CL1 VBB -9.0747 -5.5000 -1.9253 TRUE
CL1 YRC1 -11.2691 -7.6944 -4.1198 TRUE
CL1 YRC2 -12.9913 -9.4167 -5.8420 TRUE
CL2 DRD -3.9636 -0.3889 3.1858 FALSE
CL2 EBD1 -1.9636 1.6111 5.1858 FALSE
CL2 EBD2 -2.3524 1.2222 4.7969 FALSE
CL2 EOS -2.6024 0.9722 4.5469 FALSE
CL2 MG -1.7136 1.8611 5.4358 FALSE
CL2 PI1 -6.7969 -3.2222 0.3524 FALSE
CL2 PI2 -8.2413 -4.6667 -1.0920 TRUE
CL2 RFF1 2.2587 5.8333 9.4080 TRUE
CL2 RFF2 2.8420 6.4167 9.9913 TRUE
CL2 RK -8.4358 -4.8611 -1.2864 TRUE
CL2 VBB -5.6580 -2.0833 1.4913 FALSE
CL2 YRC1 -7.8524 -4.2778 -0.7031 TRUE
CL2 YRC2 -9.5747 -6.0000 -2.4253 TRUE
DRD EBD1 -1.5747 2.0000 5.5747 FALSE
DRD EBD2 -1.9636 1.6111 5.1858 FALSE
DRD EOS -2.2136 1.3611 4.9358 FALSE
DRD MG -1.3247 2.2500 5.8247 FALSE
DRD PI1 -6.4080 -2.8333 0.7413 FALSE
DRD PI2 -7.8524 -4.2778 -0.7031 TRUE
DRD RFF1 2.6476 6.2222 9.7969 TRUE
DRD RFF2 3.2309 6.8056 10.3802 TRUE
DRD RK -8.0469 -4.4722 -0.8976 TRUE
DRD VBB -5.2691 -1.6944 1.8802 FALSE
DRD YRC1 -7.4636 -3.8889 -0.3142 TRUE
DRD YRC2 -9.1858 -5.6111 -2.0364 TRUE
EBD1 EBD2 -3.9636 -0.3889 3.1858 FALSE
EBD1 EOS -4.2136 -0.6389 2.9358 FALSE
EBD1 MG -3.3247 0.2500 3.8247 FALSE
EBD1 PI1 -8.4080 -4.8333 -1.2587 TRUE
EBD1 PI2 -9.8524 -6.2778 -2.7031 TRUE
EBD1 RFF1 0.6476 4.2222 7.7969 TRUE
EBD1 RFF2 1.2309 4.8056 8.3802 TRUE
EBD1 RK -10.0469 -6.4722 -2.8976 TRUE
EBD1 VBB -7.2691 -3.6944 -0.1198 TRUE
EBD1 YRC1 -9.4636 -5.8889 -2.3142 TRUE
EBD1 YRC2 -11.1858 -7.6111 -4.0364 TRUE
EBD2 EOS -3.8247 -0.2500 3.3247 FALSE
EBD2 MG -2.9358 0.6389 4.2136 FALSE
EBD2 PI1 -8.0191 -4.4444 -0.8698 TRUE
EBD2 PI2 -9.4636 -5.8889 -2.3142 TRUE
EBD2 RFF1 1.0364 4.6111 8.1858 TRUE
EBD2 RFF2 1.6198 5.1944 8.7691 TRUE
EBD2 RK -9.6580 -6.0833 -2.5087 TRUE
EBD2 VBB -6.8802 -3.3056 0.2691 FALSE
EBD2 YRC1 -9.0747 -5.5000 -1.9253 TRUE
EBD2 YRC2 -10.7969 -7.2222 -3.6476 TRUE
EOS MG -2.6858 0.8889 4.4636 FALSE
EOS PI1 -7.7691 -4.1944 -0.6198 TRUE
EOS PI2 -9.2136 -5.6389 -2.0642 TRUE
EOS RFF1 1.2864 4.8611 8.4358 TRUE
EOS RFF2 1.8698 5.4444 9.0191 TRUE
EOS RK -9.4080 -5.8333 -2.2587 TRUE
EOS VBB -6.6302 -3.0556 0.5191 FALSE
EOS YRC1 -8.8247 -5.2500 -1.6753 TRUE
EOS YRC2 -10.5469 -6.9722 -3.3976 TRUE
MG PI1 -8.6580 -5.0833 -1.5087 TRUE
MG PI2 -10.1024 -6.5278 -2.9531 TRUE
MG RFF1 0.3976 3.9722 7.5469 TRUE
MG RFF2 0.9809 4.5556 8.1302 TRUE
MG RK -10.2969 -6.7222 -3.1476 TRUE
MG VBB -7.5191 -3.9444 -0.3698 TRUE
MG YRC1 -9.7136 -6.1389 -2.5642 TRUE
MG YRC2 -11.4358 -7.8611 -4.2864 TRUE
PI1 PI2 -5.0191 -1.4444 2.1302 FALSE
PI1 RFF1 5.4809 9.0556 12.6302 TRUE
PI1 RFF2 6.0642 9.6389 13.2136 TRUE
PI1 RK -5.2136 -1.6389 1.9358 FALSE
PI1 VBB -2.4358 1.1389 4.7136 FALSE
PI1 YRC1 -4.6302 -1.0556 2.5191 FALSE
PI1 YRC2 -6.3524 -2.7778 0.7969 FALSE
PI2 RFF1 6.9253 10.5000 14.0747 TRUE
PI2 RFF2 7.5087 11.0833 14.6580 TRUE
PI2 RK -3.7691 -0.1944 3.3802 FALSE
PI2 VBB -0.9913 2.5833 6.1580 FALSE
PI2 YRC1 -3.1858 0.3889 3.9636 FALSE
PI2 YRC2 -4.9080 -1.3333 2.2413 FALSE
RFF1 RFF2 -2.9913 0.5833 4.1580 FALSE
RFF1 RK -14.2691 -10.6944 -7.1198 TRUE
RFF1 VBB -11.4913 -7.9167 -4.3420 TRUE
RFF1 YRC1 -13.6858 -10.1111 -6.5364 TRUE
RFF1 YRC2 -15.4080 -11.8333 -8.2587 TRUE
RFF2 RK -14.8524 -11.2778 -7.7031 TRUE
RFF2 VBB -12.0747 -8.5000 -4.9253 TRUE
RFF2 YRC1 -14.2691 -10.6944 -7.1198 TRUE
RFF2 YRC2 -15.9913 -12.4167 -8.8420 TRUE
RK VBB -0.7969 2.7778 6.3524 FALSE
RK YRC1 -2.9913 0.5833 4.1580 FALSE
RK YRC2 -4.7136 -1.1389 2.4358 FALSE
VBB YRC1 -5.7691 -2.1944 1.3802 FALSE
VBB YRC2 -7.4913 -3.9167 -0.3420 TRUE
YRC1 YRC2 -5.2969 -1.7222 1.8524 FALSE


[Figure: Tukey-Kramer HSD comparison plot for Task 1 (2008 multif0.task1.friedman.png)]

Task 2

The Friedman test was run in MATLAB to test for significant differences among systems with respect to performance (accuracy) on individual files.

Friedman's Anova Table
Source SS df MS Chi-sq Prob>Chi-sq
Columns 3106.67 12 258.889 204.91 0
Error 2351.33 348 6.757
Total 5458 389


Tukey-Kramer HSD Multi-Comparison
TeamID TeamID Lowerbound Mean Upperbound Significance
EBD1 EBD2 -2.6305 0.7000 4.0305 FALSE
EBD1 EOS -5.8305 -2.5000 0.8305 FALSE
EBD1 PI1 -5.5305 -2.2000 1.1305 FALSE
EBD1 PI2 -3.2305 0.1000 3.4305 FALSE
EBD1 RFF1 0.7695 4.1000 7.4305 TRUE
EBD1 RFF2 0.7529 4.0833 7.4138 TRUE
EBD1 RK -8.6471 -5.3167 -1.9862 TRUE
EBD1 VBB -3.7138 -0.3833 2.9471 FALSE
EBD1 YRC -7.8805 -4.5500 -1.2195 TRUE
EBD1 ZR1 -5.9971 -2.6667 0.6638 FALSE
EBD1 ZR2 -5.9638 -2.6333 0.6971 FALSE
EBD1 ZR3 -6.7971 -3.4667 -0.1362 TRUE
EBD2 EOS -6.5305 -3.2000 0.1305 FALSE
EBD2 PI1 -6.2305 -2.9000 0.4305 FALSE
EBD2 PI2 -3.9305 -0.6000 2.7305 FALSE
EBD2 RFF1 0.0695 3.4000 6.7305 TRUE
EBD2 RFF2 0.0529 3.3833 6.7138 TRUE
EBD2 RK -9.3471 -6.0167 -2.6862 TRUE
EBD2 VBB -4.4138 -1.0833 2.2471 FALSE
EBD2 YRC -8.5805 -5.2500 -1.9195 TRUE
EBD2 ZR1 -6.6971 -3.3667 -0.0362 TRUE
EBD2 ZR2 -6.6638 -3.3333 -0.0029 TRUE
EBD2 ZR3 -7.4971 -4.1667 -0.8362 TRUE
EOS PI1 -3.0305 0.3000 3.6305 FALSE
EOS PI2 -0.7305 2.6000 5.9305 FALSE
EOS RFF1 3.2695 6.6000 9.9305 TRUE
EOS RFF2 3.2529 6.5833 9.9138 TRUE
EOS RK -6.1471 -2.8167 0.5138 FALSE
EOS VBB -1.2138 2.1167 5.4471 FALSE
EOS YRC -5.3805 -2.0500 1.2805 FALSE
EOS ZR1 -3.4971 -0.1667 3.1638 FALSE
EOS ZR2 -3.4638 -0.1333 3.1971 FALSE
EOS ZR3 -4.2971 -0.9667 2.3638 FALSE
PI1 PI2 -1.0305 2.3000 5.6305 FALSE
PI1 RFF1 2.9695 6.3000 9.6305 TRUE
PI1 RFF2 2.9529 6.2833 9.6138 TRUE
PI1 RK -6.4471 -3.1167 0.2138 FALSE
PI1 VBB -1.5138 1.8167 5.1471 FALSE
PI1 YRC -5.6805 -2.3500 0.9805 FALSE
PI1 ZR1 -3.7971 -0.4667 2.8638 FALSE
PI1 ZR2 -3.7638 -0.4333 2.8971 FALSE
PI1 ZR3 -4.5971 -1.2667 2.0638 FALSE
PI2 RFF1 0.6695 4.0000 7.3305 TRUE
PI2 RFF2 0.6529 3.9833 7.3138 TRUE
PI2 RK -8.7471 -5.4167 -2.0862 TRUE
PI2 VBB -3.8138 -0.4833 2.8471 FALSE
PI2 YRC -7.9805 -4.6500 -1.3195 TRUE
PI2 ZR1 -6.0971 -2.7667 0.5638 FALSE
PI2 ZR2 -6.0638 -2.7333 0.5971 FALSE
PI2 ZR3 -6.8971 -3.5667 -0.2362 TRUE
RFF1 RFF2 -3.3471 -0.0167 3.3138 FALSE
RFF1 RK -12.7471 -9.4167 -6.0862 TRUE
RFF1 VBB -7.8138 -4.4833 -1.1529 TRUE
RFF1 YRC -11.9805 -8.6500 -5.3195 TRUE
RFF1 ZR1 -10.0971 -6.7667 -3.4362 TRUE
RFF1 ZR2 -10.0638 -6.7333 -3.4029 TRUE
RFF1 ZR3 -10.8971 -7.5667 -4.2362 TRUE
RFF2 RK -12.7305 -9.4000 -6.0695 TRUE
RFF2 VBB -7.7971 -4.4667 -1.1362 TRUE
RFF2 YRC -11.9638 -8.6333 -5.3029 TRUE
RFF2 ZR1 -10.0805 -6.7500 -3.4195 TRUE
RFF2 ZR2 -10.0471 -6.7167 -3.3862 TRUE
RFF2 ZR3 -10.8805 -7.5500 -4.2195 TRUE
RK VBB 1.6029 4.9333 8.2638 TRUE
RK YRC -2.5638 0.7667 4.0971 FALSE
RK ZR1 -0.6805 2.6500 5.9805 FALSE
RK ZR2 -0.6471 2.6833 6.0138 FALSE
RK ZR3 -1.4805 1.8500 5.1805 FALSE
VBB YRC -7.4971 -4.1667 -0.8362 TRUE
VBB ZR1 -5.6138 -2.2833 1.0471 FALSE
VBB ZR2 -5.5805 -2.2500 1.0805 FALSE
VBB ZR3 -6.4138 -3.0833 0.2471 FALSE
YRC ZR1 -1.4471 1.8833 5.2138 FALSE
YRC ZR2 -1.4138 1.9167 5.2471 FALSE
YRC ZR3 -2.2471 1.0833 4.4138 FALSE
ZR1 ZR2 -3.2971 0.0333 3.3638 FALSE
ZR1 ZR3 -4.1305 -0.8000 2.5305 FALSE
ZR2 ZR3 -4.1638 -0.8333 2.4971 FALSE


[Figure: Tukey-Kramer HSD comparison plot for Task 2 (2008 multif0.task2.friedman.png)]