2009:Audio Tag Classification (Mood Set) Results
Introduction
These are the results for the 2009 running of the Audio Tag Classification (Mood Set) task. For background information about this task, please refer to the 2009:Audio_Tag_Classification page. The data set was created by Xiao Hu and consists of 3,469 unique songs and 135 mood tags organized into 18 mood tag groups.
Mood tags
The tags were collected from [last.fm](http://last.fm). All tags in this set are mood-related, as identified and grouped by [WordNet-Affect](http://wndomains.fbk.eu/wnaffect.html) and human experts.
Each mood tag group contains the following tags:
* G12: calm, comfort, quiet, serene, mellow, chill out, calm down, calming, chillout, comforting, content, cool down, mellow music, mellow rock, peace of mind, quietness, relaxation, serenity, solace, soothe, soothing, still, tranquil, tranquility, tranquillity
* G15: sad, sadness, unhappy, melancholic, melancholy, feeling sad, mood: sad – slightly, sad song
* G5: happy, happiness, happy songs, happy music, glad, mood: happy
* G32: romantic, romantic music
* G2: upbeat, gleeful, high spirits, zest, enthusiastic, buoyancy, elation, mood: upbeat
* G16: depressed, blue, dark, depressive, dreary, gloom, darkness, depress, depression, depressing, gloomy
* G28: anger, angry, choleric, fury, outraged, rage, angry music
* G17: grief, heartbreak, mournful, sorrow, sorry, doleful, heartache, heartbreaking, heartsick, lachrymose, mourning, plaintive, regret, sorrowful
* G14: dreamy
* G6: cheerful, cheer up, festive, jolly, jovial, merry, cheer, cheering, cheery, get happy, rejoice, songs that are cheerful, sunny
* G8: brooding, contemplative, meditative, reflective, broody, pensive, pondering, wistful
* G29: aggression, aggressive
* G25: angst, anxiety, anxious, jumpy, nervous, angsty
* G9: confident, encouraging, encouragement, optimism, optimistic
* G7: desire, hope, hopeful, mood: hopeful
* G11: earnest, heartfelt
* G31: pessimism, cynical, pessimistic, weltschmerz, cynical/sarcastic
* G1: excitement, exciting, exhilarating, thrill, ardor, stimulating, thrilling, titillating
For details on the mood tag groups, please see X. Hu, J. S. Downie, and A. Ehmann (2009), **[Lyric Text Mining in Music Mood Classification](https://www.music-ir.org/archive/papers/ISMIR2009_MoodClassification.pdf)**, in Proceedings of the 10th International Society for Music Information Retrieval Conference (ISMIR 2009), Kobe, Japan, October 2009.
Data
The songs are Western pop songs, mostly from the USPOP collection. Each song may belong to multiple mood tag groups. The main criterion for song selection was: if more than one tag in a group was applied to a song, or if one tag in a group was applied more than once to the song, that song was marked as belonging to the group (a sketch of this rule follows below).
For details on how the songs were selected, please see the [Mood multi-tag data description](https://www.music-ir.org/archive/papers/Mood_Multi_Tag_Data_Description.pdf).
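To make the selection rule concrete, here is a minimal sketch in Python. The function name, the tag-count input, and the tag-to-group mapping are hypothetical illustrations, not part of the released data or tools.

```python
from collections import Counter

# Hypothetical illustration of the group-membership rule described above;
# not part of the released data set or tools.
def song_groups(tag_counts: Counter, tag_to_group: dict) -> set:
    """Return the mood tag groups a song is marked as belonging to.

    tag_counts: how many times each raw last.fm tag was applied to the song.
    tag_to_group: maps each raw tag to its mood tag group (e.g. "calm" -> "G12").
    """
    counts_per_group = {}
    for tag, count in tag_counts.items():
        group = tag_to_group.get(tag)
        if group is not None:
            counts_per_group.setdefault(group, []).append(count)
    return {
        group
        for group, counts in counts_per_group.items()
        # more than one tag from the group, or one tag applied more than once
        if len(counts) > 1 or counts[0] > 1
    }
```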
Audio format: 30-second clips, 44.1 kHz, stereo, 16-bit WAV files. The data were split into three folds with artist filtering, so that no artist appears in more than one fold (see the sketch below).
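A minimal sketch of what an artist-filtered split looks like; this is an assumed illustration of the general approach, not the actual MIREX fold assignment.

```python
import hashlib

# Sketch of an artist-filtered split: every clip by the same artist is
# assigned to the same fold, so no artist crosses a train/test boundary.
# Illustration only; not the actual MIREX fold assignment.
def artist_fold(artist: str, n_folds: int = 3) -> int:
    digest = hashlib.md5(artist.encode("utf-8")).hexdigest()
    return int(digest, 16) % n_folds

clips = [("clip_001", "Artist A"), ("clip_002", "Artist A"), ("clip_003", "Artist B")]
folds = {i: [] for i in range(3)}
for clip_id, artist in clips:
    folds[artist_fold(artist)].append(clip_id)
```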
General Legend
Team ID
BP1 = [Juan José Burred, Geoffroy Peeters](https://www.music-ir.org/mirex/abstracts/2009/BP_train_tag.pdf)
BP2 = [Juan José Burred, Geoffroy Peeters](https://www.music-ir.org/mirex/abstracts/2009/BP_train_tag.pdf)
CC1 = [Chuan Cao, Ming Li](https://www.music-ir.org/mirex/abstracts/2009/CC.pdf)
CC2 = [Chuan Cao, Ming Li](https://www.music-ir.org/mirex/abstracts/2009/CC.pdf)
CC3 = [Chuan Cao, Ming Li](https://www.music-ir.org/mirex/abstracts/2009/CC.pdf)
CC4 = [Chuan Cao, Ming Li](https://www.music-ir.org/mirex/abstracts/2009/CC.pdf)
GP = [Geoffroy Peeters](https://www.music-ir.org/mirex/abstracts/2009/Peeters_2009_MIREX_classification.pdf)
GT1 = [George Tzanetakis](https://www.music-ir.org/mirex/abstracts/2009/GTfinal.pdf)
GT2 = [George Tzanetakis](https://www.music-ir.org/mirex/abstracts/2009/GTfinal.pdf)
HCB = [Matthew D. Hoffman, David M. Blei, Perry R. Cook](https://www.music-ir.org/mirex/abstracts/2009/HBC.pdf)
LWW1 = [Hung-Yi Lo, Ju-Chiang Wang, Hsin-Min Wang](https://www.music-ir.org/mirex/abstracts/2009/LWW.pdf)
LWW2 = [Hung-Yi Lo, Ju-Chiang Wang, Hsin-Min Wang](https://www.music-ir.org/mirex/abstracts/2009/LWW.pdf)
Overall Summary Results (Binary)
Measure | BP1 | BP2 | CC1 | CC2 | CC3 | CC4 | GP | GT1 | GT2 | HCB | LWW1 | LWW2 |
---|---|---|---|---|---|---|---|---|---|---|---|---|
Average Tag F-measure | 0.195 | 0.193 | 0.172 | 0.180 | 0.147 | 0.183 | 0.084 | 0.211 | 0.209 | 0.063 | 0.204 | 0.219 |
Average Tag Accuracy | 0.837 | 0.829 | 0.878 | 0.882 | 0.882 | 0.862 | 0.863 | 0.823 | 0.824 | 0.909 | 0.882 | 0.887 |
Average Positive Tag Accuracy | 0.287 | 0.296 | 0.201 | 0.210 | 0.151 | 0.234 | 0.098 | 0.318 | 0.314 | 0.057 | 0.204 | 0.220 |
Average Negative Tag Accuracy | 0.818 | 0.802 | 0.894 | 0.894 | 0.919 | 0.870 | 0.951 | 0.810 | 0.811 | 0.979 | 0.923 | 0.926 |
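For reference, here is a hedged sketch of how the four binary measures above can be computed for a single tag. It assumes the standard definitions implied by the measure names (positive example accuracy is accuracy restricted to clips that truly carry the tag, negative example accuracy to clips that do not); this is my reading, not the MIREX evaluator code.

```python
import numpy as np

# Sketch of the four binary measures for one tag, given boolean ground-truth
# and prediction vectors over all clips. Assumes the standard definitions
# implied by the measure names; not the MIREX evaluator itself.
def binary_tag_measures(truth: np.ndarray, pred: np.ndarray) -> dict:
    tp = int(np.sum(truth & pred))
    fp = int(np.sum(~truth & pred))
    fn = int(np.sum(truth & ~pred))
    tn = int(np.sum(~truth & ~pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall else 0.0)
    return {
        "f_measure": f_measure,
        "accuracy": (tp + tn) / truth.size,
        "positive_example_accuracy": recall,  # accuracy on clips that carry the tag
        "negative_example_accuracy": tn / (tn + fp) if tn + fp else 0.0,  # on clips that do not
    }
```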
Summary Binary Relevance F-Measure (Average Across All Folds)
Tag | Positive Examples | Negative Examples | BP1 | BP2 | CC1 | CC2 | CC3 | CC4 | GP | GT1 | GT2 | HCB | LWW1 | LWW2 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
g9 | 61.000 | 3404.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.033 | 0.022 | 0.000 | 0.033 | 0.017 |
g7 | 45.000 | 3420.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.043 | 0.000 | 0.015 | 0.000 | 0.023 | 0.000 |
g8 | 116.000 | 3349.000 | 0.023 | 0.011 | 0.000 | 0.000 | 0.017 | 0.037 | 0.027 | 0.052 | 0.029 | 0.000 | 0.086 | 0.052 |
g11 | 40.000 | 3425.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.013 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 |
g29 | 115.000 | 3350.000 | 0.297 | 0.313 | 0.171 | 0.167 | 0.070 | 0.110 | 0.145 | 0.267 | 0.255 | 0.000 | 0.294 | 0.335 |
g16 | 470.000 | 2995.000 | 0.280 | 0.249 | 0.210 | 0.186 | 0.159 | 0.224 | 0.036 | 0.217 | 0.231 | 0.000 | 0.276 | 0.291 |
g17 | 183.000 | 3282.000 | 0.054 | 0.050 | 0.020 | 0.029 | 0.019 | 0.044 | 0.043 | 0.157 | 0.149 | 0.000 | 0.060 | 0.076 |
g28 | 254.000 | 3211.000 | 0.391 | 0.403 | 0.391 | 0.364 | 0.282 | 0.317 | 0.184 | 0.298 | 0.312 | 0.000 | 0.346 | 0.400 |
g25 | 80.000 | 3385.000 | 0.000 | 0.000 | 0.026 | 0.000 | 0.022 | 0.023 | 0.011 | 0.085 | 0.042 | 0.000 | 0.050 | 0.051 |
g12 | 1678.000 | 1787.000 | 0.665 | 0.656 | 0.677 | 0.693 | 0.642 | 0.687 | 0.125 | 0.659 | 0.659 | 0.691 | 0.677 | 0.685 |
g14 | 146.000 | 3319.000 | 0.099 | 0.050 | 0.000 | 0.027 | 0.000 | 0.093 | 0.078 | 0.142 | 0.119 | 0.000 | 0.124 | 0.131 |
g15 | 1175.000 | 2290.000 | 0.573 | 0.547 | 0.527 | 0.591 | 0.473 | 0.571 | 0.296 | 0.577 | 0.581 | 0.440 | 0.581 | 0.601 |
g2 | 543.000 | 2922.000 | 0.334 | 0.324 | 0.353 | 0.369 | 0.333 | 0.368 | 0.114 | 0.374 | 0.374 | 0.007 | 0.321 | 0.361 |
g1 | 30.000 | 3435.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.044 | 0.022 | 0.000 | 0.000 | 0.000 |
g6 | 142.000 | 3323.000 | 0.018 | 0.064 | 0.034 | 0.000 | 0.037 | 0.030 | 0.096 | 0.057 | 0.070 | 0.000 | 0.057 | 0.120 |
g5 | 749.000 | 2716.000 | 0.404 | 0.396 | 0.422 | 0.426 | 0.390 | 0.425 | 0.106 | 0.395 | 0.403 | 0.000 | 0.393 | 0.426 |
g32 | 618.000 | 2847.000 | 0.370 | 0.403 | 0.268 | 0.384 | 0.204 | 0.370 | 0.140 | 0.377 | 0.385 | 0.000 | 0.345 | 0.378 |
g31 | 38.000 | 3427.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.054 | 0.070 | 0.089 | 0.000 | 0.000 | 0.026 |
Summary Binary Accuracy (Average Across All Folds)
Tag | Positive Examples | Negative Examples | BP1 | BP2 | CC1 | CC2 | CC3 | CC4 | GP | GT1 | GT2 | HCB | LWW1 | LWW2 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
g9 | 61.000 | 3404.000 | 0.979 | 0.980 | 0.982 | 0.982 | 0.982 | 0.982 | 0.972 | 0.950 | 0.949 | 0.982 | 0.966 | 0.965 |
g7 | 45.000 | 3420.000 | 0.987 | 0.986 | 0.987 | 0.987 | 0.987 | 0.987 | 0.971 | 0.962 | 0.962 | 0.987 | 0.975 | 0.974 |
g8 | 116.000 | 3349.000 | 0.957 | 0.958 | 0.966 | 0.965 | 0.966 | 0.957 | 0.945 | 0.906 | 0.903 | 0.966 | 0.939 | 0.936 |
g11 | 40.000 | 3425.000 | 0.988 | 0.988 | 0.988 | 0.988 | 0.988 | 0.988 | 0.970 | 0.966 | 0.966 | 0.988 | 0.977 | 0.977 |
g29 | 115.000 | 3350.000 | 0.955 | 0.953 | 0.944 | 0.964 | 0.949 | 0.949 | 0.956 | 0.927 | 0.926 | 0.967 | 0.953 | 0.956 |
g16 | 470.000 | 2995.000 | 0.710 | 0.720 | 0.820 | 0.811 | 0.824 | 0.737 | 0.852 | 0.681 | 0.687 | 0.864 | 0.803 | 0.808 |
g17 | 183.000 | 3282.000 | 0.919 | 0.935 | 0.945 | 0.942 | 0.942 | 0.908 | 0.914 | 0.866 | 0.865 | 0.947 | 0.900 | 0.902 |
g28 | 254.000 | 3211.000 | 0.916 | 0.903 | 0.892 | 0.913 | 0.895 | 0.885 | 0.906 | 0.846 | 0.849 | 0.926 | 0.905 | 0.912 |
g25 | 80.000 | 3385.000 | 0.968 | 0.973 | 0.976 | 0.977 | 0.975 | 0.974 | 0.963 | 0.937 | 0.934 | 0.977 | 0.956 | 0.956 |
g12 | 1678.000 | 1787.000 | 0.528 | 0.494 | 0.575 | 0.608 | 0.596 | 0.613 | 0.533 | 0.505 | 0.506 | 0.701 | 0.687 | 0.695 |
g14 | 146.000 | 3319.000 | 0.932 | 0.945 | 0.957 | 0.956 | 0.955 | 0.939 | 0.907 | 0.892 | 0.889 | 0.958 | 0.926 | 0.927 |
g15 | 1175.000 | 2290.000 | 0.529 | 0.453 | 0.602 | 0.622 | 0.632 | 0.609 | 0.691 | 0.570 | 0.573 | 0.712 | 0.716 | 0.730 |
g2 | 543.000 | 2922.000 | 0.679 | 0.637 | 0.751 | 0.774 | 0.777 | 0.727 | 0.829 | 0.706 | 0.706 | 0.844 | 0.788 | 0.800 |
g1 | 30.000 | 3435.000 | 0.991 | 0.990 | 0.991 | 0.991 | 0.991 | 0.991 | 0.977 | 0.976 | 0.975 | 0.991 | 0.982 | 0.982 |
g6 | 142.000 | 3323.000 | 0.951 | 0.955 | 0.951 | 0.957 | 0.947 | 0.935 | 0.932 | 0.885 | 0.886 | 0.959 | 0.923 | 0.928 |
g5 | 749.000 | 2716.000 | 0.509 | 0.451 | 0.706 | 0.703 | 0.718 | 0.661 | 0.782 | 0.607 | 0.613 | 0.784 | 0.738 | 0.752 |
g32 | 618.000 | 2847.000 | 0.582 | 0.616 | 0.776 | 0.738 | 0.770 | 0.682 | 0.797 | 0.666 | 0.670 | 0.822 | 0.766 | 0.778 |
g31 | 38.000 | 3427.000 | 0.985 | 0.985 | 0.988 | 0.989 | 0.987 | 0.988 | 0.632 | 0.970 | 0.971 | 0.989 | 0.978 | 0.979 |
Summary Positive Example Accuracy (Average Across All Folds)
Tag | Positive Examples | Negative Examples | BP1 | BP2 | CC1 | CC2 | CC3 | CC4 | GP | GT1 | GT2 | HCB | LWW1 | LWW2 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
g9 | 61.000 | 3404.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.050 | 0.034 | 0.000 | 0.034 | 0.019 |
g7 | 45.000 | 3420.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.070 | 0.000 | 0.021 | 0.000 | 0.022 | 0.000 |
g8 | 116.000 | 3349.000 | 0.019 | 0.008 | 0.000 | 0.000 | 0.009 | 0.028 | 0.030 | 0.080 | 0.043 | 0.000 | 0.082 | 0.054 |
g11 | 40.000 | 3425.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.022 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 |
g29 | 115.000 | 3350.000 | 0.285 | 0.324 | 0.170 | 0.128 | 0.063 | 0.108 | 0.126 | 0.396 | 0.379 | 0.000 | 0.293 | 0.334 |
g16 | 470.000 | 2995.000 | 0.430 | 0.349 | 0.180 | 0.162 | 0.125 | 0.290 | 0.022 | 0.332 | 0.353 | 0.000 | 0.279 | 0.295 |
g17 | 183.000 | 3282.000 | 0.055 | 0.033 | 0.011 | 0.016 | 0.011 | 0.039 | 0.045 | 0.236 | 0.224 | 0.000 | 0.061 | 0.076 |
g28 | 254.000 | 3211.000 | 0.370 | 0.448 | 0.474 | 0.343 | 0.278 | 0.365 | 0.176 | 0.448 | 0.472 | 0.000 | 0.346 | 0.400 |
g25 | 80.000 | 3385.000 | 0.000 | 0.000 | 0.014 | 0.000 | 0.014 | 0.014 | 0.013 | 0.122 | 0.061 | 0.000 | 0.051 | 0.052 |
g12 | 1678.000 | 1787.000 | 0.970 | 0.997 | 0.921 | 0.916 | 0.749 | 0.878 | 0.073 | 0.985 | 0.985 | 0.693 | 0.678 | 0.686 |
g14 | 146.000 | 3319.000 | 0.099 | 0.035 | 0.000 | 0.015 | 0.000 | 0.076 | 0.126 | 0.208 | 0.174 | 0.000 | 0.121 | 0.130 |
g15 | 1175.000 | 2290.000 | 0.933 | 0.968 | 0.657 | 0.810 | 0.490 | 0.773 | 0.232 | 0.869 | 0.875 | 0.336 | 0.584 | 0.603 |
g2 | 543.000 | 2922.000 | 0.529 | 0.558 | 0.435 | 0.425 | 0.356 | 0.508 | 0.077 | 0.563 | 0.564 | 0.003 | 0.322 | 0.361 |
g1 | 30.000 | 3435.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.067 | 0.033 | 0.000 | 0.000 | 0.000 |
g6 | 142.000 | 3323.000 | 0.016 | 0.038 | 0.021 | 0.000 | 0.030 | 0.030 | 0.108 | 0.084 | 0.106 | 0.000 | 0.058 | 0.121 |
g5 | 749.000 | 2716.000 | 0.775 | 0.838 | 0.497 | 0.510 | 0.419 | 0.583 | 0.064 | 0.602 | 0.613 | 0.000 | 0.397 | 0.431 |
g32 | 618.000 | 2847.000 | 0.690 | 0.732 | 0.234 | 0.461 | 0.167 | 0.527 | 0.110 | 0.571 | 0.582 | 0.000 | 0.348 | 0.380 |
g31 | 38.000 | 3427.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.472 | 0.112 | 0.136 | 0.000 | 0.000 | 0.024 |
Summary Negative Example Accuracy (Average Across All Folds)
Tag | Positive Examples | Negative Examples | BP1 | BP2 | CC1 | CC2 | CC3 | CC4 | GP | GT1 | GT2 | HCB | LWW1 | LWW2 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
g9 | 61.000 | 3404.000 | 0.997 | 0.998 | 1.000 | 1.000 | 1.000 | 0.999 | 0.989 | 0.966 | 0.966 | 1.000 | 0.983 | 0.982 |
g7 | 45.000 | 3420.000 | 1.000 | 0.999 | 1.000 | 1.000 | 1.000 | 1.000 | 0.983 | 0.975 | 0.975 | 1.000 | 0.987 | 0.987 |
g8 | 116.000 | 3349.000 | 0.989 | 0.991 | 1.000 | 0.999 | 0.999 | 0.989 | 0.977 | 0.934 | 0.933 | 1.000 | 0.968 | 0.967 |
g11 | 40.000 | 3425.000 | 1.000 | 0.999 | 1.000 | 1.000 | 1.000 | 1.000 | 0.981 | 0.978 | 0.978 | 1.000 | 0.988 | 0.988 |
g29 | 115.000 | 3350.000 | 0.978 | 0.974 | 0.971 | 0.992 | 0.980 | 0.978 | 0.984 | 0.945 | 0.944 | 1.000 | 0.976 | 0.977 |
g16 | 470.000 | 2995.000 | 0.756 | 0.779 | 0.920 | 0.913 | 0.934 | 0.807 | 0.982 | 0.736 | 0.740 | 1.000 | 0.886 | 0.889 |
g17 | 183.000 | 3282.000 | 0.968 | 0.985 | 0.997 | 0.993 | 0.994 | 0.957 | 0.963 | 0.901 | 0.901 | 1.000 | 0.947 | 0.948 |
g28 | 254.000 | 3211.000 | 0.959 | 0.939 | 0.925 | 0.959 | 0.944 | 0.926 | 0.963 | 0.878 | 0.879 | 1.000 | 0.949 | 0.953 |
g25 | 80.000 | 3385.000 | 0.991 | 0.996 | 0.999 | 1.000 | 0.997 | 0.996 | 0.985 | 0.957 | 0.955 | 1.000 | 0.977 | 0.977 |
g12 | 1678.000 | 1787.000 | 0.113 | 0.024 | 0.251 | 0.320 | 0.453 | 0.366 | 0.967 | 0.059 | 0.059 | 0.710 | 0.696 | 0.705 |
g14 | 146.000 | 3319.000 | 0.969 | 0.985 | 0.999 | 0.998 | 0.997 | 0.977 | 0.941 | 0.921 | 0.920 | 1.000 | 0.962 | 0.962 |
g15 | 1175.000 | 2290.000 | 0.321 | 0.191 | 0.572 | 0.524 | 0.704 | 0.522 | 0.925 | 0.418 | 0.421 | 0.906 | 0.785 | 0.796 |
g2 | 543.000 | 2922.000 | 0.708 | 0.652 | 0.810 | 0.839 | 0.854 | 0.767 | 0.969 | 0.734 | 0.733 | 1.000 | 0.875 | 0.882 |
g1 | 30.000 | 3435.000 | 1.000 | 0.999 | 1.000 | 1.000 | 0.999 | 1.000 | 0.985 | 0.984 | 0.983 | 1.000 | 0.991 | 0.991 |
g6 | 142.000 | 3323.000 | 0.991 | 0.994 | 0.990 | 0.998 | 0.986 | 0.973 | 0.967 | 0.919 | 0.919 | 1.000 | 0.960 | 0.962 |
g5 | 749.000 | 2716.000 | 0.439 | 0.347 | 0.761 | 0.753 | 0.799 | 0.682 | 0.980 | 0.612 | 0.615 | 1.000 | 0.833 | 0.842 |
g32 | 618.000 | 2847.000 | 0.558 | 0.593 | 0.895 | 0.799 | 0.901 | 0.716 | 0.946 | 0.688 | 0.690 | 1.000 | 0.857 | 0.865 |
g31 | 38.000 | 3427.000 | 0.996 | 0.996 | 0.999 | 1.000 | 0.998 | 0.999 | 0.634 | 0.980 | 0.980 | 1.000 | 0.989 | 0.989 |
Overall Summary Results (Affinity)
Measure | BP1 | BP2 | CC1 | CC2 | CC3 | CC4 | GT1 | GT2 | HCB | LWW1 | LWW2 |
---|---|---|---|---|---|---|---|---|---|---|---|
Average AUC-ROC Tag | 0.648 | 0.632 | 0.652 | 0.681 | 0.629 | 0.646 | 0.649 | 0.655 | 0.664 | 0.667 | 0.701 |
Average AUC-ROC Clip | 0.854 | 0.859 | 0.849 | 0.848 | 0.812 | 0.812 | 0.860 | 0.861 | 0.861 | 0.678 | 0.704 |
Precision at 3 | 0.383 | 0.389 | 0.392 | 0.392 | 0.368 | 0.368 | 0.381 | 0.384 | 0.385 | 0.215 | 0.238 |
Precision at 6 | 0.257 | 0.259 | 0.256 | 0.256 | 0.240 | 0.240 | 0.261 | 0.260 | 0.260 | 0.182 | 0.193 |
Precision at 9 | 0.189 | 0.190 | 0.186 | 0.186 | 0.178 | 0.178 | 0.192 | 0.192 | 0.192 | 0.153 | 0.158 |
Precision at 12 | 0.149 | 0.150 | 0.146 | 0.146 | 0.142 | 0.142 | 0.150 | 0.150 | 0.151 | 0.131 | 0.134 |
Precision at 15 | 0.122 | 0.123 | 0.121 | 0.121 | 0.119 | 0.119 | 0.123 | 0.123 | 0.123 | 0.116 | 0.117 |
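The "Precision at N" rows presumably rank each clip's tags by affinity score and measure the fraction of the top N that are correct, averaged over clips. A sketch under that assumption (the interpretation is mine, not documented evaluator behavior):

```python
import numpy as np

# Sketch of per-clip precision-at-N over an affinity matrix. Assumes the
# measure ranks tags per clip by affinity and averages top-N precision
# across clips; this interpretation is an assumption.
def precision_at_n(affinity: np.ndarray, truth: np.ndarray, n: int) -> float:
    """affinity: (num_clips, num_tags) scores; truth: same shape, boolean."""
    top_n = np.argsort(-affinity, axis=1)[:, :n]      # indices of top-N tags per clip
    hits = np.take_along_axis(truth, top_n, axis=1)   # which of those are correct
    return float(hits.mean())
```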
Summary AUC-ROC Tag (Average Across All Folds)
Tag | BP1 | BP2 | CC1 | CC2 | CC3 | CC4 | GT1 | GT2 | HCB | LWW1 | LWW2 |
---|---|---|---|---|---|---|---|---|---|---|---|
g9 | 0.508 | 0.527 | 0.574 | 0.627 | 0.564 | 0.576 | 0.474 | 0.499 | 0.521 | 0.534 | 0.558 |
g7 | 0.580 | 0.523 | 0.569 | 0.622 | 0.525 | 0.510 | 0.494 | 0.577 | 0.587 | 0.588 | 0.535 |
g8 | 0.577 | 0.578 | 0.562 | 0.609 | 0.558 | 0.624 | 0.601 | 0.574 | 0.589 | 0.608 | 0.627 |
g11 | 0.568 | 0.471 | 0.516 | 0.531 | 0.544 | 0.542 | 0.521 | 0.530 | 0.509 | 0.508 | 0.551 |
g29 | 0.775 | 0.844 | 0.833 | 0.850 | 0.780 | 0.783 | 0.854 | 0.857 | 0.837 | 0.874 | 0.888 |
g16 | 0.611 | 0.610 | 0.610 | 0.610 | 0.581 | 0.586 | 0.556 | 0.563 | 0.500 | 0.662 | 0.679 |
g17 | 0.609 | 0.589 | 0.599 | 0.625 | 0.516 | 0.532 | 0.676 | 0.662 | 0.663 | 0.611 | 0.680 |
g28 | 0.734 | 0.785 | 0.805 | 0.809 | 0.741 | 0.742 | 0.776 | 0.783 | 0.773 | 0.809 | 0.823 |
g25 | 0.668 | 0.615 | 0.694 | 0.747 | 0.658 | 0.677 | 0.685 | 0.682 | 0.663 | 0.709 | 0.717 |
g12 | 0.742 | 0.728 | 0.642 | 0.729 | 0.639 | 0.690 | 0.714 | 0.710 | 0.754 | 0.752 | 0.766 |
g14 | 0.638 | 0.618 | 0.667 | 0.716 | 0.606 | 0.661 | 0.701 | 0.697 | 0.716 | 0.706 | 0.731 |
g15 | 0.761 | 0.764 | 0.655 | 0.734 | 0.649 | 0.710 | 0.729 | 0.718 | 0.738 | 0.753 | 0.772 |
g2 | 0.681 | 0.670 | 0.721 | 0.731 | 0.705 | 0.700 | 0.724 | 0.722 | 0.720 | 0.724 | 0.747 |
g1 | 0.599 | 0.435 | 0.595 | 0.563 | 0.660 | 0.619 | 0.611 | 0.631 | 0.666 | 0.518 | 0.673 |
g6 | 0.572 | 0.573 | 0.629 | 0.636 | 0.610 | 0.616 | 0.589 | 0.585 | 0.606 | 0.608 | 0.655 |
g5 | 0.666 | 0.651 | 0.697 | 0.699 | 0.674 | 0.677 | 0.645 | 0.654 | 0.684 | 0.707 | 0.735 |
g32 | 0.681 | 0.718 | 0.667 | 0.703 | 0.622 | 0.668 | 0.694 | 0.697 | 0.697 | 0.715 | 0.743 |
g31 | 0.702 | 0.675 | 0.698 | 0.721 | 0.685 | 0.715 | 0.638 | 0.646 | 0.738 | 0.618 | 0.741 |
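The two AUC-ROC views in the affinity results can be read as follows (my interpretation): "tag" AUC-ROC scores each tag's affinity column against all clips, while "clip" AUC-ROC scores each clip's affinity row against all tags. A sketch using scikit-learn:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Sketch of the two AUC-ROC views (assumed interpretation, not evaluator
# code): per-tag AUC over clips, and per-clip AUC over tags, skipping rows
# or columns whose ground truth is all-positive or all-negative.
def tag_and_clip_auc(affinity: np.ndarray, truth: np.ndarray):
    """affinity, truth: (num_clips, num_tags); truth is boolean."""
    tag_auc = [roc_auc_score(truth[:, t], affinity[:, t])
               for t in range(truth.shape[1])
               if 0 < truth[:, t].sum() < truth.shape[0]]
    clip_auc = [roc_auc_score(truth[c, :], affinity[c, :])
                for c in range(truth.shape[0])
                if 0 < truth[c, :].sum() < truth.shape[1]]
    return float(np.mean(tag_auc)), float(np.mean(clip_auc))
```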
Select Friedman's Test Results
Tag F-measure (Binary) Friedman Test
The following table and plot show the results of Friedman's ANOVA with Tukey-Kramer multiple comparisons computed over the F-measure for each tag in the test, averaged over all folds.
Team 1 | Team 2 | Lower bound | Mean rank difference | Upper bound | Significant |
---|---|---|---|---|---|
LWW2 | GT1 | -2.626 | 1.028 | 4.682 | FALSE |
LWW2 | GT2 | -2.904 | 0.750 | 4.404 | FALSE |
LWW2 | LWW1 | -1.849 | 1.806 | 5.460 | FALSE |
LWW2 | BP1 | -0.460 | 3.194 | 6.849 | FALSE |
LWW2 | BP2 | -0.404 | 3.250 | 6.904 | FALSE |
LWW2 | CC4 | -0.487 | 3.167 | 6.821 | FALSE |
LWW2 | CC2 | -0.154 | 3.500 | 7.154 | FALSE |
LWW2 | CC1 | 0.512 | 4.167 | 7.821 | TRUE |
LWW2 | CC3 | 2.068 | 5.722 | 9.376 | TRUE |
LWW2 | GP | 0.762 | 4.417 | 8.071 | TRUE |
LWW2 | HCB | 3.013 | 6.667 | 10.321 | TRUE |
GT1 | GT2 | -3.932 | -0.278 | 3.376 | FALSE |
GT1 | LWW1 | -2.876 | 0.778 | 4.432 | FALSE |
GT1 | BP1 | -1.488 | 2.167 | 5.821 | FALSE |
GT1 | BP2 | -1.432 | 2.222 | 5.876 | FALSE |
GT1 | CC4 | -1.515 | 2.139 | 5.793 | FALSE |
GT1 | CC2 | -1.182 | 2.472 | 6.126 | FALSE |
GT1 | CC1 | -0.515 | 3.139 | 6.793 | FALSE |
GT1 | CC3 | 1.040 | 4.694 | 8.349 | TRUE |
GT1 | GP | -0.265 | 3.389 | 7.043 | FALSE |
GT1 | HCB | 1.985 | 5.639 | 9.293 | TRUE |
GT2 | LWW1 | -2.599 | 1.056 | 4.710 | FALSE |
GT2 | BP1 | -1.210 | 2.444 | 6.099 | FALSE |
GT2 | BP2 | -1.154 | 2.500 | 6.154 | FALSE |
GT2 | CC4 | -1.238 | 2.417 | 6.071 | FALSE |
GT2 | CC2 | -0.904 | 2.750 | 6.404 | FALSE |
GT2 | CC1 | -0.237 | 3.417 | 7.071 | FALSE |
GT2 | CC3 | 1.318 | 4.972 | 8.626 | TRUE |
GT2 | GP | 0.013 | 3.667 | 7.321 | TRUE |
GT2 | HCB | 2.263 | 5.917 | 9.571 | TRUE |
LWW1 | BP1 | -2.265 | 1.389 | 5.043 | FALSE |
LWW1 | BP2 | -2.210 | 1.444 | 5.099 | FALSE |
LWW1 | CC4 | -2.293 | 1.361 | 5.015 | FALSE |
LWW1 | CC2 | -1.960 | 1.694 | 5.349 | FALSE |
LWW1 | CC1 | -1.293 | 2.361 | 6.015 | FALSE |
LWW1 | CC3 | 0.263 | 3.917 | 7.571 | TRUE |
LWW1 | GP | -1.043 | 2.611 | 6.265 | FALSE |
LWW1 | HCB | 1.207 | 4.861 | 8.515 | TRUE |
BP1 | BP2 | -3.599 | 0.056 | 3.710 | FALSE |
BP1 | CC4 | -3.682 | -0.028 | 3.626 | FALSE |
BP1 | CC2 | -3.349 | 0.306 | 3.960 | FALSE |
BP1 | CC1 | -2.682 | 0.972 | 4.626 | FALSE |
BP1 | CC3 | -1.126 | 2.528 | 6.182 | FALSE |
BP1 | GP | -2.432 | 1.222 | 4.876 | FALSE |
BP1 | HCB | -0.182 | 3.472 | 7.126 | FALSE |
BP2 | CC4 | -3.737 | -0.083 | 3.571 | FALSE |
BP2 | CC2 | -3.404 | 0.250 | 3.904 | FALSE |
BP2 | CC1 | -2.737 | 0.917 | 4.571 | FALSE |
BP2 | CC3 | -1.182 | 2.472 | 6.126 | FALSE |
BP2 | GP | -2.487 | 1.167 | 4.821 | FALSE |
BP2 | HCB | -0.237 | 3.417 | 7.071 | FALSE |
CC4 | CC2 | -3.321 | 0.333 | 3.987 | FALSE |
CC4 | CC1 | -2.654 | 1.000 | 4.654 | FALSE |
CC4 | CC3 | -1.099 | 2.556 | 6.210 | FALSE |
CC4 | GP | -2.404 | 1.250 | 4.904 | FALSE |
CC4 | HCB | -0.154 | 3.500 | 7.154 | FALSE |
CC2 | CC1 | -2.987 | 0.667 | 4.321 | FALSE |
CC2 | CC3 | -1.432 | 2.222 | 5.876 | FALSE |
CC2 | GP | -2.737 | 0.917 | 4.571 | FALSE |
CC2 | HCB | -0.487 | 3.167 | 6.821 | FALSE |
CC1 | CC3 | -2.099 | 1.556 | 5.210 | FALSE |
CC1 | GP | -3.404 | 0.250 | 3.904 | FALSE |
CC1 | HCB | -1.154 | 2.500 | 6.154 | FALSE |
CC3 | GP | -4.960 | -1.306 | 2.349 | FALSE |
CC3 | HCB | -2.710 | 0.944 | 4.599 | FALSE |
GP | HCB | -1.404 | 2.250 | 5.904 | FALSE |
https://music-ir.org/mirex/results/2009/tag/Mood/small.binary_FMeasure.friedman.tukeyKramerHSD.png
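As a rough sketch of the testing procedure described above (my reconstruction with SciPy and statsmodels; not necessarily the tooling MIREX actually ran): scores are ranked within each tag across systems, Friedman's test checks for any overall difference, and a Tukey HSD on the ranks yields the pairwise comparisons tabulated above.

```python
import numpy as np
from scipy.stats import friedmanchisquare, rankdata
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Rough reconstruction of the significance-testing procedure; an
# illustration, not the tooling MIREX actually ran. `scores` has shape
# (num_tags, num_systems) and holds per-tag F-measures averaged over folds.
def friedman_tukey(scores: np.ndarray, system_ids: list):
    # Friedman's ANOVA: does any system differ overall?
    stat, p = friedmanchisquare(*(scores[:, j] for j in range(scores.shape[1])))
    # Rank systems within each tag (rank 1 = best), then run Tukey-Kramer
    # HSD on the ranks to get the pairwise comparison table.
    ranks = np.apply_along_axis(rankdata, 1, -scores)
    tukey = pairwise_tukeyhsd(ranks.ravel(),
                              np.tile(np.asarray(system_ids), scores.shape[0]))
    return stat, p, tukey
```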
Per Track F-measure (Binary) Friedman Test
The following table and plot show the results of Friedman's ANOVA with Tukey-Kramer multiple comparisons computed over the F-measure for each track in the test, averaged over all folds.
Team 1 | Team 2 | Lower bound | Mean rank difference | Upper bound | Significant |
---|---|---|---|---|---|
CC2 | CC1 | -0.174 | 0.079 | 0.331 | FALSE |
CC2 | BP1 | 0.086 | 0.338 | 0.590 | TRUE |
CC2 | BP2 | 0.167 | 0.420 | 0.672 | TRUE |
CC2 | CC4 | 0.264 | 0.516 | 0.768 | TRUE |
CC2 | GT2 | 0.360 | 0.613 | 0.865 | TRUE |
CC2 | GT1 | 0.463 | 0.716 | 0.968 | TRUE |
CC2 | CC3 | 0.651 | 0.903 | 1.156 | TRUE |
CC2 | LWW2 | 0.678 | 0.931 | 1.183 | TRUE |
CC2 | LWW1 | 0.896 | 1.149 | 1.401 | TRUE |
CC2 | HCB | 2.081 | 2.334 | 2.586 | TRUE |
CC2 | GP | 3.503 | 3.756 | 4.008 | TRUE |
CC1 | BP1 | 0.007 | 0.259 | 0.512 | TRUE |
CC1 | BP2 | 0.089 | 0.341 | 0.593 | TRUE |
CC1 | CC4 | 0.185 | 0.438 | 0.690 | TRUE |
CC1 | GT2 | 0.282 | 0.534 | 0.786 | TRUE |
CC1 | GT1 | 0.385 | 0.637 | 0.889 | TRUE |
CC1 | CC3 | 0.573 | 0.825 | 1.077 | TRUE |
CC1 | LWW2 | 0.600 | 0.852 | 1.105 | TRUE |
CC1 | LWW1 | 0.818 | 1.070 | 1.322 | TRUE |
CC1 | HCB | 2.003 | 2.255 | 2.507 | TRUE |
CC1 | GP | 3.425 | 3.677 | 3.929 | TRUE |
BP1 | BP2 | -0.170 | 0.082 | 0.334 | FALSE |
BP1 | CC4 | -0.074 | 0.178 | 0.431 | FALSE |
BP1 | GT2 | 0.023 | 0.275 | 0.527 | TRUE |
BP1 | GT1 | 0.126 | 0.378 | 0.630 | TRUE |
BP1 | CC3 | 0.313 | 0.566 | 0.818 | TRUE |
BP1 | LWW2 | 0.341 | 0.593 | 0.845 | TRUE |
BP1 | LWW1 | 0.558 | 0.811 | 1.063 | TRUE |
BP1 | HCB | 1.744 | 1.996 | 2.248 | TRUE |
BP1 | GP | 3.166 | 3.418 | 3.670 | TRUE |
BP2 | CC4 | -0.156 | 0.097 | 0.349 | FALSE |
BP2 | GT2 | -0.059 | 0.193 | 0.445 | FALSE |
BP2 | GT1 | 0.044 | 0.296 | 0.548 | TRUE |
BP2 | CC3 | 0.232 | 0.484 | 0.736 | TRUE |
BP2 | LWW2 | 0.259 | 0.511 | 0.763 | TRUE |
BP2 | LWW1 | 0.477 | 0.729 | 0.981 | TRUE |
BP2 | HCB | 1.662 | 1.914 | 2.166 | TRUE |
BP2 | GP | 3.084 | 3.336 | 3.588 | TRUE |
CC4 | GT2 | -0.156 | 0.097 | 0.349 | FALSE |
CC4 | GT1 | -0.053 | 0.199 | 0.452 | FALSE |
CC4 | CC3 | 0.135 | 0.387 | 0.640 | TRUE |
CC4 | LWW2 | 0.162 | 0.415 | 0.667 | TRUE |
CC4 | LWW1 | 0.380 | 0.632 | 0.885 | TRUE |
CC4 | HCB | 1.565 | 1.817 | 2.070 | TRUE |
CC4 | GP | 2.987 | 3.240 | 3.492 | TRUE |
GT2 | GT1 | -0.149 | 0.103 | 0.355 | FALSE |
GT2 | CC3 | 0.038 | 0.291 | 0.543 | TRUE |
GT2 | LWW2 | 0.066 | 0.318 | 0.570 | TRUE |
GT2 | LWW1 | 0.283 | 0.536 | 0.788 | TRUE |
GT2 | HCB | 1.469 | 1.721 | 1.973 | TRUE |
GT2 | GP | 2.891 | 3.143 | 3.395 | TRUE |
GT1 | CC3 | -0.064 | 0.188 | 0.440 | FALSE |
GT1 | LWW2 | -0.037 | 0.215 | 0.467 | FALSE |
GT1 | LWW1 | 0.181 | 0.433 | 0.685 | TRUE |
GT1 | HCB | 1.366 | 1.618 | 1.870 | TRUE |
GT1 | GP | 2.788 | 3.040 | 3.292 | TRUE |
CC3 | LWW2 | -0.225 | 0.027 | 0.280 | FALSE |
CC3 | LWW1 | -0.007 | 0.245 | 0.497 | FALSE |
CC3 | HCB | 1.178 | 1.430 | 1.682 | TRUE |
CC3 | GP | 2.600 | 2.852 | 3.104 | TRUE |
LWW2 | LWW1 | -0.035 | 0.218 | 0.470 | FALSE |
LWW2 | HCB | 1.151 | 1.403 | 1.655 | TRUE |
LWW2 | GP | 2.573 | 2.825 | 3.077 | TRUE |
LWW1 | HCB | 0.933 | 1.185 | 1.437 | TRUE |
LWW1 | GP | 2.355 | 2.607 | 2.860 | TRUE |
HCB | GP | 1.170 | 1.422 | 1.674 | TRUE |
Tag AUC-ROC (Affinity) Friedman Test
The following table and plot show the results of Friedman's ANOVA with Tukey-Kramer multiple comparisons computed over the Area Under the ROC curve (AUC-ROC) for each tag in the test, averaged over all folds.
Team 1 | Team 2 | Lower bound | Mean rank difference | Upper bound | Significant |
---|---|---|---|---|---|
LWW2 | CC2 | -1.669 | 1.889 | 5.447 | FALSE |
LWW2 | LWW1 | -0.614 | 2.944 | 6.503 | FALSE |
LWW2 | HCB | 0.497 | 4.056 | 7.614 | TRUE |
LWW2 | GT2 | 1.497 | 5.056 | 8.614 | TRUE |
LWW2 | CC1 | 1.497 | 5.056 | 8.614 | TRUE |
LWW2 | GT1 | 1.831 | 5.389 | 8.947 | TRUE |
LWW2 | BP1 | 1.831 | 5.389 | 8.947 | TRUE |
LWW2 | CC4 | 1.608 | 5.167 | 8.725 | TRUE |
LWW2 | BP2 | 2.553 | 6.111 | 9.669 | TRUE |
LWW2 | CC3 | 3.053 | 6.611 | 10.169 | TRUE |
CC2 | LWW1 | -2.503 | 1.056 | 4.614 | FALSE |
CC2 | HCB | -1.392 | 2.167 | 5.725 | FALSE |
CC2 | GT2 | -0.392 | 3.167 | 6.725 | FALSE |
CC2 | CC1 | -0.392 | 3.167 | 6.725 | FALSE |
CC2 | GT1 | -0.058 | 3.500 | 7.058 | FALSE |
CC2 | BP1 | -0.058 | 3.500 | 7.058 | FALSE |
CC2 | CC4 | -0.281 | 3.278 | 6.836 | FALSE |
CC2 | BP2 | 0.664 | 4.222 | 7.781 | TRUE |
CC2 | CC3 | 1.164 | 4.722 | 8.281 | TRUE |
LWW1 | HCB | -2.447 | 1.111 | 4.670 | FALSE |
LWW1 | GT2 | -1.447 | 2.111 | 5.670 | FALSE |
LWW1 | CC1 | -1.447 | 2.111 | 5.670 | FALSE |
LWW1 | GT1 | -1.114 | 2.444 | 6.003 | FALSE |
LWW1 | BP1 | -1.114 | 2.444 | 6.003 | FALSE |
LWW1 | CC4 | -1.336 | 2.222 | 5.781 | FALSE |
LWW1 | BP2 | -0.392 | 3.167 | 6.725 | FALSE |
LWW1 | CC3 | 0.108 | 3.667 | 7.225 | TRUE |
HCB | GT2 | -2.558 | 1.000 | 4.558 | FALSE |
HCB | CC1 | -2.558 | 1.000 | 4.558 | FALSE |
HCB | GT1 | -2.225 | 1.333 | 4.892 | FALSE |
HCB | BP1 | -2.225 | 1.333 | 4.892 | FALSE |
HCB | CC4 | -2.447 | 1.111 | 4.670 | FALSE |
HCB | BP2 | -1.503 | 2.056 | 5.614 | FALSE |
HCB | CC3 | -1.003 | 2.556 | 6.114 | FALSE |
GT2 | CC1 | -3.558 | 0.000 | 3.558 | FALSE |
GT2 | GT1 | -3.225 | 0.333 | 3.892 | FALSE |
GT2 | BP1 | -3.225 | 0.333 | 3.892 | FALSE |
GT2 | CC4 | -3.447 | 0.111 | 3.670 | FALSE |
GT2 | BP2 | -2.503 | 1.056 | 4.614 | FALSE |
GT2 | CC3 | -2.003 | 1.556 | 5.114 | FALSE |
CC1 | GT1 | -3.225 | 0.333 | 3.892 | FALSE |
CC1 | BP1 | -3.225 | 0.333 | 3.892 | FALSE |
CC1 | CC4 | -3.447 | 0.111 | 3.670 | FALSE |
CC1 | BP2 | -2.503 | 1.056 | 4.614 | FALSE |
CC1 | CC3 | -2.003 | 1.556 | 5.114 | FALSE |
GT1 | BP1 | -3.558 | 0.000 | 3.558 | FALSE |
GT1 | CC4 | -3.781 | -0.222 | 3.336 | FALSE |
GT1 | BP2 | -2.836 | 0.722 | 4.281 | FALSE |
GT1 | CC3 | -2.336 | 1.222 | 4.781 | FALSE |
BP1 | CC4 | -3.781 | -0.222 | 3.336 | FALSE |
BP1 | BP2 | -2.836 | 0.722 | 4.281 | FALSE |
BP1 | CC3 | -2.336 | 1.222 | 4.781 | FALSE |
CC4 | BP2 | -2.614 | 0.944 | 4.503 | FALSE |
CC4 | CC3 | -2.114 | 1.444 | 5.003 | FALSE |
BP2 | CC3 | -3.058 | 0.500 | 4.058 | FALSE |
Per Track AUC-ROC (Affinity) Friedman Test
The following table and plot show the results of Friedman's ANOVA with Tukey-Kramer multiple comparisons computed over the Area Under the ROC curve (AUC-ROC) for each track/clip in the test, averaged over all folds.
Team 1 | Team 2 | Lower bound | Mean rank difference | Upper bound | Significant |
---|---|---|---|---|---|
GT2 | HCB | -0.198 | 0.038 | 0.275 | FALSE |
GT2 | GT1 | -0.124 | 0.112 | 0.349 | FALSE |
GT2 | BP2 | -0.059 | 0.177 | 0.414 | FALSE |
GT2 | BP1 | -0.012 | 0.224 | 0.461 | FALSE |
GT2 | CC1 | -0.237 | -0.001 | 0.236 | FALSE |
GT2 | CC2 | -0.235 | 0.001 | 0.237 | FALSE |
GT2 | CC4 | 0.607 | 0.844 | 1.080 | TRUE |
GT2 | CC3 | 0.611 | 0.847 | 1.083 | TRUE |
GT2 | LWW2 | 2.307 | 2.544 | 2.780 | TRUE |
GT2 | LWW1 | 2.713 | 2.950 | 3.186 | TRUE |
HCB | GT1 | -0.163 | 0.074 | 0.310 | FALSE |
HCB | BP2 | -0.097 | 0.139 | 0.375 | FALSE |
HCB | BP1 | -0.051 | 0.186 | 0.422 | FALSE |
HCB | CC1 | -0.276 | -0.039 | 0.197 | FALSE |
HCB | CC2 | -0.274 | -0.037 | 0.199 | FALSE |
HCB | CC4 | 0.569 | 0.805 | 1.041 | TRUE |
HCB | CC3 | 0.572 | 0.808 | 1.044 | TRUE |
HCB | LWW2 | 2.269 | 2.505 | 2.741 | TRUE |
HCB | LWW1 | 2.675 | 2.911 | 3.147 | TRUE |
GT1 | BP2 | -0.171 | 0.065 | 0.301 | FALSE |
GT1 | BP1 | -0.124 | 0.112 | 0.348 | FALSE |
GT1 | CC1 | -0.349 | -0.113 | 0.123 | FALSE |
GT1 | CC2 | -0.347 | -0.111 | 0.125 | FALSE |
GT1 | CC4 | 0.495 | 0.731 | 0.968 | TRUE |
GT1 | CC3 | 0.498 | 0.735 | 0.971 | TRUE |
GT1 | LWW2 | 2.195 | 2.432 | 2.668 | TRUE |
GT1 | LWW1 | 2.601 | 2.837 | 3.074 | TRUE |
BP2 | BP1 | -0.190 | 0.047 | 0.283 | FALSE |
BP2 | CC1 | -0.414 | -0.178 | 0.058 | FALSE |
BP2 | CC2 | -0.413 | -0.176 | 0.060 | FALSE |
BP2 | CC4 | 0.430 | 0.666 | 0.902 | TRUE |
BP2 | CC3 | 0.433 | 0.669 | 0.906 | TRUE |
BP2 | LWW2 | 2.130 | 2.366 | 2.603 | TRUE |
BP2 | LWW1 | 2.536 | 2.772 | 3.008 | TRUE |
BP1 | CC1 | -0.461 | -0.225 | 0.011 | FALSE |
BP1 | CC2 | -0.460 | -0.223 | 0.013 | FALSE |
BP1 | CC4 | 0.383 | 0.619 | 0.856 | TRUE |
BP1 | CC3 | 0.386 | 0.623 | 0.859 | TRUE |
BP1 | LWW2 | 2.083 | 2.320 | 2.556 | TRUE |
BP1 | LWW1 | 2.489 | 2.725 | 2.962 | TRUE |
CC1 | CC2 | -0.235 | 0.002 | 0.238 | FALSE |
CC1 | CC4 | 0.608 | 0.844 | 1.081 | TRUE |
CC1 | CC3 | 0.611 | 0.848 | 1.084 | TRUE |
CC1 | LWW2 | 2.308 | 2.544 | 2.781 | TRUE |
CC1 | LWW1 | 2.714 | 2.950 | 3.187 | TRUE |
CC2 | CC4 | 0.606 | 0.843 | 1.079 | TRUE |
CC2 | CC3 | 0.610 | 0.846 | 1.082 | TRUE |
CC2 | LWW2 | 2.306 | 2.543 | 2.779 | TRUE |
CC2 | LWW1 | 2.712 | 2.949 | 3.185 | TRUE |
CC4 | CC3 | -0.233 | 0.003 | 0.239 | FALSE |
CC4 | LWW2 | 1.464 | 1.700 | 1.936 | TRUE |
CC4 | LWW1 | 1.870 | 2.106 | 2.342 | TRUE |
CC3 | LWW2 | 1.461 | 1.697 | 1.933 | TRUE |
CC3 | LWW1 | 1.867 | 2.103 | 2.339 | TRUE |
LWW2 | LWW1 | 0.170 | 0.406 | 0.642 | TRUE |
Assorted Results Files for Download
General Results
[affinity_tag_fold_AUC_ROC.csv](https://music-ir.org/mirex/results/2009/tag/Mood/affinity_tag_fold_AUC_ROC.csv)
[affinity_clip_AUC_ROC.csv](https://music-ir.org/mirex/results/2009/tag/Mood/affinity_clip_AUC_ROC.csv)
[binary_per_fold_Accuracy.csv](https://music-ir.org/mirex/results/2009/tag/Mood/binary_per_fold_Accuracy.csv)
[binary_per_fold_Fmeasure.csv](https://music-ir.org/mirex/results/2009/tag/Mood/binary_per_fold_Fmeasure.csv)
[binary_per_fold_negative_example_Accuracy.csv](https://music-ir.org/mirex/results/2009/tag/Mood/binary_per_fold_negative_example_Accuracy.csv)
[binary_per_fold_per_track_Accuracy.csv](https://music-ir.org/mirex/results/2009/tag/Mood/binary_per_fold_per_track_Accuracy.csv)
[binary_per_fold_per_track_Fmeasure.csv](https://music-ir.org/mirex/results/2009/tag/Mood/binary_per_fold_per_track_Fmeasure.csv)
[binary_per_fold_per_track_negative_example_Accuracy.csv](https://music-ir.org/mirex/results/2009/tag/Mood/binary_per_fold_per_track_negative_example_Accuracy.csv)
[binary_per_fold_per_track_positive_example_Accuracy.csv](https://music-ir.org/mirex/results/2009/tag/Mood/binary_per_fold_per_track_positive_example_Accuracy.csv)
[binary_per_fold_positive_example_Accuracy.csv](https://music-ir.org/mirex/results/2009/tag/Mood/binary_per_fold_positive_example_Accuracy.csv)
Friedman's Test Results
[affinity.PrecisionAt3.friedman.tukeyKramerHSD.csv](https://music-ir.org/mirex/results/2009/tag/Mood/affinity.PrecisionAt3.friedman.tukeyKramerHSD.csv)
[affinity.PrecisionAt3.friedman.tukeyKramerHSD.png](https://music-ir.org/mirex/results/2009/tag/Mood/affinity.PrecisionAt3.friedman.tukeyKramerHSD.png)
[affinity.PrecisionAt6.friedman.tukeyKramerHSD.csv](https://music-ir.org/mirex/results/2009/tag/Mood/affinity.PrecisionAt6.friedman.tukeyKramerHSD.csv)
[affinity.PrecisionAt6.friedman.tukeyKramerHSD.png](https://music-ir.org/mirex/results/2009/tag/Mood/affinity.PrecisionAt6.friedman.tukeyKramerHSD.png)
[affinity.PrecisionAt9.friedman.tukeyKramerHSD.csv](https://music-ir.org/mirex/results/2009/tag/Mood/affinity.PrecisionAt9.friedman.tukeyKramerHSD.csv)
[affinity.PrecisionAt9.friedman.tukeyKramerHSD.png](https://music-ir.org/mirex/results/2009/tag/Mood/affinity.PrecisionAt9.friedman.tukeyKramerHSD.png)
[affinity.PrecisionAt12.friedman.tukeyKramerHSD.csv](https://music-ir.org/mirex/results/2009/tag/Mood/affinity.PrecisionAt12.friedman.tukeyKramerHSD.csv)
[affinity.PrecisionAt12.friedman.tukeyKramerHSD.png](https://music-ir.org/mirex/results/2009/tag/Mood/affinity.PrecisionAt12.friedman.tukeyKramerHSD.png)
[affinity.PrecisionAt15.friedman.tukeyKramerHSD.csv](https://music-ir.org/mirex/results/2009/tag/Mood/affinity.PrecisionAt15.friedman.tukeyKramerHSD.csv)
[affinity.PrecisionAt15.friedman.tukeyKramerHSD.png](https://music-ir.org/mirex/results/2009/tag/Mood/affinity.PrecisionAt15.friedman.tukeyKramerHSD.png)
[binary_Accuracy.friedman.tukeyKramerHSD.csv](https://music-ir.org/mirex/results/2009/tag/Mood/binary_Accuracy.friedman.tukeyKramerHSD.csv)
[binary_Accuracy.friedman.tukeyKramerHSD.png](https://music-ir.org/mirex/results/2009/tag/Mood/binary_Accuracy.friedman.tukeyKramerHSD.png)
Results By Algorithm
(.tgz format)
BP1 = [Juan José Burred, Geoffroy Peeters](https://www.music-ir.org/mirex/results/2009/tag/Mood/BP1.tgz)
BP2 = [Juan José Burred, Geoffroy Peeters](https://www.music-ir.org/mirex/results/2009/tag/Mood/BP2.tgz)
CC1 = [Chuan Cao, Ming Li](https://www.music-ir.org/mirex/results/2009/tag/Mood/CC1.tgz)
CC2 = [Chuan Cao, Ming Li](https://www.music-ir.org/mirex/results/2009/tag/Mood/CC2.tgz)
CC3 = [Chuan Cao, Ming Li](https://www.music-ir.org/mirex/results/2009/tag/Mood/CC3.tgz)
CC4 = [Chuan Cao, Ming Li](https://www.music-ir.org/mirex/results/2009/tag/Mood/CC4.tgz)
GP = [Geoffroy Peeters](https://www.music-ir.org/mirex/results/2009/tag/Mood/GP.tgz)
GT1 = [George Tzanetakis](https://www.music-ir.org/mirex/results/2009/tag/Mood/GT1.tgz)
GT2 = [George Tzanetakis](https://www.music-ir.org/mirex/results/2009/tag/Mood/GT2.tgz)
LWW1 = [Hung-Yi Lo, Ju-Chiang Wang, Hsin-Min Wang](https://www.music-ir.org/mirex/results/2009/tag/Mood/LWW1.tgz)
LWW2 = [Hung-Yi Lo, Ju-Chiang Wang, Hsin-Min Wang](https://www.music-ir.org/mirex/results/2009/tag/Mood/LWW2.tgz)
HCB = [Matthew D. Hoffman, David M. Blei, Perry R. Cook](https://www.music-ir.org/mirex/results/2009/tag/Mood/HCB.tgz)