2019:Music Detection Results


Introduction

These are the results for the 2019 running of the Music and/or Speech Detection tasks. For background information about this task set, please refer to the 2019:Music and/or Speech Detection page.

General Legend

Sub code  Abstract  Contributors
MMG1      PDF       Blai Meléndez-Catalán, Emilio Molina, Emilia Gómez
MMG2      PDF       Blai Meléndez-Catalán, Emilio Molina, Emilia Gómez
MMG3      PDF       Blai Meléndez-Catalán, Emilio Molina, Emilia Gómez

Statistics notation

Accuracy = segment-level accuracy

<class>_P = segment-level precision for the class <class>

<class>_R = segment-level recall for the class <class>

<class>_F = segment-level F-measure for the class <class>

<class>_F_500_on = onset-only event-level F-measure (500 ms tolerance) for the class <class>

<class>_F_500_onoff = onset-offset event-level F-measure (500 ms tolerance) for the class <class>

<class>_F_1000_on = onset-only event-level F-measure (1000 ms tolerance) for the class <class>

<class>_F_1000_onoff = onset-offset event-level F-measure (1000 ms tolerance) for the class <class>
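
The sketch below illustrates, in Python, how statistics of this kind can be computed. It is a minimal illustration only, assuming that annotations are available as per-segment class labels (segment level) and as (onset, offset, class) event lists in seconds (event level); the function names, the greedy event matching and the tolerance handling are assumptions of this sketch, not the official MIREX evaluation code.

 # Minimal sketch of the statistics above (not the official MIREX evaluation code).
 # Assumes reference/estimated segment labels and (onset, offset, class) event lists.
 
 def accuracy(reference, estimated):
     """Segment-level accuracy: fraction of segments labelled correctly."""
     correct = sum(1 for r, e in zip(reference, estimated) if r == e)
     return correct / len(reference) if reference else 0.0
 
 def segment_metrics(reference, estimated, target_class):
     """Segment-level precision, recall and F-measure for one class."""
     tp = sum(1 for r, e in zip(reference, estimated)
              if r == target_class and e == target_class)
     fp = sum(1 for r, e in zip(reference, estimated)
              if r != target_class and e == target_class)
     fn = sum(1 for r, e in zip(reference, estimated)
              if r == target_class and e != target_class)
     precision = tp / (tp + fp) if tp + fp else 0.0
     recall = tp / (tp + fn) if tp + fn else 0.0
     f = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
     return precision, recall, f
 
 def event_f_measure(ref_events, est_events, target_class,
                     tolerance=0.5, match_offset=False):
     """Event-level F-measure for one class. An estimated event matches an
     unmatched reference event of the same class if their onsets differ by
     at most `tolerance` seconds (onset-only variant) and, when
     `match_offset` is True, their offsets as well (onset-offset variant)."""
     ref = [ev for ev in ref_events if ev[2] == target_class]
     est = [ev for ev in est_events if ev[2] == target_class]
     matched, tp = set(), 0
     for e_on, e_off, _ in est:
         for i, (r_on, r_off, _) in enumerate(ref):
             if i in matched:
                 continue
             if abs(e_on - r_on) <= tolerance and \
                (not match_offset or abs(e_off - r_off) <= tolerance):
                 matched.add(i)
                 tp += 1
                 break
     fp, fn = len(est) - tp, len(ref) - tp
     p = tp / (tp + fp) if tp + fp else 0.0
     r = tp / (tp + fn) if tp + fn else 0.0
     return 2 * p * r / (p + r) if p + r else 0.0

Under these assumptions, Music_F_500_on for a submission would correspond to event_f_measure(ref_events, est_events, "Music", tolerance=0.5, match_offset=False), while Music_F_1000_onoff would use tolerance=1.0 and match_offset=True.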

Datasets description

Details of the datasets used for these tasks are given in the Dataset description.

Task 1: Music Detection

Segment-level Evaluation

Sub code  Accuracy  Music_P  Music_R  Music_F  No-Music_P  No-Music_R  No-Music_F
MMG1      0.9049    0.9131   0.8865   0.8996   0.8978      0.9219      0.9097
MMG2      0.9049    0.9131   0.8865   0.8996   0.8978      0.9219      0.9097
MMG3      0.8506    0.967    0.7134   0.8211   0.7866      0.9775      0.8717

Event-level Evaluation

Sub code  Music_F_500_on  Music_F_500_onoff  Music_F_1000_on  Music_F_1000_onoff
MMG1      0.5177          0.2693             0.5813           0.3502
MMG2      0.5177          0.2693             0.5813           0.3502
MMG3      0.4403          0.1991             0.4973           0.2788


Task 2: Relative Music Loudness Estimation

Segment-level Evaluation

Sub code  Accuracy  Fg-Music_P  Fg-Music_R  Fg-Music_F  Bg-Music_P  Bg-Music_R  Bg-Music_F  No-Music_P  No-Music_R  No-Music_F
MMG1      0.8615    0.8025      0.774       0.788       0.8211      0.821       0.821       0.9026      0.9103      0.9064
MMG2      0.8615    0.8025      0.774       0.788       0.8211      0.821       0.821       0.9026      0.9103      0.9064
MMG3      0.8615    0.8025      0.774       0.788       0.8211      0.821       0.821       0.9026      0.9103      0.9064

Event-level Evaluation

Sub code  Fg-Music_F_500_on  Fg-Music_F_500_onoff  Fg-Music_F_1000_on  Fg-Music_F_1000_onoff  Bg-Music_F_500_on  Bg-Music_F_500_onoff  Bg-Music_F_1000_on  Bg-Music_F_1000_onoff  No-Music_F_500_on  No-Music_F_500_onoff  No-Music_F_1000_on  No-Music_F_1000_onoff
MMG1      0.3298             0.1775                0.4106              0.2742                 0.3853             0.1388                0.4463              0.2024                 0.5254             0.3123                0.5927              0.3925
MMG2      0.3298             0.1775                0.4106              0.2742                 0.3853             0.1388                0.4463              0.2024                 0.5254             0.3123                0.5927              0.3925
MMG3      0.3298             0.1775                0.4106              0.2742                 0.3853             0.1388                0.4463              0.2024                 0.5254             0.3123                0.5927              0.3925