- Welcome to MIREX 2017
- Task Leadership Model
- MIREX 2017 Deadline Dates
- MIREX 2017 Submission Instructions
- MIREX 2017 Possible Evaluation Tasks
- Getting Involved in MIREX 2017
- MIREX 2005 - 2016 Wikis
Welcome to MIREX 2017
This is the main page for the thirteenth running of the Music Information Retrieval Evaluation eXchange (MIREX 2017). The International Music Information Retrieval Systems Evaluation Laboratory (IMIRSEL) at the School of Information Sciences, University of Illinois at Urbana-Champaign (UIUC), is the principal organizer of MIREX 2017.
The MIREX 2017 community will hold its annual meeting as part of The 18th International Conference on Music Information Retrieval, ISMIR 2017, which will be held in Suzhou, China, October 23-28, 2017.
J. Stephen Downie
Task Leadership Model
As with MIREX 2016, we aim to improve the distribution of task responsibilities for the upcoming MIREX 2017. To do so, we really need leaders to help us organize and run each task.
To volunteer to lead a task, please send an email to IMIRSEL. Current information about task captains can be found on the 2017:Task Captains page. Please direct any other communication to the EvalFest mailing list.
What does it mean to lead a task?
- Updating wiki pages as needed
- Communicating with submitters and troubleshooting submissions
- Running and evaluating submissions
- Publishing final results
Due to the proprietary nature of much of the data, the submission system, evaluation framework, and most of the datasets will continue to be hosted by IMIRSEL. However, we are prepared to provide access to task organizers to manage and run submissions on the IMIRSEL systems.
We really need leaders to help us this year!
MIREX 2017 Deadline Dates
The deadlines will be announced soon.
MIREX 2017 Submission Instructions
- Be sure to read through the rest of this page
- Be sure to read through the task pages for which you are submitting
- Be sure to follow the Best Coding Practices for MIREX
- Be sure to follow the MIREX 2017 Submission Instructions, including both the tutorial video and the text
- The MIREX 2017 Submission System can be found at: https://www.music-ir.org/mirex/sub/ .
MIREX 2017 Possible Evaluation Tasks
- 2017:Audio Classification (Train/Test) Tasks
- 2017:Audio Beat Tracking
- 2017:Audio Chord Estimation
- 2017:Audio Cover Song Identification
- 2017:Audio Downbeat Estimation
- 2017:Audio Key Detection
- 2017:Audio Onset Detection
- 2017:Audio Tempo Estimation
- 2017:Automatic Lyrics-to-Audio Alignment
- 2017:Drum Transcription
- 2017:Multiple Fundamental Frequency Estimation & Tracking
- 2017:Real-time Audio to Score Alignment (a.k.a. Score Following)
- 2017:Structural Segmentation
- 2017:Discovery of Repeated Themes & Sections
- 2017:Audio Fingerprinting
- 2017:Set List Identification
- 2016:Singing Voice Separation
- 2016:Audio Offset Detection
- 2016:Audio Tag Classification
- 2016:Audio Music Similarity and Retrieval
- 2016:Symbolic Melodic Similarity
- 2016:Query by Singing/Humming
- 2016:Audio Melody Extraction
- 2016:Query by Tapping
Note to New Participants
Please take the time to read the following review articles that explain the history and structure of MIREX.
Downie, J. Stephen (2008). The Music Information Retrieval Evaluation Exchange (2005-2007): A window into music information retrieval research. Acoustical Science and Technology 29 (4): 247-255. Available at: http://dx.doi.org/10.1250/ast.29.247
Downie, J. Stephen, Andreas F. Ehmann, Mert Bay and M. Cameron Jones (2010). The Music Information Retrieval Evaluation eXchange: Some Observations and Insights. Advances in Music Information Retrieval, Vol. 274, pp. 93-115. Available at: http://bit.ly/KpM5u5
We reserve the right to stop any process that exceeds runtime limits for each task. We will do our best to notify you in enough time to allow revisions, but this may not be possible in some cases. Please respect the published runtime limits.
Note to All Participants
Because MIREX is premised upon the sharing of ideas and results, ALL MIREX participants are expected to:
- submit, at the time of submitting their programme(s), a DRAFT 2-3 page extended abstract PDF in the ISMIR format describing the submitted programme(s), to help us and the community better understand how the algorithm works
- submit a FINALIZED 2-3 page extended abstract PDF in the ISMIR format prior to ISMIR 2017 for posting on the respective results pages (sometimes the same abstract can be used for multiple submissions; in many cases the DRAFT and FINALIZED abstracts are the same)
- present a poster at the MIREX 2017 poster session at ISMIR 2017
Software Dependency Requests
If you have not submitted to MIREX before, or are unsure whether IMIRSEL currently supports some of the software/architecture dependencies for your submission, a dependency request form is available. Please submit details of your dependencies on this form and the IMIRSEL team will attempt to satisfy them for you.
Due to the high volume of submissions expected at MIREX 2017, submissions with difficult-to-satisfy dependencies, of which the team has not been given sufficient notice, may be rejected.
Finally, you will also be expected to detail your software/architecture dependencies in a README file to be provided to the submission system.
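As an illustration only (the exact format is up to you, and every package name, version, and file name below is a hypothetical example, not a requirement), a dependency section of such a README might look like:

```text
Dependencies:
  - Java 8 (OpenJDK 1.8 or later)
  - Python 2.7 with numpy >= 1.8 and scipy >= 0.13
  - ffmpeg (for audio decoding)
Hardware/architecture:
  - 64-bit Linux, approx. 4 GB RAM, single-threaded
Run command:
  ./run_my_algorithm.sh <input_list_file> <output_directory>
```

Listing exact versions and the run command up front makes it much easier for the IMIRSEL team to reproduce your environment and execute your submission without back-and-forth.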
Getting Involved in MIREX 2017
MIREX is a community-based endeavour. Be a part of the community and help make MIREX 2017 the best yet.
Mailing List Participation
If you are interested in formal MIR evaluation, you should also subscribe to the "MIREX" (aka "EvalFest") mailing list and participate in the community discussions about defining and running MIREX 2017 tasks. Subscription information is available at: EvalFest Central.
If you are participating in MIREX 2017, it is VERY IMPORTANT that you are subscribed to EvalFest. Deadlines, task updates and other important information will be announced via this mailing list. Please use EvalFest for discussion of MIREX task proposals and other MIREX-related issues. This wiki (the MIREX 2017 wiki) will be used to record and disseminate task proposals; however, task-related discussions should be conducted on the EvalFest mailing list rather than on this wiki, and then summarized here.
Where possible, definitions or example code for new evaluation metrics or tasks should be provided to the IMIRSEL team, who will implement them in software as part of the NEMA analytics framework. NEMA will be released to the community at or before ISMIR 2017, providing a standardised set of interfaces and outputs for disciplined evaluation procedures across a great many MIR tasks.
If you find that you cannot edit a MIREX wiki page, you will need to create a new account via: Special:Userlogin.
Please note that because of "spam-bots", MIREX wiki registration requests may be moderated by IMIRSEL members. It might take up to 24 hours for approval (Thank you for your patience!).