Welcome to MIREX 2013
This is the main page for the ninth running of the Music Information Retrieval Evaluation eXchange (MIREX 2013). The International Music Information Retrieval Systems Evaluation Laboratory (IMIRSEL) at the Graduate School of Library and Information Science (GSLIS), University of Illinois at Urbana-Champaign (UIUC) is the principal organizer of MIREX 2013.
The MIREX 2013 community will hold its annual meeting as part of the 14th International Society for Music Information Retrieval Conference (ISMIR 2013), which will be held in Curitiba, PR, Brazil, 4–8 November 2013. The MIREX plenary and poster sessions will be held during the conference.
J. Stephen Downie
New Task Leadership Model
In response to discussions at ISMIR 2012, we are working to distribute the organization of tasks across the community for MIREX 2013. To do so, we really need leaders to help us organize and run each task.
What does it mean to lead a task?
- Updating wiki pages as needed
- Communicating with submitters and troubleshooting submissions
- Executing and evaluating submissions
- Publishing final results
Due to the proprietary nature of much of the data, the submission system, evaluation framework, and most of the datasets will continue to be hosted by IMIRSEL. However, we are prepared to give task leaders access to manage and run submissions on the IMIRSEL systems.
We really need leaders to help us this year!
MIREX 2013 Deadline Dates
This year, we have a single deadline for all submissions. Submissions for all tasks are due by:
3 September 2013
Nota Bene: In the past we have been rather flexible about deadlines. This year, however, we simply do not have that flexibility in our schedule. Sorry.
Please, please, please, let's start getting those submissions made. The sooner we have the code, the sooner we can start running the evaluations.
PS: If you have a slower running algorithm, help us help you by getting your code in ASAP. Please do pay attention to runtime limits.
MIREX 2013 Submission Instructions
- Be sure to read through the rest of this page
- Be sure to read through the task pages for which you are submitting
- Be sure to follow the Best Coding Practices for MIREX
- Be sure to follow the MIREX 2013 Submission Instructions including both the tutorial video and the text
MIREX 2013 Possible Evaluation Tasks
- 2013:Audio Classification (Train/Test) Tasks, incorporating:
- Audio US Pop Genre Classification
- Audio Latin Genre Classification
- Audio Music Mood Classification
- Audio Classical Composer Identification
- 2013:Audio Cover Song Identification
- 2013:Audio Tag Classification
- 2013:Audio Music Similarity and Retrieval
- 2013:Symbolic Melodic Similarity
- 2013:Audio Onset Detection
- 2013:Audio Key Detection
- 2013:Real-time Audio to Score Alignment (a.k.a. Score Following)
- 2013:Query by Singing/Humming
- 2013:Audio Melody Extraction
- 2013:Multiple Fundamental Frequency Estimation & Tracking
- 2013:Audio Chord Estimation
- 2013:Query by Tapping
- 2013:Audio Beat Tracking
- 2013:Structural Segmentation
- 2013:Audio Tempo Estimation
- 2013:Discovery of Repeated Themes & Sections
Note to New Participants
Please take the time to read the following review articles that explain the history and structure of MIREX.
Downie, J. Stephen (2008). The Music Information Retrieval Evaluation eXchange (2005–2007): A window into music information retrieval research. Acoustical Science and Technology 29 (4): 247–255.
Available at: http://dx.doi.org/10.1250/ast.29.247
Downie, J. Stephen, Andreas F. Ehmann, Mert Bay, and M. Cameron Jones (2010). The Music Information Retrieval Evaluation eXchange: Some observations and insights. Advances in Music Information Retrieval, Vol. 274, pp. 93–115.
Available at: http://bit.ly/KpM5u5
We reserve the right to stop any process that exceeds runtime limits for each task. We will do our best to notify you in enough time to allow revisions, but this may not be possible in some cases. Please respect the published runtime limits.
Note to All Participants
Because MIREX is premised upon the sharing of ideas and results, ALL MIREX participants are expected to:
- submit, along with their programme(s), a DRAFT 2-3 page extended abstract PDF in the ISMIR format to help us and the community better understand how the algorithm works
- submit a FINALIZED 2-3 page extended abstract PDF in the ISMIR format prior to ISMIR 2013 for posting on the respective results pages (sometimes the same abstract can be used for multiple submissions; in many cases the DRAFT and FINALIZED abstracts are the same)
- present a poster at the MIREX 2013 poster session at ISMIR 2013
Software Dependency Requests
If you have not submitted to MIREX before, or are unsure whether IMIRSEL currently supports the software/architecture dependencies for your submission, a dependency request form is available. Please submit the details of your dependencies on this form, and the IMIRSEL team will attempt to satisfy them for you.
Due to the high volume of submissions expected at MIREX 2013, a submission with difficult-to-satisfy dependencies, for which the team has not been given sufficient notice, may be rejected.
Finally, you will also be expected to detail your software/architecture dependencies in a README file to be provided to the submission system.
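As a rough illustration only (MIREX does not prescribe a specific README layout on this page, and the field names and package versions below are hypothetical), a dependency section of such a README might look like:

```
== Dependencies ==
OS:        Linux (64-bit)
Language:  Python 2.7 (example only; state your actual language and version)
Libraries: numpy, scipy (list each library with the version you tested against)
External:  any command-line tools, toolkits, or data files your program calls
Run:       exact command line used to invoke your program, including any flags
```

Whatever format you use, the key point is that the IMIRSEL team should be able to reproduce your runtime environment from the README alone.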
Getting Involved in MIREX 2013
MIREX is a community-based endeavour. Be a part of the community and help make MIREX 2013 the best yet.
Mailing List Participation
If you are interested in formal MIR evaluation, you should also subscribe to the "MIREX" (aka "EvalFest") mail list and participate in the community discussions about defining and running MIREX 2013 tasks. Subscription information at: EvalFest Central.
If you are participating in MIREX 2013, it is VERY IMPORTANT that you are subscribed to EvalFest. Deadlines, task updates and other important information will be announced via this mailing list. Please use EvalFest for discussion of MIREX task proposals and other MIREX-related issues. This wiki (the MIREX 2013 wiki) will be used to record and disseminate task proposals; however, task-related discussions should be conducted on the MIREX organization mailing list (EvalFest) rather than on this wiki, and then summarized here.
Where possible, definitions or example code for new evaluation metrics or tasks should be provided to the IMIRSEL team, who will implement them in software as part of the NEMA analytics framework. The framework will be released to the community at or before ISMIR 2013, providing a standardised set of interfaces and outputs for disciplined evaluation procedures across a great many MIR tasks.
If you find that you cannot edit a MIREX wiki page, you will need to create a new account via: Special:Userlogin.
Please note that, because of "spam-bots", MIREX wiki registration requests may be moderated by IMIRSEL members, and approval can take up to 24 hours. Thank you for your patience!