Welcome to MIREX 2012
This is the main page for the eighth running of the Music Information Retrieval Evaluation eXchange (MIREX 2012). The International Music Information Retrieval Systems Evaluation Laboratory (IMIRSEL) at the Graduate School of Library and Information Science (GSLIS), University of Illinois at Urbana-Champaign (UIUC) is the principal organizer of MIREX 2012.
The MIREX 2012 community will hold its annual meeting as part of the 13th International Society for Music Information Retrieval Conference, ISMIR 2012, which will be held in Porto, Portugal, 8-12 October 2012. The MIREX plenary and poster sessions will be held on Friday, October 12, during the conference.
J. Stephen Downie
Announcing the KETI/Illinois K-MIREX Collaboration
The MIREX team at the University of Illinois is very proud to announce its new K-MIREX Collaboration with the research team led by Dr. Seok-Pil Lee and Chai-Jong Song of the Digital Media Research Center at the Korea Electronics Technology Institute (KETI) http://www.keti.re.kr/e-keti/. Song and his KETI colleagues will be taking the lead on running the 2012 Query by Singing/Humming (QBSH) and Audio Melody Extraction (AME) tasks. We do not foresee any special deviations from traditional MIREX submission procedures for these two tasks. Should any arise, participants will be informed.
MIREX and the Million Song Dataset Challenge
The MIREX team at the University of Illinois is also proud to announce its co-operative engagement with the Million Song Dataset (MSD) Challenge team of Brian McFee, Thierry Bertin-Mahieux, Daniel P.W. Ellis, and Gert Lanckriet. Below is the recent email announcement of the MSD Challenge.
- The Million Song Dataset Challenge
- DEADLINE: 2012-08-09 23:59 UTC.
- We are happy to announce the Million Song Dataset Challenge: a large-scale, open evaluation of personalized music recommendation algorithms.
- We provide listening history data for 1.1 million users (1 million train, 110 thousand validation and test) and over 380 thousand songs from the Million Song Dataset. Given partial historical data for each test user, the goal is to produce a ranking over the remaining songs for that user.
- What makes this challenge unique? Openness! The songs in the dataset are equipped with metadata (e.g., artist and title), as well as audio content analysis, semantic annotations, lyrics, etc. Because the data is open, participants are free to construct and include any additional features, or ignore the features altogether. The field is wide open!
- The challenge ends 2012-08-09 23:59 UTC.
- Evaluation is handled through Kaggle.com, and the details can be found at http://www.kaggle.com/c/msdchallenge.
- A post-analysis of the leading submissions will be performed by the Music Information Retrieval Evaluation eXchange (MIREX), and the results presented at the 13th International Society for Music Information Retrieval (ISMIR) conference in October, 2012.
- For details on the Million Song Dataset, please see http://labrosa.ee.columbia.edu/millionsong/.
- For background information about the contest and the organizing team, please see http://labrosa.ee.columbia.edu/millionsong/challenge.
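To make the recommendation task above concrete, here is a minimal sketch of a popularity baseline: rank each test user's unseen songs by global listen count. This is an illustration only, not part of the official challenge kit; the function name and the triplet/dict data layout are assumptions for the example.

```python
from collections import Counter

def popularity_baseline(train_triplets, test_history, n=500):
    """Rank each test user's unseen songs by global listen count.

    train_triplets: iterable of (user, song, play_count) tuples
    test_history: dict mapping each test user to the set of songs
        already observed in that user's partial listening history
    Returns a dict mapping user -> list of up to n recommended song ids.
    """
    # Aggregate play counts over the full training set.
    popularity = Counter()
    for _user, song, count in train_triplets:
        popularity[song] += count

    # Global ranking: most-listened songs first.
    ranked = [song for song, _ in popularity.most_common()]

    # For each test user, recommend the top-ranked songs they
    # have not already listened to.
    recs = {}
    for user, seen in test_history.items():
        recs[user] = [s for s in ranked if s not in seen][:n]
    return recs
```

Because the dataset's metadata, audio features, and lyrics are open, a real entry would typically replace this global ranking with features learned per user, but a popularity baseline like this is a common starting point for evaluating personalized recommenders.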
MIREX 2012 Deadline Dates
We have two sets of deadlines for submissions. We have to stagger the deadlines because of runtime and human evaluation considerations. The submission system will open July 30, 2012.
Tasks with a 20 August 2012 deadline:
- Audio Classification (Train/Test) Tasks
- Audio Music Similarity and Retrieval
- Symbolic Melodic Similarity
Tasks with a 27 August 2012 deadline:
- All remaining MIREX 2012 tasks.
Nota Bene: In the past we have been rather flexible about deadlines. This year, however, we simply do not have that flexibility; sorry.
Please, please, please, let's start getting those submissions made. The sooner we have the code, the sooner we can start running the evaluations.
PS: If you have a slower running algorithm, help us help you by getting your code in ASAP. Please do pay attention to runtime limits.
MIREX 2012 Submission Instructions
- Be sure to read through the rest of this page
- Be sure to read through the task pages for the tasks to which you are submitting
- Be sure to follow the Best Coding Practices for MIREX
- Be sure to follow the MIREX 2012 Submission Instructions including both the tutorial video and the text
- The submission system will open Monday, July 30, 2012
MIREX 2012 Task Participation Poll
Please answer the MIREX 2012 Task Participation Poll to indicate your likelihood of participation in each task. The poll will close on Sunday, July 29, 2012.
Also, please add your name and email address to the bottom of each task page for those tasks in which you plan to participate.
MIREX 2012 Possible Evaluation Tasks
- 2012:Audio Classification (Train/Test) Tasks, incorporating:
- Audio US Pop Genre Classification
- Audio Latin Genre Classification
- Audio Music Mood Classification
- Audio Classical Composer Identification
- 2012:Audio Cover Song Identification
- 2012:Audio Tag Classification
- 2012:Audio Music Similarity and Retrieval
- 2012:Symbolic Melodic Similarity
- 2012:Audio Onset Detection
- 2012:Audio Key Detection
- 2012:Real-time Audio to Score Alignment (a.k.a. Score Following)
- 2012:Query by Singing/Humming
- 2012:Audio Melody Extraction
- 2012:Multiple Fundamental Frequency Estimation & Tracking
- 2012:Audio Chord Estimation
- 2012:Query by Tapping
- 2012:Audio Beat Tracking
- 2012:Structural Segmentation
- 2012:Audio Tempo Estimation
- 2013:Discovery of Repeated Themes & Sections
Note to New Participants
Please take the time to read the following review articles that explain the history and structure of MIREX.
Downie, J. Stephen (2008). The Music Information Retrieval Evaluation Exchange (2005-2007): A window into music information retrieval research. Acoustical Science and Technology 29 (4): 247-255.
Available at: http://dx.doi.org/10.1250/ast.29.247
Downie, J. Stephen, Andreas F. Ehmann, Mert Bay and M. Cameron Jones (2010). The Music Information Retrieval Evaluation eXchange: Some observations and insights. Advances in Music Information Retrieval, Vol. 274, pp. 93-115.
Available at: http://bit.ly/KpM5u5
Note to All Participants
Because MIREX is premised upon the sharing of ideas and results, ALL MIREX participants are expected to:
- submit, at submission time, a DRAFT 2-3 page extended abstract PDF in the ISMIR format describing the submitted programme(s), to help us and the community better understand how each algorithm works
- submit a FINALIZED 2-3 page extended abstract PDF in the ISMIR format prior to ISMIR 2012 for posting on the respective results pages (sometimes the same abstract can be used for multiple submissions; in many cases the DRAFT and FINALIZED abstracts are the same)
- present a poster at the MIREX 2012 poster session at ISMIR 2012
Software Dependency Requests
If you have not submitted to MIREX before, or are unsure whether IMIRSEL/NEMA currently supports some of the software/architecture dependencies for your submission, a dependency request form is available. Please submit details of your dependencies on this form and the IMIRSEL team will attempt to satisfy them for you.
Due to the high volume of submissions expected at MIREX 2012, submissions with difficult-to-satisfy dependencies for which the team has not been given sufficient notice may be rejected.
Finally, you will also be expected to detail your software/architecture dependencies in a README file to be provided to the submission system.
Getting Involved in MIREX 2012
MIREX is a community-based endeavour. Be a part of the community and help make MIREX 2012 the best yet.
Mailing List Participation
If you are interested in formal MIR evaluation, you should also subscribe to the "MIREX" (aka "EvalFest") mail list and participate in the community discussions about defining and running MIREX 2012 tasks. Subscription information at: EvalFest Central.
If you are participating in MIREX 2012, it is VERY IMPORTANT that you are subscribed to EvalFest. Deadlines, task updates and other important information will be announced via this mailing list. Please use EvalFest for discussion of MIREX task proposals and other MIREX-related issues. This wiki (the MIREX 2012 wiki) will be used to embody and disseminate task proposals; however, task-related discussions should be conducted on the MIREX organization mailing list (EvalFest) rather than on this wiki, and then summarized here.
Where possible, definitions or example code for new evaluation metrics or tasks should be provided to the IMIRSEL team, who will embody them in software as part of the NEMA analytics framework. The framework will be released to the community at or before ISMIR 2012, providing a standardised set of interfaces and outputs for disciplined evaluation procedures across a great many MIR tasks.
If you find that you cannot edit a MIREX wiki page, you will need to create a new account via: Special:Userlogin.
Please note that, because of "spam-bots", MIREX wiki registration requests may be moderated by IMIRSEL members. Approval might take up to 24 hours; thank you for your patience!
MIREX 2005 - 2011 Wikis
Content from the MIREX 2005 - 2011 wikis is available at: