MIREX Wiki - User contributions [en]
Feed source: https://www.music-ir.org/mirex/w/api.php?action=feedcontributions&user=Jdownie&feedformat=atom (MediaWiki 1.31.1, retrieved 2024-03-28T23:45:19Z)

User:Kun Fang (2021-09-10T17:38:40Z) - Jdownie: Creating user page for new user.
<hr />
<div>I'm an MSc Acoustics and Music Technology student at the University of Edinburgh.</div>

User talk:Kun Fang (2021-09-10T17:38:40Z) - Jdownie: Welcome!
<hr />
<div>'''Welcome to ''MIREX Wiki''!'''<br />
We hope you will contribute much and well.<br />
You will probably want to read the [https://www.mediawiki.org/wiki/Special:MyLanguage/Help:Contents help pages].<br />
Again, welcome and have fun! [[User:Jdownie|Jdownie]] ([[User talk:Jdownie|talk]]) 12:38, 10 September 2021 (CDT)</div>

User:Noah Henry (2021-09-10T17:32:35Z) - Jdownie: Creating user page for new user.
<hr />
<div>PhD candidate in music psychology at the University of York.</div>

User:Guoyf (2021-09-10T17:31:12Z) - Jdownie: Creating user page for new user.
<hr />
<div>A postgraduate student in Beijing, China.</div>

User:WHJ (2021-09-10T17:30:40Z) - Jdownie: Creating user page for new user.
<hr />
<div>I am a Ph.D. student from Renmin University of China.</div>

User:Djevans (2021-09-10T17:30:15Z) - Jdownie: Creating user page for new user.
<hr />
<div>PhD student at UIUC's School of Information Sciences.</div>

User:Pei-Chun Chang (2018-09-14T10:56:04Z) - Jdownie: Creating user page with biography of new user.
<hr />
<div>I'm a PhD student at Chiao Tung University, Taiwan.<br />
My major research topic is discrete signal processing, and I'm interested in MIR.</div>

User:Enk100 (2018-09-14T10:55:40Z) - Jdownie: Creating user page with biography of new user.
<hr />
<div>I completed an MSc at Tel Aviv University on deep learning, error-correcting codes, and communication systems; I am now a PhD candidate at Tel Aviv University.</div>

User:Lior Wolf (2018-09-14T10:55:12Z) - Jdownie: Creating user page with biography of new user.
<hr />
<div>I am a full professor at the School of Computer Science at Tel Aviv University and a researcher at FAIR.</div>

User:Daniel Alejandro Pérez Alvarez (2018-09-14T10:54:54Z) - Jdownie: Creating user page with biography of new user.
<hr />
<div>I'm a master's degree student at the Center for Computer Research of the National Polytechnic Institute of Mexico. My areas of interest are music information retrieval, natural language processing, and applied machine learning.</div>

User:Revanth Akella (2018-09-14T10:54:24Z) - Jdownie: Creating user page with biography of new user.
<hr />
<div>I am a 2nd-year graduate Computer Science student at San Jose State University working on classifying mood in music. I am an ML and MIR enthusiast who loves to explore and contribute to different areas of MIR. I have worked on projects dealing with music classification and recommendation and am excited to work in the area of music mood classification.</div>

User:Zehren Mickael (2018-09-14T10:53:52Z) - Jdownie: Creating user page with biography of new user.
<hr />
<div>I am a PhD student working on the development of a fully automated mixing system, with a focus on electronic dance music.<br />
<br />
Personal page:<br />
http://hpac.rwth-aachen.de/~zehren/</div>

User:Anshul Thakur (2018-07-27T01:12:02Z) - Jdownie: Creating user page with biography of new user.
<hr />
<div>I am a PhD research scholar working at IIT Mandi, India. My research interests include classification and pattern analysis of audio signals.</div>

User:Zitao Liao (2018-07-27T01:11:32Z) - Jdownie: Creating user page with biography of new user.
<hr />
<div>ha ha ha a a a a a a a a a a a a a a a a a a a a</div>

User:J GONG (2018-07-27T01:10:59Z) - Jdownie: Creating user page with biography of new user.
<hr />
<div>I am a student at King's College London, working on audio processing, i.e., extracting musical dynamics such as tempo tracking. I want to access some databases through MIREX and learn from the community.</div>

2015:GC15UX:JDISC (2015-10-21T17:01:43Z) - Jdownie
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2015: User Experience (GC15UX:J-DISC)}}<br />
=Purpose=<br />
''Holistic, user-centered evaluation of the user experience in interacting with more complete, user-facing music information retrieval (MIR) systems.''<br />
<br />
=Goals=<br />
# ''To inspire the development of complete MIR systems.''<br />
# ''To promote the notion of user experience as a first-class research objective in the MIR community.''<br />
<br />
=About J-DISC=<br />
J-DISC ([http://jdisc.columbia.edu http://jdisc.columbia.edu]), created by the Center for Jazz Studies at Columbia University, is a resource for searching and exploring jazz recordings. It is organized to present complete information on jazz recording sessions and to merge a large corpus of session data into a single, easily accessible repository that can be easily searched, cross-searched, navigated, and cited. In addition to the traditional discographical focus on recording artists/leaders, J-DISC incorporates extensive cultural, geographic, biographical, composer, and studio information that can also be easily searched and accessed.<br />
<br />
=Dataset=<br />
J-DISC contains fully structured and searchable metadata. Key entities in the dataset include '''person, skill, session, track, composition, and issue'''. There are 19 tables in the dataset representing various relationships between those entities. Below are brief descriptions of each of the 19 tables (table names in alphabetical order; the number at the end of each entry is its row count):<br />
<br />
*'''composition:''' Compositions, which may be recorded as tracks at sessions. (7,104)<br />
*'''composition_composer:''' Associations between compositions and their composers. (8,622)<br />
*'''composition_lyricist:''' Associations between compositions and their lyricists. (880)<br />
*'''composition_title:''' Alternative titles for compositions. (811)<br />
*'''issue:''' Releases or issues of tracks, e.g., as albums. (545)<br />
*'''issue_leader:''' Associations between releases and their leaders. (610)<br />
*'''issue_overdub_people:''' Associations between overdubbed releases and the people working on them. (17)<br />
*'''issue_track:''' Associations between releases and the tracks they include. (3,769)<br />
*'''person:''' People associated with musical recordings, including musicians and composers. (5,734)<br />
*'''person_ethnicity:''' Associations of people with ethnic descriptions. (70)<br />
*'''person_session_skill:''' Correlations between people, sessions, and skills or instruments. (21,424)<br />
*'''person_skill:''' Correlations between people and their primary skills or instruments. (6,450)<br />
*'''person_track_skill:''' Per-track variations from the session-level correlations between people and skills or instruments. (7,044)<br />
*'''session:''' Recording sessions, at which one or more musicians produced one or more tracks. (2,711)<br />
*'''session_leader:''' Associations between sessions and their leader(s). (3,123)<br />
*'''skill:''' Skills associated with musical recordings, including instruments played, conducting, and composing. (209)<br />
*'''track:''' Tracks laid down at recording sessions by musicians. (15,361)<br />
*'''track_composition:''' Associations between tracks and compositions. (15,672)<br />
*'''track_soloist:''' Associations between tracks and soloists. (630)<br />
<br />
<br />
[[2015:GC15UX:JDISC_Schema]] presents more details about each table.<br />
<br />
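To make the relational structure concrete, below is a minimal sketch of how the correlation tables tie the entities together. It assumes the dataset is delivered as a SQLite database and uses hypothetical column names (person_id, skill_id, session_id, name, date); consult [[2015:GC15UX:JDISC_Schema]] for the actual columns.<br />
<pre>
# A minimal sketch, not an official access method: list who played what at a
# given session by joining the person_session_skill correlation table to its
# three entity tables. The filename and all column names are assumptions.
import sqlite3

conn = sqlite3.connect("jdisc.db")  # hypothetical SQLite export of J-DISC

query = """
SELECT person.name, skill.name, session.date
FROM person_session_skill AS pss
JOIN person  ON person.person_id   = pss.person_id
JOIN skill   ON skill.skill_id     = pss.skill_id
JOIN session ON session.session_id = pss.session_id
WHERE pss.session_id = ?
"""
for musician, instrument, date in conn.execute(query, (42,)):
    print(f"{date}: {musician} played {instrument}")
</pre>
<br />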
==ER Diagram of the Schema==<br />
[[File:jdisc_schema.png|400px]]<br />
<br />
[https://www.music-ir.org/mirex/gc15ux_jdisc/jdisc_schema.pdf Click to see the full version of the diagram (PDF)]<br />
<br />
=Download the Dataset=<br />
# <span style="color:#808080">user agreement signed during download?</span><br />
<br />
=Participating Systems=<br />
''Unlike conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs to their systems to the GC15UX team.''<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in a fixed-size window: '''1024x768'''. Please test your system at this screen size.<br />
<br />
See [[#Evaluation Webforms]] below for a better understanding of our E6K-inspired evaluation system design.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. It is encouraged that you give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| <br />
| <br />
| <br />
|-<br />
|-<br />
| <br />
| <br />
| <br />
|-<br />
|-<br />
| <br />
| <br />
|<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
As the name of the Grand Challenge indicates, the evaluation will be user-centered. All systems will be used by a number of human evaluators and rated by them on several of the most important criteria for evaluating user experience.<br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria and their descriptions may change slightly in the months leading up to the submission deadline, as we test and work to improve them.''<br />
<br />
Given that GC15UX is all about how users perceive their experience of the systems, we intend to capture user perceptions in a minimally intrusive manner and not burden the users/evaluators with too many questions or required data inputs. The following criteria are grounded in the Human-Computer Interaction (HCI) and User Experience (UX) literature, with careful consideration given to striking a balance between being comprehensive and minimizing evaluators' cognitive load.<br />
<br />
Evaluators will rate systems on the following criteria: <br />
<br />
* '''Overall satisfaction''': How would you rate your overall satisfaction with the system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Aesthetics''': How would you rate the visual attractiveness of the system?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Ease of use''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Clarity''': How well does the system communicate what is going on?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Performance''': Does the system work efficiently and without bugs/glitches?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Open Text Feedback''': An open-ended question is provided for evaluators to give feedback if they wish to do so.<br />
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC15UX team will ensure that all participating systems get an equal number of evaluators.<br />
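A minimal sketch of one way such balancing can work, assuming evaluators request assignments one at a time; the system URLs and function name here are illustrative, not the actual GC15UX implementation:<br />
<pre>
# Hand each arriving evaluator the system with the fewest evaluations so far,
# so that all systems end up with (nearly) equal numbers of evaluators.
from collections import Counter

SYSTEM_URLS = ["http://team-a.example.org", "http://team-b.example.org"]  # hypothetical
counts = Counter({url: 0 for url in SYSTEM_URLS})

def next_assignment() -> str:
    """Pick the least-evaluated system and record the new assignment."""
    url = min(counts, key=counts.get)
    counts[url] += 1
    return url
</pre>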
<br />
==Tasks for Evaluators==<br />
<br />
The final motivating task will be defined after ISMIR 2015. Because this dataset has rich and interesting network information, and no underlying audio, we will define a task that best fits this state of affairs.<br />
<br />
Below is what we framed for the GC15UX:Jamendo task to give you an idea to start thinking about possible task definitions. <br />
<br />
===GC15UX:Jamendo Task Definition===<br />
''To motivate the evaluators, a defined yet open task is given to them:''<br />
<br />
''You need to put together a playlist for a particular event (e.g., dinner party at your house, workout session). Try to use the assigned system to make playlists for at least a couple of different events.''<br />
<br />
''The task is designed to ensure that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' purposes ("music for their own situation"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood, and other aspects. This allows great flexibility and virtually unlimited possibility in system or service design.''<br />
<br />
''Another important consideration in designing the task is the music collection available for this GC15UX: the Jamendo collection. Jamendo music is not well known to most users/evaluators, whereas many more common music information tasks are more or less influenced by users' familiarity with the songs and by song popularity. Through this task of "finding (copyright-free) background music for a self-made video", we strive to minimize the need to look for familiar or popular music.''<br />
<br />
==Evaluation Results==<br />
Statistics of the scores given by all evaluators will be reported: the mean and the average deviation. Meaningful text comments from the evaluators will also be reported.<br />
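For concreteness, a minimal sketch of these two statistics, assuming each 7-point scale is coded 1 (most negative anchor) through 7 (most positive anchor):<br />
<pre>
# Mean and average (mean absolute) deviation of one system's scores on one
# criterion; the 1-7 numeric coding of the scale is an assumption.
def report(scores):
    m = sum(scores) / len(scores)
    avg_dev = sum(abs(s - m) for s in scores) / len(scores)
    return m, avg_dev

print(report([5, 6, 4, 7, 5]))  # -> (5.4, 0.88)
</pre>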
<br />
==Evaluation Webforms==<br />
Graders can take as many assignments as they wish on the My Assignments page. They can return to an evaluation page at any time by clicking the thumbnail of the submission.<br />
<br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
<br/><br />
To assist the evaluators and minimize their burden, the GC15UX team will provide a set of evaluation webforms that wrap around the participating systems. As shown in the following image, the evaluation webforms are for scoring the participating systems, with each system's client interface embedded in an iframe on the left side of the webform.<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
<br />
=Organization=<br />
<br />
==Important Dates==<br />
This year, GC15UX:JDISC adopts a two-phase model with two evaluations. The first phase will end by the ISMIR conference, and we will disclose preliminary results there. Then phase II will start: participating developers can continue improving their systems based on feedback from the first phase, and another round of evaluation will be conducted in February. We believe this model serves developers well, since it accords with the iterative nature of user-centered design and gives developers enough time to build complete MIR systems.<br />
<br />
*July ?: announcement of GC15UX:JDISC<br />
*Sep. 28: first deadline for system submission<br />
*Feb. 28: second deadline for system submission<br />
<br />
==What to Submit==<br />
<br />
A URL to the participating system.<br />
<br />
==Contacts==<br />
''The GC15UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Christopher R. Maden, University of Illinois<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
:Yun Hao, University of Illinois''<br />
<br />
<br />
Inquiries, suggestions, questions, and comments are all highly welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone on the team.</div>
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2015: User Experience (GC15UX: JDISC)}}<br />
=Purpose=<br />
''Holistic, user-centered evaluation of the user experience in interacting with more complete, user-facing music information retrieval (MIR) systems.''<br />
<br />
=Goals=<br />
# ''To inspire the development of complete MIR systems.''<br />
# ''To promote the notion of user experience as a first-class research objective in the MIR community.''<br />
<br />
=About J-DISC=<br />
JDISC ([http://jdisc.columbia.edu http://jdisc.columbia.edu]) is a resource for searching and exploring jazz recordings created by the Center for Jazz Studies at Columbia University. It is organized to present complete information on jazz recording sessions, and merge a large corpus of session data into a single easily accessible repository, in a manner that can be easily searched, cross-searched, navigated and cited. In addition to the focus on recording artist/leaders of traditional discography, J-DISC incorporates extensive cultural, geographic, biographical, composer and studio information that can also be easily searched and accessed.<br />
<br />
=Dataset=<br />
J-DISC contains fully structured and searchable metadata. Key entities in the dataset include '''person, skill, session, track, composition, and issue'''. There are 19 tables in the dataset representing various relationships between those entities. Below are brief descriptions for each of the 19 tables (table names in alphabetical order, numbers at the end showing the number of rows in each table):<br />
<br />
*'''composition: '''Compositions, which may be recorded as tracks at sessions. (7,104)<br />
*'''composition_composer: '''Associations between compositions and their composers. (8,622)<br />
* '''composition_lyricist: '''Associations between compositions and their lyricists. (880)<br />
* ''' composition_title: '''Alternative titles for compositions. (811)<br />
*''' issue: '''Releases or issues of tracks e.g. as albums. (545)<br />
*'''issue_leader: '''Associations between releases and their leaders. (610)<br />
*'''issue_overdub_people: '''Associations between overdubbed releases and people working on them. (17)<br />
*'''issue_track: '''Associations between releases and the included tracks. (3,769)<br />
*'''person: '''People associated with musical recordings, including musicians and composers. (5,734)<br />
* '''person_ethnicity: '''Association of people with ethnic descriptions. (70)<br />
*'''person_session_skill: '''Correlation between people, sessions, and skills or instruments. (21,424)<br />
*'''person_skill: '''Correlation between people and primary skills or instruments. (6,450)<br />
*'''person_track_skill: '''Variation from sessions in correlation between people, tracks, and skills or instruments. (7,044)<br />
*'''session: '''Recording sessions, at which one or more musicians produced one or more tracks. (2,711)<br />
* '''session_leader: '''Associations between sessions and their leader(s). (3,123)<br />
* '''skill: '''Skills associated with musical recordings, including instruments played, conducting, composing. (209)<br />
*'''track: '''Tracks laid down at recording sessions by musicians. (15,361)<br />
*'''track_composition: '''Associations between tracks and compositions. (15,672)<br />
*'''track_soloist: '''Associations between tracks and soloists. (630)<br />
<br />
<br />
[[2015:GC15UX:JDISC_Schema]] presents more details about each table.<br />
<br />
==ER Diagram of the Schema==<br />
[[File:jdisc_schema.png|400px]]<br />
<br />
[https://www.music-ir.org/mirex/gc15ux_jdisc/jdisc_schema.pdf Click to see the full version of the diagram (PDF)]<br />
<br />
=Download the Dataset=<br />
# <span style="color:#808080">user agreement signed during download?</span><br />
<br />
=Participating Systems=<br />
''Unlike conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs to their systems to the GC15UX team.''<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in fixed size window: '''1024x768'''. Please test your system for this screen size.<br />
<br />
See the [[#Evaluation Webforms]] below for a better understanding of our E6K-inpsired evaluation system design.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. It is encouraged that you give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| <br />
| <br />
| <br />
|-<br />
|-<br />
| <br />
| <br />
| <br />
|-<br />
|-<br />
| <br />
| <br />
|<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
As written in the name of the Grand Challenge, the evaluation will be user-centered. All systems will be used by a number of human evaluators and be rated by them on several most important criteria in evaluating user experience. <br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria or its descriptions may be slightly changed in the months leading up to the submission deadline, as we test it and work to improve it.''<br />
<br />
Given the GC15UX is all about how users perceive their experiences of the systems, we intend to capture the user perceptions in a minimally intrusive manner and not to burden the users/evaluators with too many questions or required data inputs. The following criteria are grounded on the literature of Human Computer Interaction (HCI) and User Experience (UX), with a careful consideration on striking a balance between being comprehensive and minimizing evaluators' cognitive load. <br />
<br />
Evaluators will rate systems on the following criteria: <br />
<br />
* '''Overall satisfaction''': How would you rate your overall satisfaction with the system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Aesthetics''': How would you rate the visual attractiveness of the system?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Ease of use''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Clarity''': How well does the system communicate what is going on?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Performance''': Does the system work efficiently and without bugs/glitches?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Open Text Feedback''': An open-ended question is provided for evaluators to give feedback if they wish to do so.<br />
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC15UX team will ensure all participating systems will get equal number of evaluators.<br />
<br />
==Tasks for Evaluators==<br />
<br />
The final motivating task will be defined after ISMIR 2015. Because this dataset is has strong and interesting networking information, and no underlying audio, we should define a task definition that best fits this state of affairs.<br />
<br />
Below is what we framed for the GC15UX:Jamendo task to give you an idea to start thinking about possible task definitions. <br />
<br />
===GC15UX:Jamendo Task Definition===<br />
''To motivate the evaluators, a defined yet open task is given to the evaluators:<br />
<br />
''You need to put together a playlist for a particular event (e.g., dinner party at your house, workout session). Try to use the assigned system to make playlists for at least a couple of different events.''<br />
<br />
''The task is to ensure that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' purposes ("music for their own situation"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood and other aspects. This allows great flexibility and virtually unlimited possibility in system or service design. '' <br />
<br />
''Another important consideration in designing the task is the music collection available for this GC15UX: the Jamando collection. Jamando music is not well-known to most users/evaluators, whereas many more commonly seen music information tasks are more or less influenced by users' familiarity to the songs and song popularity. Through this task of "finding (copyright-free) background music for a self-made video", we strive to minimize the need of looking for familiar or popular music.''<br />
<br />
==Evaluation Results==<br />
Statistics of the scores given by all evaluators will be reported: mean, average deviation. Meaningful text comments from the evaluators will also be reported.<br />
<br />
==Evaluation Webforms==<br />
Graders can take as many assignments as they wish in the My Assignments page. They are allowed to go back to the evaluation page anytime by clicking the thumbnail of the submission. <br />
<br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
<br/><br />
To facilitate the evaluators and minimize their burden, the GC15UX team will provide a set of evaluation forms which wrap around the participating systems. As shown in the following image, the evaluation webforms are for scoring the participating systems, with their client interfaces embedded within an iframe in the left side of the webform.<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
<br />
=Organization=<br />
<br />
==Important Dates==<br />
This year GC15UX:JDISC adopts the two-phase model with two evaluations. The first phase will end by the ISMIR conference and we will disclose preliminary results at the conference. Then, phase II will start. Participating developers can continue improving their systems based on the feedback from the first phase and another round of evaluation will be conducted in February. We believe that this model serves the developers well since it is in accordance with the iterative nature of user-centered design. In this way, the developers will also have enough time to develop their complete MIR systems.<br />
<br />
*July ?: announce the GC15UX:JDISC<br />
*Sep. 28st: the first deadline for system submission <br />
*Feb. 28st: the second deadline for system submission<br />
<br />
==What to Submit==<br />
<br />
A URL to the participanting system.<br />
<br />
==Contacts==<br />
''The GC15UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Christopher R. Maden, University of Illinois<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
:Yun Hao, University of Illinois''<br />
<br />
<br />
Inquiries, suggestions, questions, comments are all highly welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone in the team.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=2015:GC15UX:JDISC&diff=115482015:GC15UX:JDISC2015-10-21T16:33:19Z<p>Jdownie: /* Tasks for Evaluators */</p>
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2015: User Experience (GC15UX: JDISC)}}<br />
=Purpose=<br />
''Holistic, user-centered evaluation of the user experience in interacting with complete, user-facing music information retrieval (MIR) systems.''<br />
<br />
=Goals=<br />
# ''To inspire the development of complete MIR systems.''<br />
# ''To promote the notion of user experience as a first-class research objective in the MIR community.''<br />
<br />
=About J-DISC=<br />
JDISC ([http://jdisc.columbia.edu http://jdisc.columbia.edu]) is a resource for searching and exploring jazz recordings created by the Center for Jazz Studies at Columbia University. It is organized to present complete information on jazz recording sessions, and merge a large corpus of session data into a single easily accessible repository, in a manner that can be easily searched, cross-searched, navigated and cited. In addition to the focus on recording artist/leaders of traditional discography, J-DISC incorporates extensive cultural, geographic, biographical, composer and studio information that can also be easily searched and accessed.<br />
<br />
=Dataset=<br />
J-DISC contains fully structured and searchable metadata. Key entities in the dataset include '''person, skill, session, track, composition, and issue'''. There are 19 tables in the dataset representing various relationships between those entities. Below are brief descriptions for each of the 19 tables (table names in alphabetical order, numbers at the end showing the number of rows in each table):<br />
<br />
*'''composition: '''Compositions, which may be recorded as tracks at sessions. (7,104)<br />
*'''composition_composer: '''Associations between compositions and their composers. (8,622)<br />
* '''composition_lyricist: '''Associations between compositions and their lyricists. (880)<br />
* ''' composition_title: '''Alternative titles for compositions. (811)<br />
*''' issue: '''Releases or issues of tracks e.g. as albums. (545)<br />
*'''issue_leader: '''Associations between releases and their leaders. (610)<br />
*'''issue_overdub_people: '''Associations between overdubbed releases and people working on them. (17)<br />
*'''issue_track: '''Associations between releases and the included tracks. (3,769)<br />
*'''person: '''People associated with musical recordings, including musicians and composers. (5,734)<br />
* '''person_ethnicity: '''Association of people with ethnic descriptions. (70)<br />
*'''person_session_skill: '''Correlation between people, sessions, and skills or instruments. (21,424)<br />
*'''person_skill: '''Correlation between people and primary skills or instruments. (6,450)<br />
*'''person_track_skill: '''Variation from sessions in correlation between people, tracks, and skills or instruments. (7,044)<br />
*'''session: '''Recording sessions, at which one or more musicians produced one or more tracks. (2,711)<br />
* '''session_leader: '''Associations between sessions and their leader(s). (3,123)<br />
* '''skill: '''Skills associated with musical recordings, including instruments played, conducting, composing. (209)<br />
*'''track: '''Tracks laid down at recording sessions by musicians. (15,361)<br />
*'''track_composition: '''Associations between tracks and compositions. (15,672)<br />
*'''track_soloist: '''Associations between tracks and soloists. (630)<br />
<br />
<br />
[[2015:GC15UX:JDISC_Schema]] presents more details about each table.<br />
<br />
==ER Diagram of the Schema==<br />
[[File:jdisc_schema.png|400px]]<br />
<br />
[https://www.music-ir.org/mirex/gc15ux_jdisc/jdisc_schema.pdf Click to see the full version of the diagram (PDF)]<br />
<br />
=Download the Dataset=<br />
# <span style="color:#808080">user agreement signed during download?</span><br />
<br />
=Participating Systems=<br />
''Unlike conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs to their systems to the GC15UX team.''<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in fixed size window: '''1024x768'''. Please test your system for this screen size.<br />
<br />
See the [[#Evaluation Webforms]] below for a better understanding of our E6K-inpsired evaluation system design.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. It is encouraged that you give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| <br />
| <br />
| <br />
|-<br />
|-<br />
| <br />
| <br />
| <br />
|-<br />
|-<br />
| <br />
| <br />
|<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
As written in the name of the Grand Challenge, the evaluation will be user-centered. All systems will be used by a number of human evaluators and be rated by them on several most important criteria in evaluating user experience. <br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria or its descriptions may be slightly changed in the months leading up to the submission deadline, as we test it and work to improve it.''<br />
<br />
Given the GC15UX is all about how users perceive their experiences of the systems, we intend to capture the user perceptions in a minimally intrusive manner and not to burden the users/evaluators with too many questions or required data inputs. The following criteria are grounded on the literature of Human Computer Interaction (HCI) and User Experience (UX), with a careful consideration on striking a balance between being comprehensive and minimizing evaluators' cognitive load. <br />
<br />
Evaluators will rate systems on the following criteria: <br />
<br />
* '''Overall satisfaction''': How would you rate your overall satisfaction with the system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Aesthetics''': How would you rate the visual attractiveness of the system?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Ease of use''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Clarity''': How well does the system communicate what is going on?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Performance''': Does the system work efficiently and without bugs/glitches?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Open Text Feedback''': An open-ended question is provided for evaluators to give feedback if they wish to do so.<br />
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC15UX team will ensure all participating systems will get equal number of evaluators.<br />
<br />
==Tasks for Evaluators==<br />
<br />
The final motivating task will be defined after ISMIR 2015. Because this dataset is has strong and interesting networking information, and no underlying audio, we should define a task definition that best fits this state of affairs.<br />
<br />
Below is what we framed for the GC15UX:Jamendo task to give you an idea to start thinking about possible task definitions. <br />
<br />
===GC15UX:Jamendo Task Definition===<br />
''To motivate the evaluators, a defined yet open task is given to the evaluators:<br />
<br />
''You need to put together a playlist for a particular event (e.g., dinner party at your house, workout session). Try to use the assigned system to make playlists for at least a couple of different events.''<br />
<br />
''The task is to ensure that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' purposes ("music for their own situation"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood and other aspects. This allows great flexibility and virtually unlimited possibility in system or service design. '' <br />
<br />
''Another important consideration in designing the task is the music collection available for this GC15UX: the Jamando collection. Jamando music is not well-known to most users/evaluators, whereas many more commonly seen music information tasks are more or less influenced by users' familiarity to the songs and song popularity. Through this task of "finding (copyright-free) background music for a self-made video", we strive to minimize the need of looking for familiar or popular music.''<br />
<br />
==Evaluation Results==<br />
Statistics of the scores given by all evaluators will be reported: mean, average deviation. Meaningful text comments from the evaluators will also be reported.<br />
<br />
==Evaluation Webforms==<br />
Graders can take as many assignments as they wish in the My Assignments page. They are allowed to go back to the evaluation page anytime by clicking the thumbnail of the submission. <br />
<br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
<br/><br />
To facilitate the evaluators and minimize their burden, the GC15UX team will provide a set of evaluation forms which wrap around the participating systems. As shown in the following image, the evaluation webforms are for scoring the participating systems, with their client interfaces embedded within an iframe in the left side of the webform.<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
<br />
=Organization=<br />
<br />
==Important Dates==<br />
This year GC15UX:JDISC adopts the two-phase model with two evaluations. The first phase will end by the ISMIR conference and we will disclose preliminary results at the conference. Then, phase II will start. Participating developers can continue improving their systems based on the feedback from the first phase and another round of evaluation will be conducted in February. We believe that this model serves the developers well since it is in accordance with the iterative nature of user-centered design. In this way, the developers will also have enough time to develop their complete MIR systems.<br />
<br />
*July ?: announce the GC15UX:JDISC<br />
*Sep. 28st: the first deadline for system submission <br />
*Feb. 28st: the second deadline for system submission<br />
<br />
==What to Submit==<br />
<br />
A URL to the participanting system.<br />
<br />
==Contacts==<br />
''The GC15UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Christopher R. Maden, University of Illinois<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
:Yun Hao, University of Illinois''<br />
<br />
<br />
Inquiries, suggestions, questions, comments are all highly welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone in the team.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=2015:GC15UX:JDISC&diff=115472015:GC15UX:JDISC2015-10-21T16:31:31Z<p>Jdownie: /* =GC15UX:Jamendo Task Definition */</p>
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2015: User Experience (GC15UX: JDISC)}}<br />
=Purpose=<br />
''Holistic, user-centered evaluation of the user experience in interacting with complete, user-facing music information retrieval (MIR) systems.''<br />
<br />
=Goals=<br />
# ''To inspire the development of complete MIR systems.''<br />
# ''To promote the notion of user experience as a first-class research objective in the MIR community.''<br />
<br />
=About J-DISC=<br />
JDISC ([http://jdisc.columbia.edu http://jdisc.columbia.edu]) is a resource for searching and exploring jazz recordings created by the Center for Jazz Studies at Columbia University. It is organized to present complete information on jazz recording sessions, and merge a large corpus of session data into a single easily accessible repository, in a manner that can be easily searched, cross-searched, navigated and cited. In addition to the focus on recording artist/leaders of traditional discography, J-DISC incorporates extensive cultural, geographic, biographical, composer and studio information that can also be easily searched and accessed.<br />
<br />
=Dataset=<br />
J-DISC contains fully structured and searchable metadata. Key entities in the dataset include '''person, skill, session, track, composition, and issue'''. There are 19 tables in the dataset representing various relationships between those entities. Below are brief descriptions for each of the 19 tables (table names in alphabetical order, numbers at the end showing the number of rows in each table):<br />
<br />
*'''composition: '''Compositions, which may be recorded as tracks at sessions. (7,104)<br />
*'''composition_composer: '''Associations between compositions and their composers. (8,622)<br />
* '''composition_lyricist: '''Associations between compositions and their lyricists. (880)<br />
* ''' composition_title: '''Alternative titles for compositions. (811)<br />
*''' issue: '''Releases or issues of tracks e.g. as albums. (545)<br />
*'''issue_leader: '''Associations between releases and their leaders. (610)<br />
*'''issue_overdub_people: '''Associations between overdubbed releases and people working on them. (17)<br />
*'''issue_track: '''Associations between releases and the included tracks. (3,769)<br />
*'''person: '''People associated with musical recordings, including musicians and composers. (5,734)<br />
* '''person_ethnicity: '''Association of people with ethnic descriptions. (70)<br />
*'''person_session_skill: '''Correlation between people, sessions, and skills or instruments. (21,424)<br />
*'''person_skill: '''Correlation between people and primary skills or instruments. (6,450)<br />
*'''person_track_skill: '''Variation from sessions in correlation between people, tracks, and skills or instruments. (7,044)<br />
*'''session: '''Recording sessions, at which one or more musicians produced one or more tracks. (2,711)<br />
* '''session_leader: '''Associations between sessions and their leader(s). (3,123)<br />
* '''skill: '''Skills associated with musical recordings, including instruments played, conducting, composing. (209)<br />
*'''track: '''Tracks laid down at recording sessions by musicians. (15,361)<br />
*'''track_composition: '''Associations between tracks and compositions. (15,672)<br />
*'''track_soloist: '''Associations between tracks and soloists. (630)<br />
<br />
<br />
[[2015:GC15UX:JDISC_Schema]] presents more details about each table.<br />
<br />
==ER Diagram of the Schema==<br />
[[File:jdisc_schema.png|400px]]<br />
<br />
[https://www.music-ir.org/mirex/gc15ux_jdisc/jdisc_schema.pdf Click to see the full version of the diagram (PDF)]<br />
<br />
=Download the Dataset=<br />
# <span style="color:#808080">user agreement signed during download?</span><br />
<br />
=Participating Systems=<br />
''Unlike conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs to their systems to the GC15UX team.''<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in fixed size window: '''1024x768'''. Please test your system for this screen size.<br />
<br />
See the [[#Evaluation Webforms]] below for a better understanding of our E6K-inpsired evaluation system design.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. It is encouraged that you give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| <br />
| <br />
| <br />
|-<br />
|-<br />
| <br />
| <br />
| <br />
|-<br />
|-<br />
| <br />
| <br />
|<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
As written in the name of the Grand Challenge, the evaluation will be user-centered. All systems will be used by a number of human evaluators and be rated by them on several most important criteria in evaluating user experience. <br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria or its descriptions may be slightly changed in the months leading up to the submission deadline, as we test it and work to improve it.''<br />
<br />
Given the GC15UX is all about how users perceive their experiences of the systems, we intend to capture the user perceptions in a minimally intrusive manner and not to burden the users/evaluators with too many questions or required data inputs. The following criteria are grounded on the literature of Human Computer Interaction (HCI) and User Experience (UX), with a careful consideration on striking a balance between being comprehensive and minimizing evaluators' cognitive load. <br />
<br />
Evaluators will rate systems on the following criteria: <br />
<br />
* '''Overall satisfaction''': How would you rate your overall satisfaction with the system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Aesthetics''': How would you rate the visual attractiveness of the system?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Ease of use''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Clarity''': How well does the system communicate what is going on?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Performance''': Does the system work efficiently and without bugs/glitches?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Open Text Feedback''': An open-ended question is provided for evaluators to give feedback if they wish to do so.<br />
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC15UX team will ensure all participating systems will get equal number of evaluators.<br />
<br />
==Tasks for Evaluators==<br />
<br />
The final motivating task will be defined after ISMIR 2015. At this point, it <br />
<br />
Below is what we framed for the GC15UX:Jamendo task to give you an idea to start thinking about possible task definitions. <br />
<br />
===GC15UX:Jamendo Task Definition===<br />
''To motivate the evaluators, a defined yet open task is given to the evaluators:<br />
<br />
''You need to put together a playlist for a particular event (e.g., dinner party at your house, workout session). Try to use the assigned system to make playlists for at least a couple of different events.''<br />
<br />
''The task is to ensure that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' purposes ("music for their own situation"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood and other aspects. This allows great flexibility and virtually unlimited possibility in system or service design. '' <br />
<br />
''Another important consideration in designing the task is the music collection available for this GC15UX: the Jamando collection. Jamando music is not well-known to most users/evaluators, whereas many more commonly seen music information tasks are more or less influenced by users' familiarity to the songs and song popularity. Through this task of "finding (copyright-free) background music for a self-made video", we strive to minimize the need of looking for familiar or popular music.''<br />
<br />
==Evaluation Results==<br />
Statistics of the scores given by all evaluators will be reported: mean, average deviation. Meaningful text comments from the evaluators will also be reported.<br />
<br />
==Evaluation Webforms==<br />
Graders can take as many assignments as they wish in the My Assignments page. They are allowed to go back to the evaluation page anytime by clicking the thumbnail of the submission. <br />
<br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
<br/><br />
To facilitate the evaluators and minimize their burden, the GC15UX team will provide a set of evaluation forms which wrap around the participating systems. As shown in the following image, the evaluation webforms are for scoring the participating systems, with their client interfaces embedded within an iframe in the left side of the webform.<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
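<br />
A minimal sketch of how such a wrapper page might embed a participating system in a 1024x768 iframe; the markup, file name, and URL below are illustrative assumptions, not the actual GC15UX webform:<br />
<pre>
# Writes a toy wrapper page: the participating system in a 1024x768 iframe
# on the left, space for the rating form on the right. Purely illustrative.
SYSTEM_URL = "http://team-a.example"  # hypothetical participant URL

page = f"""<!DOCTYPE html>
<html><body>
  <iframe src="{SYSTEM_URL}" width="1024" height="768"
          style="float:left; border:1px solid #ccc;"></iframe>
  <form style="margin-left:1040px;">
    <!-- rating widgets for the criteria would go here -->
  </form>
</body></html>"""

with open("evaluation_wrapper.html", "w") as f:
    f.write(page)
</pre>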
<br />
=Organization=<br />
<br />
==Important Dates==<br />
This year, GC15UX:JDISC adopts a two-phase model with two rounds of evaluation. The first phase will end by the ISMIR conference, where we will disclose preliminary results. Phase II will then start: participating developers can continue improving their systems based on the feedback from the first phase, and another round of evaluation will be conducted in February. We believe this model serves developers well, since it accords with the iterative nature of user-centered design and also gives developers enough time to build their complete MIR systems.<br />
<br />
*July ?: announce the GC15UX:JDISC<br />
*Sep. 28th: the first deadline for system submission <br />
*Feb. 28th: the second deadline for system submission<br />
<br />
==What to Submit==<br />
<br />
A URL to the participating system.<br />
<br />
==Contacts==<br />
''The GC15UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Christopher R. Maden, University of Illinois<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
:Yun Hao, University of Illinois''<br />
<br />
<br />
Inquiries, suggestions, questions, and comments are all highly welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone in the team.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=2015:GC15UX:JDISC&diff=115462015:GC15UX:JDISC2015-10-21T16:31:15Z<p>Jdownie: /* Tasks for Evaluators */</p>
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2015: User Experience (GC15UX: JDISC)}}<br />
=Purpose=<br />
''Holistic, user-centered evaluation of the user experience in interacting with complete, user-facing music information retrieval (MIR) systems.''<br />
<br />
=Goals=<br />
# ''To inspire the development of complete MIR systems.''<br />
# ''To promote the notion of user experience as a first-class research objective in the MIR community.''<br />
<br />
=About J-DISC=<br />
J-DISC ([http://jdisc.columbia.edu http://jdisc.columbia.edu]) is a resource created by the Center for Jazz Studies at Columbia University for searching and exploring jazz recordings. It is organized to present complete information on jazz recording sessions and to merge a large corpus of session data into a single, easily accessible repository that can be easily searched, cross-searched, navigated, and cited. In addition to the focus on recording artists/leaders of traditional discography, J-DISC incorporates extensive cultural, geographic, biographical, composer, and studio information that can also be easily searched and accessed.<br />
<br />
=Dataset=<br />
J-DISC contains fully structured and searchable metadata. Key entities in the dataset include '''person, skill, session, track, composition, and issue'''. There are 19 tables in the dataset representing various relationships between those entities. Below are brief descriptions of the 19 tables (table names in alphabetical order; the number at the end of each entry is the table's row count):<br />
<br />
*'''composition: '''Compositions, which may be recorded as tracks at sessions. (7,104)<br />
*'''composition_composer: '''Associations between compositions and their composers. (8,622)<br />
* '''composition_lyricist: '''Associations between compositions and their lyricists. (880)<br />
* ''' composition_title: '''Alternative titles for compositions. (811)<br />
*''' issue: '''Releases or issues of tracks e.g. as albums. (545)<br />
*'''issue_leader: '''Associations between releases and their leaders. (610)<br />
*'''issue_overdub_people: '''Associations between overdubbed releases and people working on them. (17)<br />
*'''issue_track: '''Associations between releases and the included tracks. (3,769)<br />
*'''person: '''People associated with musical recordings, including musicians and composers. (5,734)<br />
* '''person_ethnicity: '''Association of people with ethnic descriptions. (70)<br />
*'''person_session_skill: '''Correlation between people, sessions, and skills or instruments. (21,424)<br />
*'''person_skill: '''Correlation between people and primary skills or instruments. (6,450)<br />
*'''person_track_skill: '''Variation from sessions in correlation between people, tracks, and skills or instruments. (7,044)<br />
*'''session: '''Recording sessions, at which one or more musicians produced one or more tracks. (2,711)<br />
* '''session_leader: '''Associations between sessions and their leader(s). (3,123)<br />
* '''skill: '''Skills associated with musical recordings, including instruments played, conducting, composing. (209)<br />
*'''track: '''Tracks laid down at recording sessions by musicians. (15,361)<br />
*'''track_composition: '''Associations between tracks and compositions. (15,672)<br />
*'''track_soloist: '''Associations between tracks and soloists. (630)<br />
<br />
<br />
[[2015:GC15UX:JDISC_Schema]] presents more details about each table.<br />
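<br />
For illustration, the sketch below queries these tables with Python's sqlite3 module, assuming the dataset has been loaded into a SQLite file. The column names and the example artist are assumptions based on the table descriptions above, not the actual J-DISC schema:<br />
<pre>
# Hedged sketch: tracks recorded at sessions led by a given person, joined
# through session_leader. Column names (title, date, *_id) are assumptions.
import sqlite3

conn = sqlite3.connect("jdisc.db")  # hypothetical local copy of the data
rows = conn.execute("""
    SELECT t.title, s.date
    FROM track t
    JOIN session s         ON t.session_id  = s.id
    JOIN session_leader sl ON sl.session_id = s.id
    JOIN person p          ON p.id          = sl.person_id
    WHERE p.name = ?
""", ("Miles Davis",)).fetchall()   # example artist, purely illustrative
for title, date in rows:
    print(title, date)
conn.close()
</pre>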
<br />
==ER Diagram of the Schema==<br />
[[File:jdisc_schema.png|400px]]<br />
<br />
[https://www.music-ir.org/mirex/gc15ux_jdisc/jdisc_schema.pdf Click to see the full version of the diagram (PDF)]<br />
<br />
=Download the Dataset=<br />
# <span style="color:#808080">user agreement signed during download?</span><br />
<br />
=Participating Systems=<br />
''Unlike in conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs of their systems to the GC15UX team.''<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in a fixed-size window: '''1024x768'''. Please test your system at this screen size.<br />
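<br />
One way to sanity-check a system at this size is to drive a browser at 1024x768 and take a screenshot, for example with Selenium (a tooling choice of ours, not a GC15UX requirement):<br />
<pre>
# Loads a hypothetical participating system at the fixed evaluation size and
# saves a screenshot for visual inspection. Requires Firefox + geckodriver.
from selenium import webdriver

driver = webdriver.Firefox()
driver.set_window_size(1024, 768)    # outer window size; viewport is slightly smaller
driver.get("http://team-a.example")  # hypothetical system URL
driver.save_screenshot("layout_1024x768.png")
driver.quit()
</pre>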
<br />
See the [[#Evaluation Webforms]] below for a better understanding of our E6K-inspired evaluation system design.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. You are encouraged to give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| <br />
| <br />
| <br />
|-<br />
|-<br />
| <br />
| <br />
| <br />
|-<br />
|-<br />
| <br />
| <br />
|<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
As the name of the Grand Challenge indicates, the evaluation will be user-centered. All systems will be used by a number of human evaluators and rated by them on several of the most important criteria for evaluating user experience. <br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria or their descriptions may change slightly in the months leading up to the submission deadline, as we test and work to improve them.''<br />
<br />
Given that GC15UX is all about how users perceive their experience of the systems, we intend to capture user perceptions in a minimally intrusive manner and not to burden the users/evaluators with too many questions or required data inputs. The following criteria are grounded in the Human-Computer Interaction (HCI) and User Experience (UX) literature, with careful consideration given to striking a balance between being comprehensive and minimizing evaluators' cognitive load. <br />
<br />
Evaluators will rate systems on the following criteria: <br />
<br />
* '''Overall satisfaction''': How would you rate your overall satisfaction with the system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Aesthetics''': How would you rate the visual attractiveness of the system?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Ease of use''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Clarity''': How well does the system communicate what is going on?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Performance''': Does the system work efficiently and without bugs/glitches?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Open Text Feedback''': An open-ended question is provided for evaluators to give feedback if they wish to do so.<br />
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC15UX team will ensure that all participating systems receive an equal number of evaluators.<br />
<br />
==Tasks for Evaluators==<br />
<br />
The final motivating task will be defined after ISMIR 2015; at this point, it has not yet been fixed. <br />
<br />
Below is the task we framed for GC15UX:Jamendo, to give you an idea and start you thinking about possible task definitions. <br />
<br />
===GC15UX:Jamendo Task Definition===<br />
''To motivate the evaluators, a defined yet open task is given to them:<br />
<br />
''You need to put together a playlist for a particular event (e.g., dinner party at your house, workout session). Try to use the assigned system to make playlists for at least a couple of different events.''<br />
<br />
''The task is designed to ensure that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' purposes ("music for their own situation"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood, and other aspects. This allows great flexibility and virtually unlimited possibilities in system or service design.''<br />
<br />
''Another important consideration in designing the task is the music collection available for this GC15UX: the Jamendo collection. Jamendo music is not well known to most users/evaluators, whereas many common music information tasks are influenced to some degree by users' familiarity with the songs and by song popularity. Through this playlist-building task, we strive to minimize the need to look for familiar or popular music.''<br />
<br />
==Evaluation Results==<br />
Statistics of the scores given by all evaluators will be reported: the mean and the average deviation. Meaningful text comments from the evaluators will also be reported.<br />
<br />
==Evaluation Webforms==<br />
Graders can take as many assignments as they wish on the My Assignments page. They can return to the evaluation page at any time by clicking the thumbnail of the submission. <br />
<br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
<br/><br />
To assist the evaluators and minimize their burden, the GC15UX team will provide a set of evaluation forms that wrap around the participating systems. As shown in the following image, the evaluation webforms are for scoring the participating systems, with their client interfaces embedded within an iframe on the left side of the webform.<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
<br />
=Organization=<br />
<br />
==Important Dates==<br />
This year, GC15UX:JDISC adopts a two-phase model with two rounds of evaluation. The first phase will end by the ISMIR conference, where we will disclose preliminary results. Phase II will then start: participating developers can continue improving their systems based on the feedback from the first phase, and another round of evaluation will be conducted in February. We believe this model serves developers well, since it accords with the iterative nature of user-centered design and also gives developers enough time to build their complete MIR systems.<br />
<br />
*July ?: announce the GC15UX:JDISC<br />
*Sep. 28th: the first deadline for system submission <br />
*Feb. 28th: the second deadline for system submission<br />
<br />
==What to Submit==<br />
<br />
A URL to the participating system.<br />
<br />
==Contacts==<br />
''The GC15UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Christopher R. Maden, University of Illinois<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
:Yun Hao, University of Illinois''<br />
<br />
<br />
Inquiries, suggestions, questions, and comments are all highly welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone in the team.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=2015:GC15UX:JDISC&diff=115452015:GC15UX:JDISC2015-10-21T16:29:34Z<p>Jdownie: /* Tasks for Evaluators */</p>
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2015: User Experience (GC15UX: JDISC)}}<br />
=Purpose=<br />
''Holistic, user-centered evaluation of the user experience in interacting with complete, user-facing music information retrieval (MIR) systems.''<br />
<br />
=Goals=<br />
# ''To inspire the development of complete MIR systems.''<br />
# ''To promote the notion of user experience as a first-class research objective in the MIR community.''<br />
<br />
=About J-DISC=<br />
J-DISC ([http://jdisc.columbia.edu http://jdisc.columbia.edu]) is a resource created by the Center for Jazz Studies at Columbia University for searching and exploring jazz recordings. It is organized to present complete information on jazz recording sessions and to merge a large corpus of session data into a single, easily accessible repository that can be easily searched, cross-searched, navigated, and cited. In addition to the focus on recording artists/leaders of traditional discography, J-DISC incorporates extensive cultural, geographic, biographical, composer, and studio information that can also be easily searched and accessed.<br />
<br />
=Dataset=<br />
J-DISC contains fully structured and searchable metadata. Key entities in the dataset include '''person, skill, session, track, composition, and issue'''. There are 19 tables in the dataset representing various relationships between those entities. Below are brief descriptions of the 19 tables (table names in alphabetical order; the number at the end of each entry is the table's row count):<br />
<br />
*'''composition: '''Compositions, which may be recorded as tracks at sessions. (7,104)<br />
*'''composition_composer: '''Associations between compositions and their composers. (8,622)<br />
* '''composition_lyricist: '''Associations between compositions and their lyricists. (880)<br />
* ''' composition_title: '''Alternative titles for compositions. (811)<br />
*''' issue: '''Releases or issues of tracks e.g. as albums. (545)<br />
*'''issue_leader: '''Associations between releases and their leaders. (610)<br />
*'''issue_overdub_people: '''Associations between overdubbed releases and people working on them. (17)<br />
*'''issue_track: '''Associations between releases and the included tracks. (3,769)<br />
*'''person: '''People associated with musical recordings, including musicians and composers. (5,734)<br />
* '''person_ethnicity: '''Association of people with ethnic descriptions. (70)<br />
*'''person_session_skill: '''Correlation between people, sessions, and skills or instruments. (21,424)<br />
*'''person_skill: '''Correlation between people and primary skills or instruments. (6,450)<br />
*'''person_track_skill: '''Variation from sessions in correlation between people, tracks, and skills or instruments. (7,044)<br />
*'''session: '''Recording sessions, at which one or more musicians produced one or more tracks. (2,711)<br />
* '''session_leader: '''Associations between sessions and their leader(s). (3,123)<br />
* '''skill: '''Skills associated with musical recordings, including instruments played, conducting, composing. (209)<br />
*'''track: '''Tracks laid down at recording sessions by musicians. (15,361)<br />
*'''track_composition: '''Associations between tracks and compositions. (15,672)<br />
*'''track_soloist: '''Associations between tracks and soloists. (630)<br />
<br />
<br />
[[2015:GC15UX:JDISC_Schema]] presents more details about each table.<br />
<br />
==ER Diagram of the Schema==<br />
[[File:jdisc_schema.png|400px]]<br />
<br />
[https://www.music-ir.org/mirex/gc15ux_jdisc/jdisc_schema.pdf Click to see the full version of the diagram (PDF)]<br />
<br />
=Download the Dataset=<br />
# <span style="color:#808080">user agreement signed during download?</span><br />
<br />
=Participating Systems=<br />
''Unlike in conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs of their systems to the GC15UX team.''<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in a fixed-size window: '''1024x768'''. Please test your system at this screen size.<br />
<br />
See the [[#Evaluation Webforms]] below for a better understanding of our E6K-inspired evaluation system design.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. You are encouraged to give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| <br />
| <br />
| <br />
|-<br />
|-<br />
| <br />
| <br />
| <br />
|-<br />
|-<br />
| <br />
| <br />
|<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
As the name of the Grand Challenge indicates, the evaluation will be user-centered. All systems will be used by a number of human evaluators and rated by them on several of the most important criteria for evaluating user experience. <br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria or their descriptions may change slightly in the months leading up to the submission deadline, as we test and work to improve them.''<br />
<br />
Given that GC15UX is all about how users perceive their experience of the systems, we intend to capture user perceptions in a minimally intrusive manner and not to burden the users/evaluators with too many questions or required data inputs. The following criteria are grounded in the Human-Computer Interaction (HCI) and User Experience (UX) literature, with careful consideration given to striking a balance between being comprehensive and minimizing evaluators' cognitive load. <br />
<br />
Evaluators will rate systems on the following criteria: <br />
<br />
* '''Overall satisfaction''': How would you rate your overall satisfaction with the system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Aesthetics''': How would you rate the visual attractiveness of the system?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Ease of use''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Clarity''': How well does the system communicate what is going on?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Performance''': Does the system work efficiently and without bugs/glitches?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Open Text Feedback''': An open-ended question is provided for evaluators to give feedback if they wish to do so.<br />
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC15UX team will ensure that all participating systems receive an equal number of evaluators.<br />
<br />
==Tasks for Evaluators==<br />
<br />
The final motivating task will be defined after ISMIR 2015; at this point, it has not yet been fixed. <br />
<br />
Below is the task we framed for GC15UX:Jamendo, to give you an idea and start you thinking about possible task definitions. <br />
<br />
''To motivate the evaluators, a defined yet open task is given to them:<br />
<br />
''You need to put together a playlist for a particular event (e.g., dinner party at your house, workout session). Try to use the assigned system to make playlists for at least a couple of different events.''<br />
<br />
''The task is designed to ensure that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' purposes ("music for their own situation"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood, and other aspects. This allows great flexibility and virtually unlimited possibilities in system or service design.''<br />
<br />
''Another important consideration in designing the task is the music collection available for this GC15UX: the Jamendo collection. Jamendo music is not well known to most users/evaluators, whereas many common music information tasks are influenced to some degree by users' familiarity with the songs and by song popularity. Through this playlist-building task, we strive to minimize the need to look for familiar or popular music.''<br />
<br />
==Evaluation Results==<br />
Statistics of the scores given by all evaluators will be reported: the mean and the average deviation. Meaningful text comments from the evaluators will also be reported.<br />
<br />
==Evaluation Webforms==<br />
Graders can take as many assignments as they wish on the My Assignments page. They can return to the evaluation page at any time by clicking the thumbnail of the submission. <br />
<br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
<br/><br />
To assist the evaluators and minimize their burden, the GC15UX team will provide a set of evaluation forms that wrap around the participating systems. As shown in the following image, the evaluation webforms are for scoring the participating systems, with their client interfaces embedded within an iframe on the left side of the webform.<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
<br />
=Organization=<br />
<br />
==Important Dates==<br />
This year, GC15UX:JDISC adopts a two-phase model with two rounds of evaluation. The first phase will end by the ISMIR conference, where we will disclose preliminary results. Phase II will then start: participating developers can continue improving their systems based on the feedback from the first phase, and another round of evaluation will be conducted in February. We believe this model serves developers well, since it accords with the iterative nature of user-centered design and also gives developers enough time to build their complete MIR systems.<br />
<br />
*July ?: announce the GC15UX:JDISC<br />
*Sep. 28th: the first deadline for system submission <br />
*Feb. 28th: the second deadline for system submission<br />
<br />
==What to Submit==<br />
<br />
A URL to the participating system.<br />
<br />
==Contacts==<br />
''The GC15UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Christopher R. Maden, University of Illinois<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
:Yun Hao, University of Illinois''<br />
<br />
<br />
Inquiries, suggestions, questions, and comments are all highly welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone in the team.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=User_talk:Jan_Schl%C3%BCter&diff=11251User talk:Jan Schlüter2015-09-11T21:32:09Z<p>Jdownie: Welcome!</p>
<hr />
<div>'''Welcome to ''MIREX Wiki''!'''<br />
We hope you will contribute much and well.<br />
You will probably want to read the [[Help:Contents|help pages]].<br />
Again, welcome and have fun! [[User:Jdownie|Jdownie]] 16:32, 11 September 2015 (CDT)</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=User:Jan_Schl%C3%BCter&diff=11250User:Jan Schlüter2015-09-11T21:32:08Z<p>Jdownie: Creating user page with biography of new user.</p>
<hr />
<div>PhD student at the OFAI (Austrian Research Institute for Artificial Intelligence), researching how to solve different event detection and sequence labeling tasks with semi-supervised and supervised learning, mostly based on neural networks.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=User:Steve_Wang&diff=11231User:Steve Wang2015-08-24T02:27:37Z<p>Jdownie: Creating user page with biography of new user.</p>
<hr />
<div>ACRCloud is a cloud platform that helps companies and developers integrate advanced ACR (Automatic Content Recognition) techniques into their products, giving them the ability to recognize audio and video, monitor radio streams, detect live TV content, and so on.<br />
I hold a master's degree in computer science (majoring in Machine Learning). I now work at ACRCloud, in charge of audio fingerprinting algorithms.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=User_talk:Steve_Wang&diff=11232User talk:Steve Wang2015-08-24T02:27:37Z<p>Jdownie: Welcome!</p>
<hr />
<div>'''Welcome to ''MIREX Wiki''!'''<br />
We hope you will contribute much and well.<br />
You will probably want to read the [[Help:Contents|help pages]].<br />
Again, welcome and have fun! [[User:Jdownie|Jdownie]] 21:27, 23 August 2015 (CDT)</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=User:Yujia_Yan&diff=10899User:Yujia Yan2015-05-04T12:24:19Z<p>Jdownie: Creating user page with biography of new user.</p>
<hr />
<div>Just graduated from the MT program at Georgia Tech, where most of my studies were on MIR. My research focuses on multipitch tracking and score-following systems.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=User_talk:Yujia_Yan&diff=10900User talk:Yujia Yan2015-05-04T12:24:19Z<p>Jdownie: Welcome!</p>
<hr />
<div>'''Welcome to ''MIREX Wiki''!'''<br />
We hope you will contribute much and well.<br />
You will probably want to read the [[Help:Contents|help pages]].<br />
Again, welcome and have fun! [[User:Jdownie|Jdownie]] 07:24, 4 May 2015 (CDT)</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=2014:GC14UX&diff=104372014:GC14UX2014-08-31T15:05:14Z<p>Jdownie: /* Dataset */</p>
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2014: User Experience}}<br />
=Purpose=<br />
Holistic, user-centered evaluation of the user experience in interacting with complete, user-facing music information retrieval (MIR) systems.<br />
<br />
=Goals=<br />
# To inspire the development of complete MIR systems.<br />
# To promote the notion of user experience as a first-class research objective in the MIR community.<br />
<br />
=Dataset=<br />
A set of 10,000 music audio tracks is provided for GC14UX. It is a subset of tracks drawn from the [http://www.jamendo.com/en/welcome Jamendo collection's] CC-BY-licensed works.<br />
<br />
The Jamendo collection contains music in a variety of genres and moods, but is largely unknown to most listeners. This mitigates the possible user-experience bias induced by the differential presence (or absence) of popular or well-known music within the participating systems. <br />
<br />
As of May 20, 2014, the Jamendo collection contains 14,742 tracks with the [http://creativecommons.org/licenses/by/3.0/ CC-BY license]. The CC-BY license allows others to distribute, modify, optimize and use your work as a basis, even commercially, as long as you give credit for the original creation. This is one of the most permissive licenses possible.<br />
<br />
The 10,000 tracks in GC14UX are sampled (with a view to maximizing musical variety) from the CC-BY-licensed portion of the Jamendo collection and made available for participants (system developers) to download and use to build their systems. The set is a randomly chosen subset of the content available at Jamendo that is published under the terms of the Creative Commons Attribution-Non-Commercial-ShareAlike (by-nc-sa) license, where user-supplied data has tagged a track with one or more genre categories. For more details about usage of this dataset, see the LICENSE.txt file contained in the downloaded files.<br />
<br />
The dataset contains the MP3 tracks and the metadata the Jamendo site publishes on the respective items (represented in JSON format), retrieved using the site's API (6th Aug 2014). The dataset is available both zipped up and as a tar-ball (you only need one of these); however, at 60+ GB it is a non-trivially large file to download over the web, so we suggest installing a download-manager browser extension if you do not already have one and using that. In a test using the DownThemAll! extension for Firefox, downloading the dataset between the University of Illinois at Urbana-Champaign and Waikato University in New Zealand took a little under 2 hours.<br />
<br />
You need to register to download the main dataset.<br />
<br />
[https://www.music-ir.org/mirex/gc14ux/ https://www.music-ir.org/mirex/gc14ux/] <br/><br />
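<br />
Once downloaded, the tar-ball can be unpacked with standard tools; the Python sketch below lists the per-track JSON metadata files (the archive file name is an assumption):<br />
<pre>
# Streams the downloaded archive with the standard tarfile module, so the
# 60+ GB file is read once, sequentially, without unpacking everything.
import tarfile

with tarfile.open("gc14ux_dataset.tar.gz", "r:*") as tar:
    for member in tar:
        if member.name.endswith(".json"):
            print(member.name)  # metadata file for one track
</pre>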
<br />
<br />
==Metadata Extracted from JSON Files==<br />
The JSON files retrieved from the Jamendo site contain various metadata:<br />
<br />
#album_id <br />
#album_image <br />
#album_name <br />
#artist_id <br />
#artist_idstr<br />
#artist_name<br />
#audio <br />
#audiodownload<br />
#duration<br />
#id <br />
#license_ccurl <br />
#musicinfo_lang <br />
#musicinfo_speed <br />
#musicinfo_acousticelectric <br />
#musicinfo_vocalinstrumental <br />
#musicinfo_gender <br />
#musicinfo_tags_vartags <br />
#musicinfo_tags_genres <br />
#musicinfo_tags_instruments <br />
#name <br />
#position <br />
#releasedate <br />
#shareurl<br />
#shorturl<br />
<br />
[[2014:GC14UX:JSON Metadata]] presents statistics and plots of selected fields.<br />
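<br />
A short Python sketch of reading one track's metadata follows; the file name and the exact JSON nesting are assumptions, since the flattened field names above may correspond to nested objects in the raw files:<br />
<pre>
# Reads one track's JSON metadata and pulls out a few of the fields listed
# above. E.g. musicinfo_tags_genres may actually live under
# musicinfo -> tags -> genres in the raw JSON; this nesting is our guess.
import json

with open("track_12345.json") as f:  # hypothetical per-track metadata file
    meta = json.load(f)

print(meta.get("artist_name"))
print(meta.get("duration"))          # track length in seconds
print(meta.get("musicinfo", {}).get("tags", {}).get("genres", []))
</pre>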
<br />
=Participating Systems=<br />
Unlike in conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs of their systems to the GC14UX team.<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in a fixed-size window: '''1024x768'''. Please test your system at this screen size. <br />
<br />
See the [[#Evaluation Webforms]] below for a better understanding of our E6K-inspired evaluation system design.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. You are encouraged to give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| The MIR UX Master<br />
| Dr. MIR<br />
| mir@domain.com<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
<br />
As the name of the Grand Challenge indicates, the evaluation will be user-centered. All systems will be used by a number of human evaluators and rated by them on several of the most important criteria for evaluating user experience. <br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria or their descriptions may change slightly in the months leading up to the submission deadline, as we test and work to improve them.''<br />
<br />
Given that GC14UX is all about how users perceive their experience of the systems, we intend to capture user perceptions in a minimally intrusive manner and not to burden the users/evaluators with too many questions or required data inputs. The following criteria are grounded in the Human-Computer Interaction (HCI) and User Experience (UX) literature, with careful consideration given to striking a balance between being comprehensive and minimizing evaluators' cognitive load. <br />
<br />
Evaluators will rate systems on the following criteria: <br />
<br />
* '''Overall satisfaction''': Overall, how pleasurable do you find the experience of using this system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Learnability''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Robustness''': How good is the system’s ability to warn you when you’re about to make a mistake and allow you to recover?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Presentation''': How well does the system communicate what’s going on? (How well do you feel the system informs you of its status? Can you clearly understand the labels and words used in the system? How visible are all of your options and menus when you use this system?)<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Open Text Feedback''': An open-ended question is provided for evaluators to give feedback if they wish to do so.<br />
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC14UX team will ensure that all participating systems receive an equal number of evaluators.<br />
<br />
==Task for evaluators==<br />
<br />
To motivate the evaluators, a defined yet open task is given to them:<br />
<br />
<span style="color:#008000">'''''You are creating a short video about a memorable occasion that happened to you recently, and you need to find some (copyright-free) songs to use as background music.'''''</span><br />
<br />
The task is designed to ensure that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' lives ("a recent, memorable occasion"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood, and other aspects. This allows great flexibility and virtually unlimited possibilities in system design. <br />
<br />
Another important consideration in designing the task is the music collection available for this GC14UX: the Jamendo collection. Jamendo music is not well known to most users/evaluators, whereas many common music information tasks are influenced to some degree by users' familiarity with the songs and by song popularity. Through this task of "finding (copyright-free) background music for a self-made video", we strive to minimize the need to look for familiar or popular music.<br />
<br />
==Evaluation results==<br />
Statistics of the scores given by all evaluators will be reported: the mean and the average deviation. Meaningful text comments from the evaluators will also be reported.<br />
<br />
==Evaluation Webforms==<br />
Graders can take as many assignments as they wish on the My Assignments page. They can return to the evaluation page at any time by clicking the thumbnail of the submission. <br />
<br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
<br/><br />
To assist the evaluators and minimize their burden, the GC14UX team will provide a set of evaluation forms that wrap around the participating systems. As shown in the following image, the evaluation webforms are for scoring the participating systems, with their client interfaces embedded within an iframe on the left side of the webform.<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
<br />
=Organization=<br />
<br />
==Important Dates==<br />
<br />
*July 1: announce the GC<br />
*Sep. 28th: deadline for system submission <br />
*Oct. 5th: start the evaluation<br />
*Oct. 27th: close the evaluation system<br />
*Oct. 29th: announce the results<br />
*Oct. 31st: MIREX and GC session in ISMIR2014<br />
<br />
==What to Submit==<br />
<br />
A URL to the participating system.<br />
<br />
==Contacts==<br />
The GC14UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:Yi-Hsuan (Eric) Yang, Academia Sinica, Taiwan (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
<br />
Inquiries, suggestions, questions, and comments are all highly welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone in the team.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=2014:GC14UX&diff=104362014:GC14UX2014-08-31T15:04:31Z<p>Jdownie: /* Metadata Extracted from JSON Files */</p>
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2014: User Experience}}<br />
=Purpose=<br />
Holistic, user-centered evaluation of the user experience in interacting with complete, user-facing music information retrieval (MIR) systems.<br />
<br />
=Goals=<br />
# To inspire the development of complete MIR systems.<br />
# To promote the notion of user experience as a first-class research objective in the MIR community.<br />
<br />
=Dataset=<br />
A set of 10,000 music audio tracks is provided for GC14UX. It is a subset of tracks drawn from the [http://www.jamendo.com/en/welcome Jamendo collection's] CC-BY-licensed works.<br />
<br />
The Jamendo collection contains music in a variety of genres and moods, but is largely unknown to most listeners. This mitigates the possible user-experience bias induced by the differential presence (or absence) of popular or well-known music within the participating systems. <br />
<br />
As of May 20, 2014, the Jamendo collection contains 14,742 tracks with the [http://creativecommons.org/licenses/by/3.0/ CC-BY license]. The CC-BY license allows others to distribute, modify, optimize and use your work as a basis, even commercially, as long as you give credit for the original creation. This is one of the most permissive licenses possible.<br />
<br />
The 10,000 tracks in GC14UX are sampled (with a view to maximizing musical variety) from the CC-BY-licensed portion of the Jamendo collection and made available for participants (system developers) to download and use to build their systems. The set is a randomly chosen subset of the content available at Jamendo that is published under the terms of the Creative Commons Attribution-Non-Commercial-ShareAlike (by-nc-sa) license, where user-supplied data has tagged a track with one or more genre categories. For more details about usage of this dataset, see the LICENSE.txt file contained in the downloaded files.<br />
<br />
The dataset contains the MP3 tracks and the metadata the Jamendo site publishes on the respective items (represented in JSON format), retrieved using the site's API (6th Aug 2014). The dataset is available both zipped up and as a tar-ball (you only need one of these); however, at 60+ GB it is a non-trivially large file to download over the web, so we suggest installing a download-manager browser extension if you do not already have one and using that. In a test using the DownThemAll! extension for Firefox, downloading the dataset between the University of Illinois at Urbana-Champaign and Waikato University in New Zealand took a little under 2 hours.<br />
<br />
You need to register to download the main dataset.<br />
<br />
[https://www.music-ir.org/mirex/gc14ux/] <br/><br />
<br />
<br />
==Metadata Extracted from JSON Files==<br />
The JSON files retrieved from the Jamendo site contain various metadata:<br />
<br />
#album_id <br />
#album_image <br />
#album_name <br />
#artist_id <br />
#artist_idstr<br />
#artist_name<br />
#audio <br />
#audiodownload<br />
#duration<br />
#id <br />
#license_ccurl <br />
#musicinfo_lang <br />
#musicinfo_speed <br />
#musicinfo_acousticelectric <br />
#musicinfo_vocalinstrumental <br />
#musicinfo_gender <br />
#musicinfo_tags_vartags <br />
#musicinfo_tags_genres <br />
#musicinfo_tags_instruments <br />
#name <br />
#position <br />
#releasedate <br />
#shareurl<br />
#shorturl<br />
<br />
[[2014:GC14UX:JSON Metadata]] presents statistics and plots of selected fields.<br />
<br />
=Participating Systems=<br />
Unlike in conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs of their systems to the GC14UX team.<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in a fixed-size window: '''1024x768'''. Please test your system at this screen size. <br />
<br />
See the [[#Evaluation Webforms]] below for a better understanding of our E6K-inspired evaluation system design.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. You are encouraged to give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| The MIR UX Master<br />
| Dr. MIR<br />
| mir@domain.com<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
<br />
As the name of the Grand Challenge indicates, the evaluation will be user-centered. All systems will be used by a number of human evaluators and rated by them on several of the most important criteria for evaluating user experience. <br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria or their descriptions may change slightly in the months leading up to the submission deadline, as we test and work to improve them.''<br />
<br />
Given that GC14UX is all about how users perceive their experience of the systems, we intend to capture user perceptions in a minimally intrusive manner and not to burden the users/evaluators with too many questions or required data inputs. The following criteria are grounded in the Human-Computer Interaction (HCI) and User Experience (UX) literature, with careful consideration given to striking a balance between being comprehensive and minimizing evaluators' cognitive load. <br />
<br />
Evaluators will rate systems on the following criteria: <br />
<br />
* '''Overall satisfaction''': Overall, how pleasurable do you find the experience of using this system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Learnability''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Robustness''': How good is the system’s ability to warn you when you’re about to make a mistake and allow you to recover?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Presentation''': How well does the system communicate what’s going on? (How well do you feel the system informs you of its status? Can you clearly understand the labels and words used in the system? How visible are all of your options and menus when you use this system?)<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Open Text Feedback''': An open-ended question is provided for evaluators to give feedback if they wish to do so.<br />
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC14UX team will ensure that all participating systems receive an equal number of evaluators.<br />
<br />
==Task for evaluators==<br />
<br />
To motivate the evaluators, a defined yet open task is given to them:<br />
<br />
<span style="color:#008000">'''''You are creating a short video about a memorable occasion that happened to you recently, and you need to find some (copyright-free) songs to use as background music.'''''</span><br />
<br />
The task is designed to ensure that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' lives ("a recent, memorable occasion"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood, and other aspects. This allows great flexibility and virtually unlimited possibilities in system design. <br />
<br />
Another important consideration in designing the task is the music collection available for this GC14UX: the Jamendo collection. Jamendo music is not well known to most users/evaluators, whereas many common music information tasks are influenced to some degree by users' familiarity with the songs and by song popularity. Through this task of "finding (copyright-free) background music for a self-made video", we strive to minimize the need to look for familiar or popular music.<br />
<br />
==Evaluation results==<br />
Statistics of the scores given by all evaluators will be reported: mean, average deviation. Meaningful text comments from the evaluators will also be reported.<br />
<br />
==Evaluation Webforms==<br />
Graders can take as many assignments as they wish on the My Assignments page. They can return to the evaluation page at any time by clicking the thumbnail of the submission. <br />
<br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
<br/><br />
To assist the evaluators and minimize their burden, the GC14UX team will provide a set of evaluation forms that wrap around the participating systems. As shown in the following image, the evaluation webforms are for scoring the participating systems, with their client interfaces embedded within an iframe on the left side of the webform.<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
<br />
=Organization=<br />
<br />
==Important Dates==<br />
<br />
*July 1: announce the GC<br />
*Sep. 28th: deadline for system submission <br />
*Oct. 5th: start the evaluation<br />
*Oct. 27th: close the evaluation system<br />
*Oct. 29th: announce the results<br />
*Oct. 31st: MIREX and GC session in ISMIR2014<br />
<br />
==What to Submit==<br />
<br />
A URL to the participating system.<br />
<br />
==Contacts==<br />
The GC14UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:Yi-Hsuan (Eric) Yang, Academia Sinica, Taiwan (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
<br />
Inquiries, suggestions, questions, and comments are all highly welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone in the team.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=2014:GC14UX&diff=104352014:GC14UX2014-08-31T15:01:04Z<p>Jdownie: /* Dataset */</p>
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2014: User Experience}}<br />
=Purpose=<br />
Holistic, user-centered evaluation of the user experience in interacting with complete, user-facing music information retrieval (MIR) systems.<br />
<br />
=Goals=<br />
# To inspire the development of complete MIR systems.<br />
# To promote the notion of user experience as a first-class research objective in the MIR community.<br />
<br />
=Dataset=<br />
A set of 10,000 music audio tracks is provided for GC14UX. It is a subset of tracks drawn from the [http://www.jamendo.com/en/welcome Jamendo collection's] CC-BY-licensed works.<br />
<br />
The Jamendo collection contains music in a variety of genres and moods, but is largely unknown to most listeners. This mitigates the possible user-experience bias induced by the differential presence (or absence) of popular or well-known music within the participating systems. <br />
<br />
As of May 20, 2014, the Jamendo collection contains 14,742 tracks with the [http://creativecommons.org/licenses/by/3.0/ CC-BY license]. The CC-BY license allows others to distribute, modify, optimize and use your work as a basis, even commercially, as long as you give credit for the original creation. This is one of the most permissive licenses possible.<br />
<br />
The 10,000 tracks in GC14UX are sampled (with a view to maximizing musical variety) from the CC-BY-licensed portion of the Jamendo collection and made available for participants (system developers) to download and use to build their systems. The set is a randomly chosen subset of the content available at Jamendo that is published under the terms of the Creative Commons Attribution-Non-Commercial-ShareAlike (by-nc-sa) license, where user-supplied data has tagged a track with one or more genre categories. For more details about usage of this dataset, see the LICENSE.txt file contained in the downloaded files.<br />
<br />
The dataset contains the MP3 tracks and the metadata the Jamendo site publishes on the respective items (represented in JSON format), retrieved using the site's API (6th Aug 2014). The dataset is available both zipped up and as a tar-ball (you only need one of these); however, at 60+ GB it is a non-trivially large file to download over the web, so we suggest installing a download-manager browser extension if you do not already have one and using that. In a test using the DownThemAll! extension for Firefox, downloading the dataset between the University of Illinois at Urbana-Champaign and Waikato University in New Zealand took a little under 2 hours.<br />
<br />
You need to register to download the main dataset.<br />
<br />
[https://www.music-ir.org/mirex/gc14ux/] <br/><br />
<br />
<br />
==Metadata Extracted from JSON Files==<br />
The JSON files retrieved from the Jamendo site contain various metadata:<br />
<br />
#album_id<br />
#album_image<br />
#album_name<br />
#artist_id<br />
#artist_idstr<br />
#artist_name<br />
#audio<br />
#audiodownload<br />
#duration<br />
#id<br />
#license_ccurl<br />
#musicinfo_lang<br />
#musicinfo_speed<br />
#musicinfo_acousticelectric<br />
#musicinfo_vocalinstrumental<br />
#musicinfo_gender<br />
#musicinfo_tags_vartags<br />
#musicinfo_tags_genres<br />
#musicinfo_tags_instruments<br />
#name<br />
#position<br />
#releasedate<br />
#shareurl<br />
#shorturl<br />
<br />
[[2014:GC14UX:JSON Metadata]] presents statistics and plots of selected fields.<br />
<br />
=Participating Systems=<br />
Unlike in conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs of their systems to the GC14UX team.<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in a fixed-size window: '''1024x768'''. Please test your system at this screen size. <br />
<br />
See the [[#Evaluation Webforms]] below for a better understanding of our E6K-inspired evaluation system design.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. You are encouraged to give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| The MIR UX Master<br />
| Dr. MIR<br />
| mir@domain.com<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
<br />
As the name of the Grand Challenge indicates, the evaluation will be user-centered. All systems will be used by a number of human evaluators and rated by them on several of the most important criteria for evaluating user experience. <br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria or their descriptions may change slightly in the months leading up to the submission deadline, as we test and work to improve them.''<br />
<br />
Given that GC14UX is all about how users perceive their experience of the systems, we intend to capture user perceptions in a minimally intrusive manner and not to burden the users/evaluators with too many questions or required data inputs. The following criteria are grounded in the Human-Computer Interaction (HCI) and User Experience (UX) literature, with careful consideration given to striking a balance between being comprehensive and minimizing evaluators' cognitive load. <br />
<br />
Evaluators will rate systems on the following criteria: <br />
<br />
* '''Overall satisfaction''': Overall, how pleasurable do you find the experience of using this system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Learnability''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Robustness''': How good is the system’s ability to warn you when you’re about to make a mistake and allow you to recover?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Presentation''': How well does the system communicate what’s going on? (How well do you feel the system informs you of its status? Can you clearly understand the labels and words used in the system? How visible are all of your options and menus when you use this system?)<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Open Text Feedback''': An open-ended question is provided for evaluators to give feedback if they wish to do so.<br />
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC14UX team will ensure that all participating systems receive an equal number of evaluators.<br />
<br />
==Task for evaluators==<br />
<br />
To motivate the evaluators, a defined yet open task is given to them:<br />
<br />
<span style="color:#008000">'''''You are creating a short video about a memorable occasion that happened to you recently, and you need to find some (copyright-free) songs to use as background music.'''''</span><br />
<br />
The task is designed to ensure that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' lives ("a recent, memorable occasion"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood, and other aspects. This allows great flexibility and virtually unlimited possibilities in system design. <br />
<br />
Another important consideration in designing the task is the music collection available for this GC14UX: the Jamando collection. Jamando music is not well-known to most users/evaluators, whereas many more commonly seen music information tasks are more or less influenced by users' familiarity to the songs and song popularity. Through this task of "finding (copyright-free) background music for a self-made video", we strive to minimize the need of looking for familiar or popular music.<br />
<br />
==Evaluation results==<br />
Statistics of the scores given by all evaluators will be reported (mean and average deviation). Meaningful text comments from the evaluators will also be reported.<br />
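<br />
As a concrete illustration of the reported statistics, the sketch below computes the mean and the average absolute deviation of a set of 1-7 scores. This is an assumption about the exact statistics used, not the official reporting script.<br />
<pre>
# Minimal sketch: mean and average (absolute) deviation of the 1-7
# scores one system received on a single criterion.
def mean(xs):
    return sum(xs) / len(xs)

def avg_deviation(xs):
    m = mean(xs)
    return sum(abs(x - m) for x in xs) / len(xs)

scores = [5, 6, 4, 7, 5]  # hypothetical ratings for one system
print(mean(scores), avg_deviation(scores))  # -> 5.4 0.88
</pre>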
<br />
==Evaluation Webforms==<br />
Graders can take as many assignments as they wish on the My Assignments page. They can return to the evaluation page at any time by clicking the thumbnail of the submission. <br />
<br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
<br/><br />
To assist the evaluators and minimize their burden, the GC14UX team will provide a set of evaluation forms that wrap around the participating systems. As shown in the following image, the evaluation webforms are for scoring the participating systems, with their client interfaces embedded within an iframe on the left side of the webform.<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
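<br />
The wrapping idea can be sketched as follows. This is a hypothetical illustration only: the helper function and the /rate endpoint are assumptions, not part of the actual GC14UX evaluation system. It simply shows a submitted system URL embedded in a fixed 1024x768 iframe beside a rating form.<br />
<pre>
# Hypothetical sketch of an evaluation webform wrapping a submitted
# system in a fixed-size iframe (1024x768) next to the rating form.
WRAPPER = """<!DOCTYPE html>
<html><body>
  <iframe src="{url}" width="1024" height="768"
          style="float:left; border:1px solid #ccc;"></iframe>
  <form action="/rate" method="post">
    <!-- one 7-point scale per criterion goes here -->
  </form>
</body></html>"""

def render_evaluation_page(system_url):
    """Return the HTML for one evaluation webform."""
    return WRAPPER.format(url=system_url)

print(render_evaluation_page("http://example.org/mir-system"))
</pre>
Fixing the iframe to 1024x768 is also what makes the request above, that participants test their systems at that screen size, concrete.<br />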
<br />
=Organization=<br />
<br />
==Important Dates==<br />
<br />
*July 1: announce the GC<br />
*Sep. 28th: deadline for system submission <br />
*Oct. 5th: start the evaluation<br />
*Oct. 27th: close the evaluation system<br />
*Oct. 29th: announce the results<br />
*Oct. 31st: MIREX and GC session in ISMIR2014<br />
<br />
==What to Submit==<br />
<br />
A URL to the participating system.<br />
<br />
==Contacts==<br />
The GC14UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:Yi-Hsuan (Eric) Yang, Academia Sinica, Taiwan (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
<br />
Inquiries, suggestions, questions, comments are all highly welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone in the team.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=2014:GC14UX&diff=102352014:GC14UX2014-07-01T21:40:30Z<p>Jdownie: /* Task for evaluators */</p>
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2014: User Experience}}<br />
=Purpose=<br />
Holistic, user-centered evaluation of the user experience in interacting with complete, user-facing music information retrieval (MIR) systems.<br />
<br />
=Goals=<br />
# To inspire the development of complete MIR systems.<br />
# To promote the notion of user experience as a first-class research objective in the MIR community.<br />
<br />
=Dataset=<br />
A set of 10,000 music audio tracks is provided for the GC14UX. It will be a subset of tracks drawn from the [http://www.jamendo.com/en/welcome Jamendo collection's] CC-BY licensed works.<br />
<br />
The Jamendo collection contains music in a variety of genres and moods, but is largely unknown to most listeners. This will mitigate the possible user experience bias induced by the differential presence (or absence) of popular or known music within the participating systems. <br />
<br />
As of May 20, 2014, the Jamendo collection contains 14,742 tracks with the [http://creativecommons.org/licenses/by/3.0/ CC-BY license]. The CC-BY license allows others to distribute, remix, adapt, and build upon your work, even commercially, as long as you give credit for the original creation. This is one of the most permissive licenses possible.<br />
<br />
The 10,000 GC14UX tracks will be sampled (with respect to maximizing musical variety) from the CC-BY-licensed portion of the Jamendo collection and made available for participants (system developers) to download and use to build their systems. <br />
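<br />
One plausible way to sample "with respect to maximizing musical variety" is to interleave tracks across genres so that no single genre dominates the sample. The sketch below is a hypothetical illustration; the field name track["genre"] and the round-robin strategy are assumptions, not the actual GC14UX sampling procedure.<br />
<pre>
# Hypothetical variety-maximizing sampler: round-robin over genres.
import itertools
import random
from collections import defaultdict

def sample_varied(tracks, n=10000, seed=14):
    by_genre = defaultdict(list)
    for t in tracks:
        by_genre[t["genre"]].append(t)   # assumed metadata field
    rng = random.Random(seed)
    for bucket in by_genre.values():
        rng.shuffle(bucket)
    # Interleave one track per genre at a time, then cut to n.
    interleaved = itertools.chain.from_iterable(
        itertools.zip_longest(*by_genre.values()))
    return [t for t in interleaved if t is not None][:n]
</pre>
Round-robin interleaving is only one simple proxy for variety; the actual sampling may well weight other metadata, such as mood or instrumentation.<br />
<br />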
=Participating Systems=<br />
Unlike conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs to their systems to the GC14UX team.<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in a fixed-size window: '''1024x768'''. Please test your system at this screen size. <br />
<br />
See the [[#Evaluation Webforms]] below for a better understanding of our E6K-inspired evaluation system design.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. You are encouraged to give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| The MIR UX Master<br />
| Dr. MIR<br />
| mir@domain.com<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
<br />
As the name of the Grand Challenge suggests, the evaluation will be user-centered. All systems will be used by a number of human evaluators and rated by them on several of the most important criteria for evaluating user experience. <br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria or their descriptions may change slightly in the months leading up to the submission deadline, as we test them and work to improve them.''<br />
<br />
Given that the GC14UX is all about how users perceive their experiences of the systems, we intend to capture user perceptions in a minimally intrusive manner, without burdening the users/evaluators with too many questions or required data inputs. The following criteria are grounded in the Human-Computer Interaction (HCI) and User Experience (UX) literature, with careful consideration given to striking a balance between being comprehensive and minimizing evaluators' cognitive load. <br />
<br />
Evaluators will rate systems on the following criteria: <br />
<br />
* '''Overall satisfaction''': Overall, how pleasurable do you find the experience of using this system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Learnability''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Robustness''': How good is the system’s ability to warn you when you’re about to make a mistake and allow you to recover?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Presentation''': How well does the system communicate what’s going on? (How well do you feel the system informs you of its status? Can you clearly understand the labels and words used in the system? How visible are all of your options and menus when you use this system?)<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Open Text Feedback''': An open-ended question is provided for evaluators to give feedback if they wish to do so.<br />
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC14UX team will ensure that all participating systems get an equal number of evaluators.<br />
<br />
==Task for evaluators==<br />
<br />
To motivate the evaluators, they are given a defined yet open task:<br />
<br />
<span style="color:#008000">'''''You are creating a short video about a memorable occasion that happened to you recently, and you need to find some (copyright-free) songs to use as background music.'''''</span><br />
<br />
The task is to ensure that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' lives ("a recent, memorable occasion"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood and other aspects. This allows great flexibility and virtually unlimited possibilities in system design. <br />
<br />
Another important consideration in designing the task is the music collection available for this GC14UX: the Jamendo collection. Jamendo music is not well known to most users/evaluators, whereas many of the more common music information tasks are influenced to some degree by users' familiarity with the songs and by song popularity. Through this task of "finding (copyright-free) background music for a self-made video", we strive to minimize the need to look for familiar or popular music.<br />
<br />
==Evaluation results==<br />
Statistics of the scores given by all evaluators will be reported (mean and average deviation). Meaningful text comments from the evaluators will also be reported.<br />
<br />
==Evaluation Webforms==<br />
To assist the evaluators and minimize their burden, the GC14UX team will provide a set of evaluation forms that wrap around the participating systems. As shown in the following image, the evaluation webforms are for scoring the participating systems, with their client interfaces embedded within an iframe on the left side of the webform.<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
<br/><br />
Users can take as many submissions as they want on the My Assignments page. They can also return to the evaluation page at any time by clicking the thumbnail of the submission. <br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
<br />
=Organization=<br />
<br />
==Important Dates==<br />
<br />
*July 1: announce the GC<br />
*Sep. 21st: deadline for system submission <br />
*Sep. 28th: start the evaluation<br />
*Oct. 20th: close the evaluation system<br />
*Oct. 27th: announce the results<br />
*Oct. 31st: MIREX and GC session in ISMIR2014<br />
<br />
==What to Submit==<br />
<br />
A URL to the participating system.<br />
<br />
==Contacts==<br />
The GC14UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:Yi-Hsuan (Eric) Yang, Academia Sinica, Taiwan (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
<br />
Inquiries, suggestions, questions, comments are all highly welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone in the team.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=2014:GC14UX&diff=102342014:GC14UX2014-07-01T21:37:07Z<p>Jdownie: /* Criteria */</p>
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2014: User Experience}}<br />
=Purpose=<br />
Holistic, user-centered evaluation of the user experience in interacting with complete, user-facing music information retrieval (MIR) systems.<br />
<br />
=Goals=<br />
# To inspire the development of complete MIR systems.<br />
# To promote the notion of user experience as a first-class research objective in the MIR community.<br />
<br />
=Dataset=<br />
A set of 10,000 music audio tracks is provided for the GC14UX. It will be a subset of tracks drawn from the [http://www.jamendo.com/en/welcome Jamendo collection's] CC-BY licensed works.<br />
<br />
The Jamendo collection contains music in a variety of genres and moods, but is largely unknown to most listeners. This will mitigate the possible user experience bias induced by the differential presence (or absence) of popular or known music within the participating systems. <br />
<br />
As of May 20, 2014, the Jamendo collection contains 14,742 tracks with the [http://creativecommons.org/licenses/by/3.0/ CC-BY license]. The CC-BY license allows others to distribute, remix, adapt, and build upon your work, even commercially, as long as you give credit for the original creation. This is one of the most permissive licenses possible.<br />
<br />
The 10,000 GC14UX tracks will be sampled (with respect to maximizing musical variety) from the CC-BY-licensed portion of the Jamendo collection and made available for participants (system developers) to download and use to build their systems. <br />
<br />
=Participating Systems=<br />
Unlike conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs to their systems to the GC14UX team.<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in a fixed-size window: '''1024x768'''. Please test your system at this screen size. <br />
<br />
See the [[#Evaluation Webforms]] below for a better understanding of our E6K-inspired evaluation system design.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. You are encouraged to give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| The MIR UX Master<br />
| Dr. MIR<br />
| mir@domain.com<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
<br />
As the name of the Grand Challenge suggests, the evaluation will be user-centered. All systems will be used by a number of human evaluators and rated by them on several of the most important criteria for evaluating user experience. <br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria or their descriptions may change slightly in the months leading up to the submission deadline, as we test them and work to improve them.''<br />
<br />
Given that the GC14UX is all about how users perceive their experiences of the systems, we intend to capture user perceptions in a minimally intrusive manner, without burdening the users/evaluators with too many questions or required data inputs. The following criteria are grounded in the Human-Computer Interaction (HCI) and User Experience (UX) literature, with careful consideration given to striking a balance between being comprehensive and minimizing evaluators' cognitive load. <br />
<br />
Evaluators will rate systems on the following criteria: <br />
<br />
* '''Overall satisfaction''': Overall, how pleasurable do you find the experience of using this system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Learnability''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Robustness''': How good is the system’s ability to warn you when you’re about to make a mistake and allow you to recover?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Presentation''': How well does the system communicate what’s going on? (How well do you feel the system informs you of its status? Can you clearly understand the labels and words used in the system? How visible are all of your options and menus when you use this system?)<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Open Text Feedback''': An open-ended question is provided for evaluators to give feedback if they wish to do so.<br />
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC14UX team will ensure that all participating systems get an equal number of evaluators.<br />
<br />
==Task for evaluators==<br />
<br />
To motivate the evaluators, they are given a defined yet open task:<br />
<br />
''You are creating a short video about a memorable occasion that happened to you recently, and you need to find some (copyright-free) songs to use as background music.''<br />
<br />
The task is to ensure that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' lives ("a recent, memorable occasion"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood and other aspects. This allows great flexibility and virtually unlimited possibilities in system design. <br />
<br />
Another important consideration in designing the task is the music collection available for this GC14UX: the Jamendo collection. Jamendo music is not well known to most users/evaluators, whereas many of the more common music information tasks are influenced to some degree by users' familiarity with the songs and by song popularity. Through this task of "finding (copyright-free) background music for a self-made video", we strive to minimize the need to look for familiar or popular music.<br />
<br />
==Evaluation results==<br />
Statistics of the scores given by all evaluators will be reported (mean and average deviation). Meaningful text comments from the evaluators will also be reported.<br />
<br />
==Evaluation Webforms==<br />
To assist the evaluators and minimize their burden, the GC14UX team will provide a set of evaluation forms that wrap around the participating systems. As shown in the following image, the evaluation webforms are for scoring the participating systems, with their client interfaces embedded within an iframe on the left side of the webform.<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
<br/><br />
Users can take as many submissions as they want on the My Assignments page. They can also return to the evaluation page at any time by clicking the thumbnail of the submission. <br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
<br />
=Organization=<br />
<br />
==Important Dates==<br />
<br />
*July 1: announce the GC<br />
*Sep. 21st: deadline for system submission <br />
*Sep. 28th: start the evaluation<br />
*Oct. 20th: close the evaluation system<br />
*Oct. 27th: announce the results<br />
*Oct. 31st: MIREX and GC session in ISMIR2014<br />
<br />
==What to Submit==<br />
<br />
A URL to the participating system.<br />
<br />
==Contacts==<br />
The GC14UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:Yi-Hsuan (Eric) Yang, Academia Sinica, Taiwan (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
<br />
Inquiries, suggestions, questions, comments are all highly welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone in the team.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=2014:GC14UX&diff=102332014:GC14UX2014-07-01T21:35:11Z<p>Jdownie: /* Participating Systems */</p>
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2014: User Experience}}<br />
=Purpose=<br />
Holistic, user-centered evaluation of the user experience in interacting with complete, user-facing music information retrieval (MIR) systems.<br />
<br />
=Goals=<br />
# To inspire the development of complete MIR systems.<br />
# To promote the notion of user experience as a first-class research objective in the MIR community.<br />
<br />
=Dataset=<br />
A set of 10,000 music audio tracks is provided for the GC14UX. It will be a subset of tracks drawn from the [http://www.jamendo.com/en/welcome Jamendo collection's] CC-BY licensed works.<br />
<br />
The Jamendo collection contains music in a variety of genres and moods, but is largely unknown to most listeners. This will mitigate the possible user experience bias induced by the differential presence (or absence) of popular or known music within the participating systems. <br />
<br />
As of May 20, 2014, the Jamendo collection contains 14,742 tracks with the [http://creativecommons.org/licenses/by/3.0/ CC-BY license]. The CC-BY license allows others to distribute, remix, adapt, and build upon your work, even commercially, as long as you give credit for the original creation. This is one of the most permissive licenses possible.<br />
<br />
The 10,000 GC14UX tracks will be sampled (with respect to maximizing musical variety) from the CC-BY-licensed portion of the Jamendo collection and made available for participants (system developers) to download and use to build their systems. <br />
<br />
=Participating Systems=<br />
Unlike conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs to their systems to the GC14UX team.<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in a fixed-size window: '''1024x768'''. Please test your system at this screen size. <br />
<br />
See the [[#Evaluation Webforms]] below for a better understanding of our E6K-inspired evaluation system design.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. You are encouraged to give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| The MIR UX Master<br />
| Dr. MIR<br />
| mir@domain.com<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
<br />
As the name of the Grand Challenge suggests, the evaluation will be user-centered. All systems will be used by a number of human evaluators and rated by them on several of the most important criteria for evaluating user experience. <br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria or their descriptions may change slightly in the months leading up to the submission deadline, as we test them and work to improve them.''<br />
<br />
Given that the GC14UX is all about how users perceive their experiences of the systems, we intend to capture user perceptions in a minimally intrusive manner, without burdening the users/evaluators with too many questions or required data inputs. The following criteria are grounded in the Human-Computer Interaction (HCI) and User Experience (UX) literature, with careful consideration given to striking a balance between being comprehensive and minimizing evaluators' cognitive load. <br />
<br />
Evaluators will rate systems on the following criteria: <br />
<br />
* '''Overall satisfaction''': Overall, how pleasurable do you find the experience of using this system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Learnability''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Robustness''': How good is the system’s ability to warn you when you’re about to make a mistake and allow you to recover?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Presentation''': How well does the system communicate what’s going on? (How well do you feel the system informs you of its status? Can you clearly understand the labels and words used in the system? How visible are all of your options and menus when you use this system?)<br />
<br />
* '''Open Text Feedback''': An open-ended question is provided for evaluators to give feedback if they wish to do so.<br />
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC14UX team will ensure that all participating systems get an equal number of evaluators.<br />
<br />
==Task for evaluators==<br />
<br />
To motivate the evaluators, they are given a defined yet open task:<br />
<br />
''You are creating a short video about a memorable occasion that happened to you recently, and you need to find some (copyright-free) songs to use as background music.''<br />
<br />
The task is to ensure that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' lives ("a recent, memorable occasion"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood and other aspects. This allows great flexibility and virtually unlimited possibilities in system design. <br />
<br />
Another important consideration in designing the task is the music collection available for this GC14UX: the Jamendo collection. Jamendo music is not well known to most users/evaluators, whereas many of the more common music information tasks are influenced to some degree by users' familiarity with the songs and by song popularity. Through this task of "finding (copyright-free) background music for a self-made video", we strive to minimize the need to look for familiar or popular music.<br />
<br />
==Evaluation results==<br />
Statistics of the scores given by all evaluators will be reported (mean and average deviation). Meaningful text comments from the evaluators will also be reported.<br />
<br />
==Evaluation Webforms==<br />
To assist the evaluators and minimize their burden, the GC14UX team will provide a set of evaluation forms that wrap around the participating systems. As shown in the following image, the evaluation webforms are for scoring the participating systems, with their client interfaces embedded within an iframe on the left side of the webform.<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
<br/><br />
Users can take as many submissions as they want on the My Assignments page. They can also return to the evaluation page at any time by clicking the thumbnail of the submission. <br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
<br />
=Organization=<br />
<br />
==Important Dates==<br />
<br />
*July 1: announce the GC<br />
*Sep. 21st: deadline for system submission <br />
*Sep. 28th: start the evaluation<br />
*Oct. 20th: close the evaluation system<br />
*Oct. 27th: announce the results<br />
*Oct. 31st: MIREX and GC session in ISMIR2014<br />
<br />
==What to Submit==<br />
<br />
A URL to the participating system.<br />
<br />
==Contacts==<br />
The GC14UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:Yi-Hsuan (Eric) Yang, Academia Sinica, Taiwan (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
<br />
Inquiries, suggestions, questions, comments are all highly welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone in the team.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=2014:GC14UX&diff=102322014:GC14UX2014-07-01T21:33:23Z<p>Jdownie: /* Criteria */</p>
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2014: User Experience}}<br />
=Purpose=<br />
Holistic, user-centered evaluation of the user experience in interacting with complete, user-facing music information retrieval (MIR) systems.<br />
<br />
=Goals=<br />
# To inspire the development of complete MIR systems.<br />
# To promote the notion of user experience as a first-class research objective in the MIR community.<br />
<br />
=Dataset=<br />
A set of 10,000 music audio tracks is provided for the GC14UX. It will be a subset of tracks drawn from the [http://www.jamendo.com/en/welcome Jamendo collection's] CC-BY licensed works.<br />
<br />
The Jamendo collection contains music in a variety of genres and moods, but is largely unknown to most listeners. This will mitigate the possible user experience bias induced by the differential presence (or absence) of popular or known music within the participating systems. <br />
<br />
As of May 20, 2014, the Jamendo collection contains 14,742 tracks with the [http://creativecommons.org/licenses/by/3.0/ CC-BY license]. The CC-BY license allows others to distribute, remix, adapt, and build upon your work, even commercially, as long as you give credit for the original creation. This is one of the most permissive licenses possible.<br />
<br />
The 10,000 GC14UX tracks will be sampled (with respect to maximizing musical variety) from the CC-BY-licensed portion of the Jamendo collection and made available for participants (system developers) to download and use to build their systems. <br />
<br />
=Participating Systems=<br />
Unlike conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs to their systems to the GC14UX team.<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in a fixed-size window: '''1024x768'''. Please test your system at this screen size.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. You are encouraged to give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| The MIR UX Master<br />
| Dr. MIR<br />
| mir@domain.com<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
<br />
As the name of the Grand Challenge suggests, the evaluation will be user-centered. All systems will be used by a number of human evaluators and rated by them on several of the most important criteria for evaluating user experience. <br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria or their descriptions may change slightly in the months leading up to the submission deadline, as we test them and work to improve them.''<br />
<br />
Given that the GC14UX is all about how users perceive their experiences of the systems, we intend to capture user perceptions in a minimally intrusive manner, without burdening the users/evaluators with too many questions or required data inputs. The following criteria are grounded in the Human-Computer Interaction (HCI) and User Experience (UX) literature, with careful consideration given to striking a balance between being comprehensive and minimizing evaluators' cognitive load. <br />
<br />
Evaluators will rate systems on the following criteria: <br />
<br />
* '''Overall satisfaction''': Overall, how pleasurable do you find the experience of using this system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Learnability''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Robustness''': How good is the system’s ability to warn you when you’re about to make a mistake and allow you to recover?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Presentation''': How well does the system communicate what’s going on? (How well do you feel the system informs you of its status? Can you clearly understand the labels and words used in the system? How visible are all of your options and menus when you use this system?)<br />
<br />
* '''Open Text Feedback''': An open-ended question is provided for evaluators to give feedback if they wish to do so.<br />
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC14UX team will ensure that all participating systems get an equal number of evaluators.<br />
<br />
==Task for evaluators==<br />
<br />
To motivate the evaluators, they are given a defined yet open task:<br />
<br />
''You are creating a short video about a memorable occasion that happened to you recently, and you need to find some (copyright-free) songs to use as background music.''<br />
<br />
The task is to ensure that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' lives ("a recent, memorable occasion"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood and other aspects. This allows great flexibility and virtually unlimited possibilities in system design. <br />
<br />
Another important consideration in designing the task is the music collection available for this GC14UX: the Jamendo collection. Jamendo music is not well known to most users/evaluators, whereas many of the more common music information tasks are influenced to some degree by users' familiarity with the songs and by song popularity. Through this task of "finding (copyright-free) background music for a self-made video", we strive to minimize the need to look for familiar or popular music.<br />
<br />
==Evaluation results==<br />
Statistics of the scores given by all evaluators will be reported (mean and average deviation). Meaningful text comments from the evaluators will also be reported.<br />
<br />
==Evaluation Webforms==<br />
To assist the evaluators and minimize their burden, the GC14UX team will provide a set of evaluation forms that wrap around the participating systems. As shown in the following image, the evaluation webforms are for scoring the participating systems, with their client interfaces embedded within an iframe on the left side of the webform.<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
<br/><br />
Users can take as many submissions as they want on the My Assignments page. They can also return to the evaluation page at any time by clicking the thumbnail of the submission. <br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
<br />
=Organization=<br />
<br />
==Important Dates==<br />
<br />
*July 1: announce the GC<br />
*Sep. 21st: deadline for system submission <br />
*Sep. 28th: start the evaluation<br />
*Oct. 20th: close the evaluation system<br />
*Oct. 27th: announce the results<br />
*Oct. 31st: MIREX and GC session in ISMIR2014<br />
<br />
==What to Submit==<br />
<br />
A URL to the participating system.<br />
<br />
==Contacts==<br />
The GC14UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:Yi-Hsuan (Eric) Yang, Academia Sinica, Taiwan (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
<br />
Inquiries, suggestions, questions, comments are all highly welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone in the team.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=2014:GC14UX&diff=102312014:GC14UX2014-07-01T21:29:58Z<p>Jdownie: /* Evaluators */</p>
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2014: User Experience}}<br />
=Purpose=<br />
Holistic, user-centered evaluation of the user experience in interacting with complete, user-facing music information retrieval (MIR) systems.<br />
<br />
=Goals=<br />
# To inspire the development of complete MIR systems.<br />
# To promote the notion of user experience as a first-class research objective in the MIR community.<br />
<br />
=Dataset=<br />
A set of 10,000 music audio tracks is provided for the GC14UX. It will be a subset of tracks drawn from the [http://www.jamendo.com/en/welcome Jamendo collection's] CC-BY licensed works.<br />
<br />
The Jamendo collection contains music in a variety of genres and moods, but is largely unknown to most listeners. This will mitigate the possible user experience bias induced by the differential presence (or absence) of popular or known music within the participating systems. <br />
<br />
As of May 20, 2014, the Jamendo collection contains 14,742 tracks with the [http://creativecommons.org/licenses/by/3.0/ CC-BY license]. The CC-BY license allows others to distribute, remix, adapt, and build upon your work, even commercially, as long as you give credit for the original creation. This is one of the most permissive licenses possible.<br />
<br />
The 10,000 GC14UX tracks will be sampled (with respect to maximizing musical variety) from the CC-BY-licensed portion of the Jamendo collection and made available for participants (system developers) to download and use to build their systems. <br />
<br />
=Participating Systems=<br />
Unlike conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs to their systems to the GC14UX team.<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in a fixed-size window: '''1024x768'''. Please test your system at this screen size.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. You are encouraged to give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| The MIR UX Master<br />
| Dr. MIR<br />
| mir@domain.com<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
<br />
As the name of the Grand Challenge suggests, the evaluation will be user-centered. All systems will be used by a number of human evaluators and rated by them on several of the most important criteria for evaluating user experience. <br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria or their descriptions may change slightly in the months leading up to the submission deadline, as we test them and work to improve them.''<br />
<br />
Given that the GC14UX is all about how users perceive their experiences of the systems, we intend to capture user perceptions in a minimally intrusive manner, without burdening the users/evaluators with too many questions or required data inputs. The following criteria are grounded in the Human-Computer Interaction (HCI) and User Experience (UX) literature, with careful consideration given to striking a balance between being comprehensive and minimizing evaluators' cognitive load. <br />
<br />
Evaluators will rate systems on the following criteria: <br />
<br />
* '''Overall satisfaction''': Overall, how pleasurable do you find the experience of using this system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Learnability''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Robustness''': How good is the system’s ability to warn you when you’re about to make a mistake and allow you to recover?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent / Not Applicable<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Presentation''': How well does the system communicate what’s going on? (How well do you feel the system informs you of its status? Can you clearly understand the labels and words used in the system? How visible are all of your options and menus when you use this system?)<br />
<br />
* '''Aesthetics''': How beautiful is the design? (Is it aesthetically pleasing?)<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Feedback''' (Optional): An open-ended question is provided but optional; evaluators may use it to give feedback if they wish to do so.<br />
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC14UX team will ensure that all participating systems get an equal number of evaluators.<br />
<br />
==Task for evaluators==<br />
<br />
To motivate the evaluators, they are given a defined yet open task:<br />
<br />
''You are creating a short video about a memorable occasion that happened to you recently, and you need to find some (copyright-free) songs to use as background music.''<br />
<br />
The task is to ensure that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' lives ("a recent, memorable occasion"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood and other aspects. This allows great flexibility and virtually unlimited possibilities in system design. <br />
<br />
Another important consideration in designing the task is the music collection available for this GC14UX: the Jamendo collection. Jamendo music is not well known to most users/evaluators, whereas many of the more common music information tasks are influenced to some degree by users' familiarity with the songs and by song popularity. Through this task of "finding (copyright-free) background music for a self-made video", we strive to minimize the need to look for familiar or popular music.<br />
<br />
==Evaluation results==<br />
Statistics of the scores given by all evaluators will be reported (mean and average deviation). Meaningful text comments from the evaluators will also be reported.<br />
<br />
==Evaluation Webforms==<br />
To assist the evaluators and minimize their burden, the GC14UX team will provide a set of evaluation forms that wrap around the participating systems. As shown in the following image, the evaluation webforms are for scoring the participating systems, with their client interfaces embedded within an iframe on the left side of the webform.<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
<br/><br />
Users can take as many submissions as they want on the My Assignments page. They can also return to the evaluation page at any time by clicking the thumbnail of the submission. <br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
<br />
=Organization=<br />
<br />
==Important Dates==<br />
<br />
*July 1: announce the GC<br />
*Sep. 21st: deadline for system submission <br />
*Sep. 28th: start the evaluation<br />
*Oct. 20th: close the evaluation system<br />
*Oct. 27th: announce the results<br />
*Oct. 31st: MIREX and GC session in ISMIR2014<br />
<br />
==What to Submit==<br />
<br />
A URL to the participating system.<br />
<br />
==Contacts==<br />
The GC14UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:Yi-Hsuan (Eric) Yang, Academia Sinica, Taiwan (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
<br />
Inquiries, suggestions, questions, comments are all highly welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone in the team.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=2014:GC14UX&diff=102302014:GC14UX2014-07-01T21:29:36Z<p>Jdownie: /* Evaluators */</p>
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2014: User Experience}}<br />
=Purpose=<br />
Holistic, user-centered evaluation of the user experience in interacting with complete, user-facing music information retrieval (MIR) systems.<br />
<br />
=Goals=<br />
# To inspire the development of complete MIR systems.<br />
# To promote the notion of user experience as a first-class research objective in the MIR community.<br />
<br />
=Dataset=<br />
A set of 10,000 music audio tracks is provided for the GC14UX. It will be a subset of tracks drawn from the [http://www.jamendo.com/en/welcome Jamendo collection's] CC-BY licensed works.<br />
<br />
The Jamendo collection contains music in a variety of genres and moods, but is largely unknown to most listeners. This will mitigate the possible user experience bias induced by the differential presence (or absence) of popular or known music within the participating systems. <br />
<br />
As of May 20, 2014, the Jamendo collection contains 14,742 tracks with the [http://creativecommons.org/licenses/by/3.0/ CC-BY license]. The CC-BY license allows others to distribute, remix, adapt, and build upon your work, even commercially, as long as you give credit for the original creation. This is one of the most permissive licenses possible.<br />
<br />
The 10,000 GC14UX tracks will be sampled (with respect to maximizing musical variety) from the CC-BY-licensed portion of the Jamendo collection and made available for participants (system developers) to download and use to build their systems. <br />
<br />
=Participating Systems=<br />
Unlike conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs to their systems to the GC14UX team.<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in a fixed-size window: '''1024x768'''. Please test your system at this screen size.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. You are encouraged to give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| The MIR UX Master<br />
| Dr. MIR<br />
| mir@domain.com<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
<br />
As the name of the Grand Challenge suggests, the evaluation will be user-centered. All systems will be used by a number of human evaluators and rated by them on several of the most important criteria for evaluating user experience. <br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria or their descriptions may change slightly in the months leading up to the submission deadline, as we test them and work to improve them.''<br />
<br />
Given that the GC14UX is all about how users perceive their experiences of the systems, we intend to capture user perceptions in a minimally intrusive manner, without burdening the users/evaluators with too many questions or required data inputs. The following criteria are grounded in the Human-Computer Interaction (HCI) and User Experience (UX) literature, with careful consideration given to striking a balance between being comprehensive and minimizing evaluators' cognitive load. <br />
<br />
Evaluators will rate systems on the following criteria: <br />
<br />
* '''Overall satisfaction''': Overall, how pleasurable do you find the experience of using this system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Learnability''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Robustness''': How good is the system’s ability to warn you when you’re about to make a mistake and allow you to recover?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent / Not Applicable<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Presentation''': How well does the system communicate what’s going on? (How well do you feel the system informs you of its status? Can you clearly understand the labels and words used in the system? How visible are all of your options and menus when you use this system?)<br />
<br />
* '''Aesthetics''': How beautiful is the design? (Is it aesthetically pleasing?)<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Feedback''' (Optional): An open-ended question is provided but optional; evaluators may use it to give feedback if they wish to do so.<br />
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC14UX team will ensure that all participating systems get an equal number of evaluators.<br />
<br />
==Task for evaluators==<br />
<br />
To motivate the evaluators, they are given a defined yet open task:<br />
<br />
''You are creating a short video about a memorable occasion that happened to you recently, and you need to find some (copyright-free) songs to use as background music.''<br />
<br />
The task is to ensure that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' lives ("a recent, memorable occasion"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood and other aspects. This allows great flexibility and virtually unlimited possibilities in system design. <br />
<br />
Another important consideration in designing the task is the music collection available for this GC14UX: the Jamendo collection. Jamendo music is not well known to most users/evaluators, whereas many of the more common music information tasks are influenced to some degree by users' familiarity with the songs and by song popularity. Through this task of "finding (copyright-free) background music for a self-made video", we strive to minimize the need to look for familiar or popular music.<br />
<br />
==Evaluation results==<br />
Statistics of the scores given by all evaluators will be reported (mean and average deviation). Meaningful text comments from the evaluators will also be reported.<br />
<br />
==Evaluation Webforms==<br />
To assist the evaluators and minimize their burden, the GC14UX team will provide a set of evaluation forms that wrap around the participating systems. As shown in the following image, the evaluation webforms are for scoring the participating systems, with their client interfaces embedded within an iframe on the left side of the webform.<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
<br/><br />
Users can take as many submissions as they want on the My Assignments page. They can also return to the evaluation page at any time by clicking the thumbnail of the submission. <br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
<br />
=Organization=<br />
<br />
==Important Dates==<br />
<br />
*July 1: announce the GC<br />
*Sep. 21st: deadline for system submission <br />
*Sep. 28th: start the evaluation<br />
*Oct. 20th: close the evaluation system<br />
*Oct. 27th: announce the results<br />
*Oct. 31st: MIREX and GC session in ISMIR2014<br />
<br />
==What to Submit==<br />
<br />
A URL to the participating system.<br />
<br />
==Contacts==<br />
The GC14UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:Yi-Hsuan (Eric) Yang, Academia Sinica, Taiwan (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
<br />
Inquiries, suggestions, questions, comments are all highly welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone in the team.</div>Jdowniehttps://www.music-ir.org/mirex/w/index.php?title=2014:GC14UX&diff=102292014:GC14UX2014-07-01T21:28:37Z<p>Jdownie: /* Goals */</p>
<hr />
<div>{{DISPLAYTITLE:Grand Challenge 2014: User Experience}}<br />
=Purpose=<br />
Holistic, user-centered evaluation of the user experience in interacting with complete, user-facing music information retrieval (MIR) systems.<br />
<br />
=Goals=<br />
# To inspire the development of complete MIR systems.<br />
# To promote the notion of user experience as a first-class research objective in the MIR community.<br />
<br />
=Dataset=<br />
A set of 10,000 music audio tracks is provided for the GC14UX. It will be a subset of tracks drawn from the [http://www.jamendo.com/en/welcome Jamendo collection's] CC-BY licensed works.<br />
<br />
The Jamendo collection contains music in a variety of genres and moods, but is largely unknown to most listeners. This will mitigate the possible user experience bias induced by the differential presence (or absence) of popular or known music within the participating systems. <br />
<br />
As of May 20, 2014, the Jamendo collection contains 14,742 tracks with the [http://creativecommons.org/licenses/by/3.0/ CC-BY license]. The CC-BY license allows others to distribute, remix, adapt, and build upon your work, even commercially, as long as you give credit for the original creation. This is one of the most permissive licenses possible.<br />
<br />
The 10,000 GC14UX tracks will be sampled (with respect to maximizing musical variety) from the CC-BY-licensed portion of the Jamendo collection and made available for participants (system developers) to download and use to build their systems. <br />
<br />
=Participating Systems=<br />
Unlike conventional MIREX tasks, participants are not asked to submit their systems. Instead, the systems will be hosted by their developers. All participating systems need to be constructed as websites accessible to users through normal web browsers. Participating teams will submit the URLs to their systems to the GC14UX team.<br />
<br />
To ensure a consistent experience, evaluators will see participating systems in a fixed-size window: '''1024x768'''. Please test your system at this screen size.<br />
<br />
==Potential Participants==<br />
<br />
Please put your names and email contacts in the following table. You are encouraged to give your team a cool name! <br />
{| class="wikitable"<br />
|-<br />
! (Cool) Team Name<br />
! Name(s)<br />
! Email(s)<br />
|-<br />
| The MIR UX Master<br />
| Dr. MIR<br />
| mir@domain.com<br />
|-<br />
|}<br />
<br />
=Evaluation=<br />
<br />
As the name of the Grand Challenge indicates, the evaluation will be user-centered. All systems will be used by a number of human evaluators and rated by them on several of the most important criteria for evaluating user experience. <br />
<br />
==Criteria==<br />
<br />
''Note that the evaluation criteria or their descriptions may change slightly in the months leading up to the submission deadline, as we test and work to improve them.''<br />
<br />
Given that the GC14UX is all about how users perceive their experiences of the systems, we intend to capture user perceptions in a minimally intrusive manner and not burden the users/evaluators with too many questions or required data inputs. The following criteria are grounded in the Human-Computer Interaction (HCI) and User Experience (UX) literature, with careful consideration given to striking a balance between being comprehensive and minimizing evaluators' cognitive load. <br />
<br />
Evaluators will rate systems on the following criteria (a sketch of one possible numeric coding of the scales follows the list): <br />
<br />
* '''Overall satisfaction''': Overall, how pleasurable do you find the experience of using this system?<br />
Very unsatisfactory / Unsatisfactory / Slightly unsatisfactory / Neutral / Slightly satisfactory / Satisfactory / Very satisfactory<br />
<br />
* '''Learnability''': How easy was it to figure out how to use the system? <br />
Very difficult / Difficult / Slightly difficult / Neutral / Slightly easy / Easy / Very easy<br />
<br />
* '''Robustness''': How good is the system’s ability to warn you when you’re about to make a mistake and allow you to recover?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent / Not Applicable<br />
<br />
* '''Affordances''': How well does the system allow you to perform what you want to do?<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent <br />
<br />
* '''Presentation''': How well does the system communicate what’s going on? (How well do you feel the system informs you of its status? Can you clearly understand the labels and words used in the system? How visible are all of your options and menus when you use this system?)<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Aesthetics''': How beautiful is the design? (Is it aesthetically pleasing?)<br />
Very Poor / Poor / Slightly Poor / Neutral / Slightly Good / Good / Excellent<br />
<br />
* '''Feedback''' (Optional): An optional open-ended question lets evaluators provide additional comments if they wish.<br />
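<br />
For reference, a minimal sketch of one possible numeric coding of these 7-point labels; the official coding, if any, may differ:<br />
<pre><br />
# Hypothetical mapping from the 7-point satisfaction labels to 1..7.<br />
SCALE = {<br />
    "Very unsatisfactory": 1, "Unsatisfactory": 2,<br />
    "Slightly unsatisfactory": 3, "Neutral": 4,<br />
    "Slightly satisfactory": 5, "Satisfactory": 6,<br />
    "Very satisfactory": 7,<br />
}<br />
<br />
responses = ["Satisfactory", "Neutral", "Very satisfactory"]<br />
scores = [SCALE[r] for r in responses]  # -> [6, 4, 7]<br />
</pre><br />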
<br />
==Evaluators==<br />
Evaluators will be users aged 18 and above. For this round, evaluators will be drawn primarily from the MIR community through solicitations via the ISMIR-community mailing list. The [[#Evaluation Webforms]] developed by the GC14UX team will ensure that all participating systems receive an equal number of evaluators; a balanced-assignment sketch is given below.<br />
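<br />
The following minimal sketch shows one way such balancing could be done; it is illustrative only, not the webforms' actual assignment logic. Each evaluator is handed the systems that currently have the fewest assignments:<br />
<pre><br />
import heapq<br />
<br />
def assign(evaluators, systems, per_evaluator):<br />
    # Min-heap of (times_assigned, system): popping always yields the<br />
    # least-assigned systems, so all systems accumulate evaluators at<br />
    # the same rate.<br />
    heap = [(0, s) for s in systems]<br />
    heapq.heapify(heap)<br />
    plan = {}<br />
    for e in evaluators:<br />
        picked = [heapq.heappop(heap) for _ in range(per_evaluator)]<br />
        plan[e] = [s for _, s in picked]<br />
        for n, s in picked:<br />
            heapq.heappush(heap, (n + 1, s))<br />
    return plan<br />
<br />
print(assign(["eval1", "eval2"], ["sysA", "sysB", "sysC"], 2))<br />
</pre><br />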
<br />
==Task for evaluators==<br />
<br />
To motivate the evaluators, they are given a defined yet open-ended task:<br />
<br />
''You are creating a short video about a memorable occasion that happened to you recently, and you need to find some (copyright-free) songs to use as background music.''<br />
<br />
The task ensures that evaluators have a (more or less) consistent goal when they interact with the systems. The goal is flexible and authentic to the evaluators' lives ("a recent, memorable occasion"). As the task is not too specific, evaluators can potentially look for a wide range of music in terms of genre, mood, and other aspects. This allows great flexibility and virtually unlimited possibilities in system design. <br />
<br />
Another important consideration in designing the task is the music collection available for this GC14UX: the Jamendo collection. Jamendo music is not well known to most users/evaluators, whereas many common music-information tasks are more or less influenced by users' familiarity with the songs and by song popularity. Through this task of "finding (copyright-free) background music for a self-made video", we strive to minimize the need to look for familiar or popular music.<br />
<br />
==Evaluation results==<br />
Statistics of the scores given by all evaluators will be reported: the mean and the average deviation. Meaningful text comments from the evaluators will also be reported. A sketch of these statistics follows.<br />
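<br />
For concreteness, a minimal sketch of these two statistics, reading "average deviation" as the mean absolute deviation from the mean (our interpretation):<br />
<pre><br />
def mean(xs):<br />
    return sum(xs) / len(xs)<br />
<br />
def average_deviation(xs):<br />
    # Mean absolute deviation from the mean.<br />
    m = mean(xs)<br />
    return sum(abs(x - m) for x in xs) / len(xs)<br />
<br />
scores = [5, 6, 4, 7, 5]  # illustrative 7-point ratings for one criterion<br />
print(mean(scores), average_deviation(scores))  # 5.4 0.88<br />
</pre><br />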
<br />
==Evaluation Webforms==<br />
To assist the evaluators and minimize their burden, the GC14UX team will provide a set of evaluation webforms that wrap around the participating systems. As shown in the following image, the evaluation webforms are used for scoring the participating systems, with each system's client interface embedded within an iframe on the left side of the webform (a minimal sketch of this wrapper idea appears after the images).<br />
<br />
[[File:GCUX wireframe evaluation.png|800px]]<br />
<br/><br />
Users can take on as many submissions as they want on the My Assignments page. They may also return to the evaluation page at any time by clicking the thumbnail of a submission. <br />
[[File:GCUX_wireframe_my_assignments.png|800px]]<br />
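<br />
As a concrete illustration of the wrapping idea only, and not the GC14UX team's actual implementation, a minimal Flask page embedding a (hypothetical) participating system in a 1024x768 iframe beside a rating form might look like this:<br />
<pre><br />
from flask import Flask<br />
<br />
app = Flask(__name__)<br />
<br />
@app.route("/evaluate")<br />
def evaluate():<br />
    # Hypothetical participating-system URL.<br />
    system_url = "https://example.org/your-mir-system"<br />
    # The iframe holds the participating system at the fixed 1024x768<br />
    # size; the form would hold the rating widgets for the criteria.<br />
    return f'''<br />
      <iframe src="{system_url}" width="1024" height="768"></iframe><br />
      <form><!-- rating widgets go here --></form><br />
    '''<br />
<br />
if __name__ == "__main__":<br />
    app.run()<br />
</pre><br />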
<br />
=Organization=<br />
<br />
==Important Dates==<br />
<br />
*July 1: announce the GC<br />
*Sep. 21: deadline for system submission<br />
*Sep. 28: start the evaluation<br />
*Oct. 20: close the evaluation system<br />
*Oct. 27: announce the results<br />
*Oct. 31: MIREX and GC session at ISMIR 2014<br />
<br />
==What to Submit==<br />
<br />
A URL to the participating system.<br />
<br />
==Contacts==<br />
The GC14UX team consists of: <br />
:J. Stephen Downie, University of Illinois (MIREX director)<br />
:Xiao Hu, University of Hong Kong (ISMIR2014 co-chair)<br />
:Jin Ha Lee, University of Washington (ISMIR2014 program co-chair)<br />
:Yi-Hsuan (Eric) Yang, Academia Sinica, Taiwan (ISMIR2014 program co-chair)<br />
:David Bainbridge, Waikato University, New Zealand<br />
:Kahyun Choi, University of Illinois<br />
:Peter Organisciak, University of Illinois<br />
<br />
Inquiries, suggestions, questions, and comments are all very welcome! Please contact Prof. Downie [mailto:jdownie@illinois.edu] or anyone on the team.</div>Jdownie