= 2025:RenCon Results =
  
== Preliminary (Audition) Round Results ==

=== Evaluation Methodology ===

The preliminary round was evaluated through an online listening test with '''25 expert evaluators''' (24 complete responses per system were received; see the rankings table below). The evaluation used a weighted voting system: each participant self-rated their expertise on a scale of 1 to 5 stars, and their responses were weighted accordingly.
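
This page does not spell out the weighting formula; as a minimal sketch, the code below assumes each evaluator's self-rated star level is used directly as the weight on their ratings, and computes both of the summary columns reported in the rankings table (weighted score and simple average). The function name and the example numbers are hypothetical.

<syntaxhighlight lang="python">
def score_system(ratings, expertise):
    """Summarise one system's listener ratings two ways.

    ratings   -- star ratings (1-5) the system received
    expertise -- each rater's self-rated expertise (1-5), same order

    Assumption: the self-rating is used directly as the weight;
    the actual RenCon weighting scheme is not documented here.
    """
    assert len(ratings) == len(expertise) > 0
    weighted = sum(r * w for r, w in zip(ratings, expertise)) / sum(expertise)
    simple = sum(ratings) / len(ratings)
    return weighted, simple

# Hypothetical example: three raters score one system.
weighted, simple = score_system(ratings=[4, 5, 3], expertise=[5, 3, 1])
print(f"{weighted:.3f}/5.0, {simple:.2f}/5.0")  # -> 4.222/5.0, 4.00/5.0
</syntaxhighlight>

Reporting the unweighted mean alongside the weighted score makes it easy to see how much the expertise weighting shifts each system's standing.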

=== Participant Demographics ===

Our evaluation panel consisted of highly qualified judges:

'''Expertise Distribution:'''
* Expert evaluators (5 stars): 7 participants (29.2%)
* High confidence (4 stars): 5 participants (20.8%)
* Moderate confidence (3 stars): 10 participants (41.7%)
* Lower confidence (1-2 stars): 2 participants (8.3%)
* '''Average expertise weight:''' 3.67/5.0 (see the consistency check below)
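
The reported average is consistent with the distribution above if the two lower-confidence raters are assumed to split as one 1-star and one 2-star (a split this page does not state):

<math>\frac{7 \cdot 5 + 5 \cdot 4 + 10 \cdot 3 + 1 \cdot 1 + 1 \cdot 2}{24} = \frac{88}{24} \approx 3.67</math>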

'''Professional Background''' (multiple selections possible):
* Music researchers: 12 (54.5%)
* Music technologists: 10 (45.5%)
* Active performers: 8 (36.4%)
* Conservatory students: 6 (27.3%)
* Music lovers: 15 (68.2%)
* Concert-goers: 8 (36.4%)

'''Musical Experience:'''
* Strong representation of classical music expertise
* Diverse musical preferences spanning classical, jazz, pop, and rock
* Substantial piano experience among evaluators
* Mix of academic researchers and practicing musicians

=== System Rankings ===

The following table shows the final preliminary-round rankings, based on expertise-weighted average scores:

{| class="wikitable sortable"
! Rank
! Anonymous Name
! Real System Name
! Authors/Institution
! Weighted Score
! Simple Average
! Responses
|-
| 1
| EmberSky
| [System Name]
| [Author Names]
| [X.XXX]/5.0
| [X.XX]/5.0
| 24
|-
| 2
| AzureThunder
| [System Name]
| [Author Names]
| [X.XXX]/5.0
| [X.XX]/5.0
| 24
|-
| 3
| CrimsonDawn
| [System Name]
| [Author Names]
| [X.XXX]/5.0
| [X.XX]/5.0
| 24
|-
| 4
| SilverWave
| [System Name]
| [Author Names]
| [X.XXX]/5.0
| [X.XX]/5.0
| 24
|-
| 5
| VelvetStorm
| [System Name]
| [Author Names]
| [X.XXX]/5.0
| [X.XX]/5.0
| 24
|-
| 6
| GoldenMist
| [System Name]
| [Author Names]
| [X.XXX]/5.0
| [X.XX]/5.0
| 24
|}

''Note: Complete rankings and system details will be updated following the live contest and final results announcement.''

=== Qualitative Feedback ===

Evaluators provided extensive qualitative feedback on the systems' performances:

'''Common Positive Attributes:'''
* Natural expressiveness and human-like phrasing
* Appropriate tempo variations and rubato
* Musical sensitivity to harmonic structure
* Dynamic expression and articulation

'''Areas for Improvement:'''
* Consistency across different musical styles
* Handling of complex rhythmic patterns
* Balance between technical accuracy and musical expression

== Live Contest Results ==

''[To be updated following the live contest on September 25, 2025]''

=== Surprise Piece ===
* '''Title:''' [To be announced]
* '''Composer:''' [To be announced]
* '''Duration:''' [X minutes]
* '''Style:''' [Musical characteristics]

=== Live Performance Rankings ===
''[Results pending live audience voting]''

=== Winner Announcement ===
''[To be announced at the conclusion of ISMIR 2025]''

== External Links ==

* [https://ren-con2025.vercel.app/ Official RenCon 2025 Website]
* [https://ismir2025.ismir.net/ ISMIR 2025 Conference Website]
* [https://www.music-ir.org/mirex/wiki/2025:RenCon RenCon 2025 MIREX Task Page]

[[Category:MIREX]]
[[Category:ISMIR 2025]]
[[Category:Performance Rendering]]
[[Category:Competition Results]]
