remi
Issues with evaluation (std and salience for beat/downbeat)
Hi! I was trying to reproduce the evaluation part of the code described in the paper *Pop Music Transformer: Beat-based Modeling and Generation of Expressive Pop Piano Compositions* (see the attached picture):

However, there must be some mistake in my calculation, because my beat std and downbeat salience differ noticeably from the results in the paper. I also had difficulty implementing the downbeat std, because the paper and the madmom documentation seem (perhaps?) to describe it differently. My results below are based on the train set (Remi/data/train) and madmom, and I believe I converted the .midi files into correct .wav files:


I've read https://madmom.readthedocs.io/en/latest/modules/features/downbeats.html for related information but still failed to solve it myself. Could you please share the evaluation part with me? Thank you so much!
please attach your code, thanks.
Here's my code, thanks.

```python
import numpy as np
from madmom.features import downbeats


def Eval_rhythm(audio_path):
    print("path ", audio_path)
    # RNNDownBeatProcessor: per-frame (beat, downbeat) activations
    act = downbeats.RNNDownBeatProcessor()(audio_path)
    # DBNDownBeatTrackingProcessor: decode (time, beat_number) pairs
    proc2 = downbeats.DBNDownBeatTrackingProcessor(beats_per_bar=[4, 4], fps=100)
    result = proc2(act)
    # Beat Std
    rawbeat_std = [i[0] for i in result]
    beat_new = [0] * len(rawbeat_std)
    beat_new[0] = rawbeat_std[0]
    for i in range(1, len(rawbeat_std)):
        beat_new[i] = rawbeat_std[i] - rawbeat_std[i - 1]
    beat_std = np.std(beat_new, ddof=1)
    print("Beat Std = ", beat_std)
    # Downbeat Std
    rawdownbeat_std = [i[1] for i in result]
    downbeat_new = [0] * len(rawdownbeat_std)
    downbeat_new[0] = rawdownbeat_std[0]
    for i in range(1, len(rawdownbeat_std)):
        downbeat_new[i] = rawdownbeat_std[i] - rawdownbeat_std[i - 1]
    downbeat_std = np.std(downbeat_new, ddof=1)
    print("Downbeat Std = ", downbeat_std)
    # Downbeat Salience
    downbeat_salience = [i[1] for i in act]
    salience = sum(downbeat_salience) / len(downbeat_salience)
    print("Downbeat Salience = ", salience)
    return beat_std, downbeat_std, salience
```
Take a quick look at your code:
- Why `beat_new[0] = rawbeat_std[0]`, but `beat_new[i] = rawbeat_std[i] - rawbeat_std[i - 1]`? (The first element is an absolute time, not an interval.)
- It should be "downbeat" salience, not "all" salience: you are averaging the activation over every frame.


I think I was calculating the downbeat salience (I take the downbeat activation column), but the variable name ("salience") is nonstandard and I'll revise it. Thanks again!
Would you like to share the evaluation code?
Can you share the evaluation code? I can't find it on the website.