Make note rendering based on frames AND song position
Note positions are updated based on `songPosition`, but `songPosition` isn't updated every frame, and thus the notes seem to lag at higher framerates.
What this PR does is add a new method to `Conductor` that grabs the `songPosition` and adds a new variable, `songPositionDelta`, to it to create a more accurate song position.
Essentially doing what this PR did, without the worry of desync.
This is based on how Minecraft does its rendering at higher framerates: instead of everything rendering at 20 fps (since the game runs at 20 ticks per second), things are rendered at the current tick plus the time elapsed between that tick and the next.
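For anyone unfamiliar with that approach, here's a minimal sketch of the partial-tick idea (hypothetical TypeScript, not code from either game; `TICK_MS`, `lerp`, and `renderPosition` are names made up for illustration):

```typescript
// Sketch of Minecraft-style render interpolation (illustrative only):
// game logic runs at a fixed tick rate, but rendering runs every frame
// and blends between the last two tick states.
const TICK_MS = 50; // 20 ticks per second

function lerp(a: number, b: number, t: number): number {
  return a + (b - a) * t;
}

// prevPos and currPos are an entity's positions at the last two logic
// ticks; msSinceTick is how long ago the most recent tick ran.
function renderPosition(prevPos: number, currPos: number, msSinceTick: number): number {
  // Fraction of the way from the last tick to the next one.
  const partialTick = Math.min(msSinceTick / TICK_MS, 1);
  return lerp(prevPos, currPos, partialTick);
}
```

Halfway between two 50 ms ticks, an entity moving from 0 to 10 renders at 5 instead of snapping, which is the same smoothing this PR applies to note positions.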
Closes https://github.com/FunkinCrew/Funkin/issues/3127
Here's a vid of Blazin':
https://github.com/user-attachments/assets/66aba943-c436-4043-943f-ac2540eb93a1
And here's the same vid slowed down to 0.25x speed so people on 60 Hz monitors can actually see a difference:
https://github.com/user-attachments/assets/7897faa8-2ded-4e14-91aa-4d4170b82b00
This issue has been bugging me for a really long time as a 120 Hz player, thanks!
I approve of this. It's what we do in Troll Engine, and Stepmania does something very similar.
I can (kinda?) see the problem you're trying to solve, but this is definitely a janky solution for it.
The proper fix is to just improve the code for retrieving the `songPosition` such that the value is more accurate.
It is a janky solution, but I couldn't think of anything for fixing `songPosition` itself, so this is the best I've got.
As far as I'm aware, this doesn't have any issues besides the fact that it doesn't solve the actual problem at hand. But I feel like if Stepmania does something similar, as nebula said, then it can't be that terrible of a thing to do.
There's a point at which a problem becomes too complicated to solve and it's just easier to do a band-aid fix, especially when the only thing it affects is the note rendering.
I don't think we can do anything about getting a more accurate time for `songPosition`. We would need to look into OpenAL or `lime.NativeAudioSource`.
`lime._internal.backend.native.NativeAudioSource.hx`:

```haxe
else
{
	// var offset = AL.getSourcei(handle, AL.BYTE_OFFSET);
	// var ratio = (offset / dataLength);
	// var totalSeconds = samples / parent.buffer.sampleRate;

	// var sampleTime = AL.getSourcef(handle, AL.SAMPLE_OFFSET);
	// var time = (sampleTime / parent.buffer.sampleRate * 1000) -

	// var time = Std.int(totalSeconds * ratio * 1000) - parent.offset;
	// var time = Std.int (AL.getSourcef (handle, AL.SEC_OFFSET) * 1000) - parent.offset;

	var value = AL.getSourcedvSOFT(handle, AL.SEC_OFFSET_LATENCY_SOFT, 2);
	var deviceOffset:Float = value[1];
	var realOffset:Float = value[0];
	var time:Float = ((realOffset - deviceOffset) * 1000) - parent.offset;

	if (time < 0) return 0;

	return Std.int(time);
}
```
That is the function for retrieving the `songPosition`. I don't know what `getSourcedvSOFT` does exactly; the only thing I know is that (at least for me) the `songPosition` updates every 10 ms, meaning 100 fps.
I'll see if `getSourcef` is better, but I doubt it.
I'm not sure if OpenAL supports setting the update interval, maybe someone knows that?
Yeah, I feel like if it could update faster, they would've done it already.
There could be a solution somewhere, but it requires so much digging and learning how all that stuff works that it just doesn't seem worth it.
I don't think it's janky, it seems like a pretty clean and simple implementation!
- If `songPosition` is updated when we call `Conductor.update()`, we use that value, which is the `songPosition` value we've always used.
- If `songPosition` isn't updated, the `frameSongPosition` is incremented by delta time (`FlxG.elapsed`), essentially making the assumption that if `songPosition` were to be updated that frame, it would be at about that time.
- With rhythm games you obviously want to use the music time to keep in time, so incrementing is usually not the way to go, but I think we get the actual `songPosition` updated frequently enough that there won't be any drift occurring that lasts more than maybe a frame or two, which would be purely visual, since we use a different system for getting accurate inputs.
just rebased / cleaned up the commit history here to match current develop branch 🤝
guys, GitHub is so confusing, why are there magically 38 commits again
You may have pushed your local "changes", which would have been the old commit history with all of the merge commits and whatnot. If you're using the command line, you can type (as long as you don't have any local changes you ACTUALLY want to keep):
```
git fetch origin
git reset --hard origin/frame-song-position-for-note-rendering
```
and then the git history on your computer will match the one I very evilly force-pushed
thank you mr muffin, I will just simply not touch this again so as to make sure I don't break anything
> I can (kinda?) see the problem you're trying to solve, but this is definitely a janky solution for it. The proper fix is to just improve the code for retrieving the `songPosition` such that the value is more accurate.
Don't know how much better you CAN get though, given the audio clock and all. Unless Flixel adds something similar to Unity's dspTime, this is probably the best we're gonna get.
The method used here is used in Stepmania and all of its forks (including highly competitive ones like Etterna) because it simply... works. (I think Quaver does something similar, too?)
> I can (kinda?) see the problem you're trying to solve, but this is definitely a janky solution for it. The proper fix is to just improve the code for retrieving the `songPosition` such that the value is more accurate.
tbh this "janky" solution is the only one I can think of; it's actually really effective in how it works, and the "proper fix" you're alluding to here isn't really achievable unless you change the audio system in Flixel or Lime itself.
For that reason, I'd argue it isn't really that janky at all. The audio clock stuff we'd have to do otherwise would be infinitely jankier, and it would not fix the note rendering issue at all, because the position updating would still be independent of the framerate; doing any other method would just loop back to doing `FlxG.elapsed * 1000`, but even more complicated.
Stepmania, OpenITG, NotITG, Etterna, Project OutFox, and osu!lazer all do it the same way, so finding any other method to do so is a fool's errand.
I think an alternative off the top of my head is to use `Conductor.getTimeWithDiff()`, which is what we use to timestamp and get a more accurate song position for INPUTS (which works if you call it in-between frames as well). If using that provides a similar result to using this PR's framePosition stuff, then I think `getTimeWithDiff()` might be optimal, so we don't have two functions/variables that do nearly the same thing.
> I think an alternative off the top of my own head is to use `Conductor.getTimeWithDiff()`, which is what we use to timestamp and get a more accurate song position for INPUTS (which works if you call it in-between frames as well). If using that provides a similar result to using this PR's framePosition stuff, then I think `getTimeWithDiff()` might be optimal, so we don't have two functions/variables that do nearly the same thing
Using `getTimeWithDiff` won't change anything, since `_channel.position` is frame-independent. In my case it's 10 ms, meaning it doesn't matter when or where you call it; it will always use the 10 ms as a check for updating the sound.
I logged `getTimeWithDiff` in the update function of PlayState:
```
source/funkin/play/PlayState.hx:837: TIME: 1166
source/funkin/play/PlayState.hx:837: TIME: 1176
source/funkin/play/PlayState.hx:837: TIME: 1186
source/funkin/play/PlayState.hx:837: TIME: 1196
source/funkin/play/PlayState.hx:837: TIME: 1196
source/funkin/play/PlayState.hx:837: TIME: 1206
source/funkin/play/PlayState.hx:837: TIME: 1216
source/funkin/play/PlayState.hx:837: TIME: 1226
source/funkin/play/PlayState.hx:837: TIME: 1236
source/funkin/play/PlayState.hx:837: TIME: 1236
source/funkin/play/PlayState.hx:837: TIME: 1246
source/funkin/play/PlayState.hx:837: TIME: 1256
source/funkin/play/PlayState.hx:837: TIME: 1266
source/funkin/play/PlayState.hx:837: TIME: 1276
source/funkin/play/PlayState.hx:837: TIME: 1286
```
@KutikiPlayz one question though, don't we need to apply this:
https://github.com/FunkinCrew/Funkin/blob/f61789e45352b0859ea5cef902fcedc92da2afee/source/funkin/Conductor.hx#L416
to `frameSongPos`, or am I wrong?
See, I was thinking about that, but the offsets are added directly to `songPosition` itself, and since `frameSongPosition` increments itself, we don't wanna add the offsets to it over and over again; it should just get the offsets when it syncs with `songPosition`.
> I think an alternative off the top of my own head is to use `Conductor.getTimeWithDiff()`, which is what we use to timestamp and get a more accurate song position for INPUTS (which works if you call it in-between frames as well). If using that provides a similar result to using this PR's framePosition stuff, then I think `getTimeWithDiff()` might be optimal, so we don't have two functions/variables that do nearly the same thing
I'll test it with `getTimeWithDiff()` and see what happens
lemz was right, there's no difference between using `songPosition` and `getTimeWithDiff()`.
@ninjamuffin99 was there a reason this stuff was commented out?
Cuz using it actually does what I assume it wants to do, and also effectively does what I did here with `frameSongPosition`.
So should I just make this PR fix that and use it instead?
I believe it was an old implementation of sorts. I think when FNF explicitly gets the OpenFL channel position thing, we added stuff in Lime so that what it requests is super accurate, I think.
Although maybe something broke along the way and we still don't actually use it properly lol. I had a PR in Lime for it in any case.
I found the PR and have been messing around with your test app (very, very useful btw, glad I found it)
music.time | frameTime | timeWithDiff (60fps)
music.time | frameTime | timeWithDiff (300fps)
It makes sense now why we're still having these issues; whatever changes you made just didn't do much, I guess, as it still does the plateauing you aimed to fix.
And the old code in `getTimeWithDiff()` seems very, very unstable, so `frameSongPosition` still seems like a good solution to me.
Don't think. Just merge
Don't abandon this
Don't think. Just merge
They still need to review this internally first, they can't just merge PRs left and right. They'll get to this PR eventually.
> They still need to review this internally first, they can't just merge PRs left and right. They'll get to this PR eventually.
If they were reviewing it internally they would have given it that tag. This one has "needs revision".
Nice! I forgot to say this back in the review, but there also seems to be a problem with how this PR interacts with Linux (at least my Ubuntu installation) and audio device latency. Could be outside the scope of this PR, though?
https://github.com/user-attachments/assets/35755cb8-09ad-4ab1-9525-052e5328f94b
Linux seems to correctly retrieve the time of the playing audio, even accounting for audio device latency. This means that even with pretty much zero audio/visual offset and using Bluetooth headphones, the game will continue to sync exactly with what you can see. However, it also means that the audio stays at position 0 for quite a while until the song becomes audible.
The PR recognizes this as "Hey, `songPosition` hasn't updated yet! Let's increment `songPositionDelta`." Then it lags waaay back when the audio finally starts being audible and `songPosition` actually updates; it's bad enough for me that `songPositionDelta` reaches up to 300.
Not really sure how you'd fix this anyway, but an idea I had was to wait until `songPosition` updates after any pauses before allowing `songPositionDelta` to increment; that's probably the best way to go about it? (Aside from finding a way to disable this behavior entirely, maybe.) I tried implementing it on my own using a boolean, but it was pretty janky...
This PR closes this issue: https://github.com/FunkinCrew/Funkin/issues/3127
Any updates on this being merged?
We'll get back to reviewing PRs after this round of hotfixes!