
Add Start Offset to AudioStreamSynchronized #116735

Draft
BuzzLord wants to merge 1 commit into godotengine:master from BuzzLord:schedule-synchronized-stream

Conversation

@BuzzLord

An attempt at adding audio stream scheduling to Godot. This change adds a new offset parameter to the AudioStreamSynchronized streams: a time in seconds. While the AudioStreamSynchronized is playing, it keeps track of its own playback time and waits to start mixing each stream until that stream's offset time has arrived. The offset is resolved down to the frame (or as close as possible given the audio server's mix rate). Since all the times are relative to the overall AudioStreamSynchronized playback, this should make it easy to schedule sounds to play at any exact time you want, relative to each other.
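A minimal sketch of how this might look from GDScript. The existing `set_sync_stream()` setter and `stream_count` property are real AudioStreamSynchronized API; the `set_sync_stream_offset()` setter name is an assumption based on this PR's description, not a confirmed method name:

```gdscript
# Sketch only: set_sync_stream_offset() is a hypothetical name for the
# per-stream offset added by this PR.
var sync := AudioStreamSynchronized.new()
sync.stream_count = 2

sync.set_sync_stream(0, preload("res://audio/drums.ogg"))
sync.set_sync_stream_offset(0, 0.0)   # starts mixing immediately

sync.set_sync_stream(1, preload("res://audio/melody.ogg"))
sync.set_sync_stream_offset(1, 2.5)   # starts mixing 2.5 s into playback

$AudioStreamPlayer.stream = sync
$AudioStreamPlayer.play()
```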

Seeking also works, and will restart any streams that have completed, if needed.

There is currently an arbitrary limit of 32 on the number of streams that can be added, but that could be raised if we think it necessary for this to be useful. Also, adding streams dynamically does not work smoothly, since playback stops as part of the re-initialization. This means you can't bypass the 32-stream limit by dynamically adding/removing streams on the fly.

Potentially fixes godotengine/godot-proposals#1151.

Each stream in the AudioStreamSynchronized can have an offset, which is
a time delay offset, in seconds, from when the stream is started (or seeked).
@LemmaEOF

I'm not sure this is very useful without #107226's AudioServer.get_absolute_time() method - having a way to set the specific start time absolutely, instead of relative to other audio tracks that may or may not exist, is kinda essential.

@BuzzLord
Author

> I'm not sure this is very useful without #107226's AudioServer.get_absolute_time() method - having a way to set the specific start time absolutely, instead of relative to other audio tracks that may or may not exist, is kinda essential.

Can you give an example of a situation where you'd need absolute time? I was picturing the issue as needing a way to schedule sounds to play at specific times so they lined up with other sounds or background music or something... always relative to something else. But I'm admittedly not the target audience of this change.

@LemmaEOF

A specific use case of mine right now is a count-in metronome before a song begins in a rhythm game: the metronome sound effects happen before the main background music starts playing, and they are too short to be used as the base reference for a synchronized stream, so they need access to an absolute time in order to sync properly with the song.

@BuzzLord
Author

You can just have the song start at the delay then. Metronome sounds play at times 0.0, 1.0, 2.0, etc., then the main song starts at 5.0. All subsequent sounds can be scheduled relative to that offset.
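The count-in suggestion above could be sketched like this. As before, `set_sync_stream_offset()` is a hypothetical name for the offset setter this PR adds; note that each scheduled tick occupies its own stream slot, even though they share one AudioStream resource:

```gdscript
# Count-in metronome: five ticks on the beat, then the song at 5.0 s.
# set_sync_stream_offset() is an assumed name, not confirmed API.
var sync := AudioStreamSynchronized.new()
var tick := preload("res://audio/tick.wav")
sync.stream_count = 6

for i in 5:
    sync.set_sync_stream(i, tick)
    sync.set_sync_stream_offset(i, float(i))  # ticks at 0.0, 1.0, ... 4.0

sync.set_sync_stream(5, preload("res://audio/song.ogg"))
sync.set_sync_stream_offset(5, 5.0)           # song enters after the count-in

$AudioStreamPlayer.stream = sync
$AudioStreamPlayer.play()
```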

@LemmaEOF

Ehhh, I still feel like having the option for absolute time is important enough to consider - it gives more flexibility in how you can approach things, rather than being locked into relative start times. This doesn't feel like it covers the use cases as well as #107226, given it also doesn't support dynamically adding streams.

@PizzaLovers007
Contributor

Thanks for investigating! Being able to dynamically add streams would definitely be necessary if we were to move forward with this approach. Additionally, a large portion of the use cases would likely reuse the same stream multiple times (hitsounds/metronome), so having each scheduled hit require its own stream is a bit clunky. I think the actual overlapping feature set between AudioStreamSynchronized and scheduled sounds is minimal, so there would need to be some significant rework to make it easier to use. If we were to consider the best practices here of "to each problem, its own solution", maybe introducing another AudioStream type for this specific problem would be better suited? I do know we were trying to avoid that from the meeting discussion though 😅

@BuzzLord
Author

After playing around with this change, reading about how Unity does it (which your PR duplicates very closely), and thinking of some counter examples, I'm convinced that your PR is the way to go.

This PR could be changed to allow dynamically adding streams, or rescheduling sounds within the stream playback, but I think the biggest drawback is simply that all the sync'd audio goes through the same stream playback, and thus the same audio bus with the same mixing. You couldn't do something like Hi-Fi Rush, where there are lots of audio sources in the world and you want to optionally schedule them to play on the beat.

I also think that using absolute time the way you implemented it is the best approach. This PR does nearly the same thing, but the time is the local playback time rather than the global playback time, so there is no need for a reference audio playback to be relative to.

I don't know if this PR is worth keeping. It does allow for shifting the start time of sync'd sounds, which may be a useful feature. I'll leave it as draft for now, but I'm going to advocate for your PR to be merged.


Development

Successfully merging this pull request may close these issues.

Add an absolute time (DSP time) feature to play audio effects at specific intervals
