Description
When a channel is added to a mixer stream, its first byte gets processed the next time data is retrieved from the mixer. This effectively snaps the start of a channel to the start of the next period, which introduces 0-1 periods of variance to the latency. For example, when the audio engine uses a 10ms periodicity, it takes anywhere between 0-10ms (depending on cycle alignment) until the channel's data begins to flow towards the output.
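To make that concrete, here is a minimal sketch of how the initial delay falls out of the update cycle (the names are hypothetical, not BASS API):

```c
/* Illustrative only: a channel added mid-period is first processed at the
 * next mixer update, so its initial delay is whatever remains of the
 * current period. */
double start_delay_ms(double update_period_ms, double ms_since_last_update)
{
    return update_period_ms - ms_since_last_update; /* in (0, update_period_ms] */
}
```

With a 10ms period, a channel added 3ms after an update waits another 7ms; one added immediately after an update waits the full 10ms.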
For tracks this isn't a problem, because the audio clock is based on the number of bytes that have been processed, so any initial delay affecting the track will by extension also affect the clock, and thus everything that's timed relative to that clock. Shifting everything is the same as shifting nothing, so this is fine.
For samples, however, this means ±5ms of uncertainty relative to an ongoing track, which is a problem. For instance, fully keysounded beatmaps tend to rely on seamless transitions between adjacent samples, so that any discontinuity between them reflects only the temporal imprecision of the player's inputs. But right now even auto-play has audible gaps/overlaps, unless the first sample's duration happens to be a perfect integer multiple of the audio update interval.
I still need to check whether BASS has a mechanism for scheduling samples with an offset for intra-period precision. If not, a possible workaround would be to prepend samples with 1 period worth of silence, and then skip (1 period - time since last update) of that silence when registering the sample for playback; the audible onset would then land exactly 1 period after registration, turning the variable delay into a constant one. Alternatively, we could try to configure a stupidly fast update rate in order to reduce the variance to irrelevance.
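A rough sketch of the silence-prepend workaround against the native BASS/BASSmix C API, assuming the padding itself has already been applied (`sample_with_padding` and the function name are hypothetical). For the offset-scheduling question, BASSmix's `BASS_Mixer_StreamAddChannelEx` and its start/delay parameter may also be worth a look.

```c
#include "bass.h"
#include "bassmix.h"

/* Sketch only: `sample_with_padding` is assumed to be a channel whose data
 * already has one update period of silence prepended (not shown here). */
void play_sample_aligned(HSTREAM mixer, DWORD sample_with_padding,
                         double period_ms, double ms_since_last_update)
{
    /* Skip (period - t) of the prepended silence, where t is the time since
     * the last update. The mixer picks the channel up at the next update,
     * (period - t) ms from now, then plays the remaining t ms of silence,
     * so the audible onset lands exactly one period after this call:
     * a constant delay instead of 0-1 periods of jitter. */
    double skip_ms = period_ms - ms_since_last_update;
    QWORD skip_bytes = BASS_ChannelSeconds2Bytes(sample_with_padding, skip_ms / 1000.0);
    BASS_ChannelSetPosition(sample_with_padding, skip_bytes, BASS_POS_BYTE);

    BASS_Mixer_StreamAddChannel(mixer, sample_with_padding, 0);
}
```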
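The fast-update-rate alternative would be a one-liner via `BASS_CONFIG_UPDATEPERIOD`, though as far as I recall the docs say BASS clamps the period to a minimum of around 5ms, which would shrink the jitter rather than eliminate it:

```c
#include "bass.h"

void shrink_update_period(void)
{
    /* Value is in milliseconds; per the BASS docs (if memory serves) it is
     * clamped to a minimum of about 5ms, so the snap-to-period variance can
     * be reduced but not fully removed this way. */
    BASS_SetConfig(BASS_CONFIG_UPDATEPERIOD, 5);
}
```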