Audio

SAGE provides a channel-based audio system built on BabylonJS AudioV2. You define named channels (like music, sfx, ambient) and the system creates an audio bus graph with independent volume and mute controls per channel. This gives your players the standard per-category volume sliders they expect from a settings menu.

The audio system is opt-in. If you don't pass audioChannels to createGameEngine(), no audio infrastructure is created, and neither SoundBehavior nor the Blender sound handler will create sounds.

Architecture

The audio system follows SAGE's iDesign layering:

┌────────────────────────────────────────────┐
│              AudioManager                  │  ← Orchestration layer
│  - Channel volume/mute                     │
│  - Master volume/mute                      │
│  - Sound creation (routed to channels)     │
├────────────────────────────────────────────┤
│              AudioEngine                   │  ← Engine layer (wraps BabylonJS AudioV2)
│  - WebAudio engine lifecycle               │
│  - Bus creation & management               │
│  - Raw sound creation                      │
├────────────────────────────────────────────┤
│         BabylonJS AudioV2                  │  ← Underlying implementation
│  - WebAudio API                            │
│  - AudioBus, StaticSound                   │
└────────────────────────────────────────────┘

Audio graph:

MainBus (master volume)
  ├── music bus
  ├── sfx bus
  ├── ambient bus
  └── ... (one bus per channel)

Each sound is routed to a channel bus. Adjusting a channel's volume affects all sounds on that bus. The master volume sits at the top of the graph and scales everything.
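Assuming standard WebAudio behavior, where gains multiply as signal flows down the graph, a sound's effective output level can be modeled as the product of the master, channel, and per-sound volumes. A minimal illustration of that model (not SAGE's source):

```typescript
// Illustrative model: gains multiply down the bus graph, so the
// effective level of a sound is master * channel * sound.
function effectiveVolume(master : number, channel : number, sound : number) : number
{
    return master * channel * sound;
}

// A full-volume sound on a half-volume channel, with master at 0.8:
effectiveVolume(0.8, 0.5, 1.0); // 0.4
```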

Setup

Pass audioChannels to createGameEngine():

typescript
import { createGameEngine } from '@skewedaspect/sage';

const engine = await createGameEngine(canvas, entityDefs, {
    audioChannels: [ 'music', 'sfx', 'ambient', 'voice' ],
});

This creates an AudioEngine and AudioManager, both accessible on the engine instance:

typescript
engine.engines.audioEngine    // AudioEngine (low-level)
engine.managers.audioManager  // AudioManager (use this one)

Scene integration

SoundBehavior and the Blender sound handler receive the AudioManager automatically via gameEngine. No manual wiring is needed per-level -- as long as you passed audioChannels to createGameEngine(), behaviors and handlers will have access to the audio system.

AudioManager

The AudioManager is the primary interface for game code. It handles channel routing, volume, muting, and sound creation.

Creating sounds

typescript
const audioManager = engine.managers.audioManager;

// Create a sound on the 'sfx' channel
const jumpSound = await audioManager.createSound('jump', 'audio/jump.ogg', 'sfx');

// Create a sound on the 'music' channel with options
const bgMusic = await audioManager.createSound('bgm', 'audio/theme.ogg', 'music', {
    loop: true,
    autoplay: true,
    volume: 0.7,
});

// No channel -- routes to the main bus directly
const uiClick = await audioManager.createSound('click', 'audio/click.ogg');

Signature:

typescript
async createSound(
    name : string,
    url : string,
    channel ?: string,
    options ?: Partial<IStaticSoundOptions>
) : Promise<StaticSound>

| Parameter | Type | Description |
| --------- | ---- | ----------- |
| name | string | Unique identifier for the sound |
| url | string | Path to the audio file |
| channel | string | Channel name to route to (optional) |
| options | Partial<IStaticSoundOptions> | BabylonJS AudioV2 sound options (optional) |

If channel doesn't name a registered channel, a warning is logged and the sound is routed to the main bus instead.
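The fallback can be sketched as follows (an illustrative helper, not SAGE's actual implementation; resolveChannelBus and the 'main' return value are hypothetical):

```typescript
// Illustrative sketch of the routing fallback: unknown channel
// names log a warning and fall through to the main bus.
function resolveChannelBus(channels : string[], requested ?: string) : string
{
    if(requested && !channels.includes(requested))
    {
        console.warn(`Unknown audio channel '${ requested }'; routing to main bus.`);
        return 'main';
    }

    return requested ?? 'main';
}

resolveChannelBus([ 'music', 'sfx' ], 'voice'); // 'main' (plus a warning)
resolveChannelBus([ 'music', 'sfx' ], 'sfx');   // 'sfx'
```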

Master volume

typescript
// Set master volume (0-1)
audioManager.setMasterVolume(0.8);

// Get current master volume
const vol = audioManager.getMasterVolume();

// Mute/unmute everything
audioManager.setMasterMuted(true);
audioManager.setMasterMuted(false);  // restores previous volume

// Check mute state
if(audioManager.isMasterMuted()) { /* ... */ }

Master mute stores the current volume and sets it to 0. Unmuting restores the stored volume, so setMasterVolume() and setMasterMuted() work independently without stepping on each other.
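One way to picture that interaction is the following self-contained model of the described store-and-restore behavior (illustrative only, not SAGE's source):

```typescript
// Illustrative model of store-and-restore muting: muting remembers
// the current volume, and unmuting puts it back.
class VolumeControl
{
    private volume = 1;
    private storedVolume : number | null = null;

    setVolume(value : number) : void
    {
        this.volume = value;
    }

    getVolume() : number
    {
        return this.volume;
    }

    setMuted(muted : boolean) : void
    {
        if(muted && this.storedVolume === null)
        {
            this.storedVolume = this.volume;
            this.volume = 0;
        }
        else if(!muted && this.storedVolume !== null)
        {
            this.volume = this.storedVolume;
            this.storedVolume = null;
        }
    }

    isMuted() : boolean
    {
        return this.storedVolume !== null;
    }
}

const master = new VolumeControl();
master.setVolume(0.8);
master.setMuted(true);   // volume is now 0; 0.8 is remembered
master.setMuted(false);  // volume is restored to 0.8
```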

Channel volume

typescript
// Set channel volume (0-1)
audioManager.setChannelVolume('music', 0.5);
audioManager.setChannelVolume('sfx', 1.0);

// Get channel volume
const musicVol = audioManager.getChannelVolume('music');

// Mute/unmute a channel
audioManager.setChannelMuted('music', true);
audioManager.setChannelMuted('music', false);  // restores previous volume

// Check mute state
if(audioManager.isChannelMuted('music')) { /* ... */ }

Channel mute works the same way as master mute -- volume is stored and restored on unmute.

Querying channels

typescript
// Get all registered channel names
const channels = audioManager.getChannels();
// => ['music', 'sfx', 'ambient', 'voice']

This is useful for building settings UIs dynamically.

AudioEngine

The AudioEngine is the lower-level wrapper around BabylonJS AudioV2. You typically don't need to use it directly -- AudioManager covers the common cases. It's exposed for advanced scenarios like creating custom bus topologies.

API

| Method | Signature | Description |
| ------ | --------- | ----------- |
| initialize() | () => Promise<void> | Creates the WebAudio engine and main bus |
| createBus(name) | (string) => Promise<AudioBus> | Create a named bus routed to the main bus |
| getBus(name) | (string) => AudioBus \| undefined | Look up a bus by name |
| createSound(name, url, bus?, options?) | (...) => Promise<StaticSound> | Create a sound, optionally routed to a bus |
| setMasterVolume(volume) | (number) => void | Set the engine-level master volume |
| getMasterVolume() | () => number | Get the engine-level master volume |
| setBusVolume(bus, volume) | (AudioBus, number) => void | Set volume on a specific bus |

Settings UI example

A typical audio settings implementation using AudioManager:

typescript
import type { AudioManager } from '@skewedaspect/sage';

function buildAudioSettings(audioManager : AudioManager)
{
    // Build sliders for each channel
    for(const channel of audioManager.getChannels())
    {
        const volume = audioManager.getChannelVolume(channel);
        const muted = audioManager.isChannelMuted(channel);

        // Create your UI slider with these values...
        // On slider change:
        //   audioManager.setChannelVolume(channel, newValue);
        // On mute toggle:
        //   audioManager.setChannelMuted(channel, !muted);
    }

    // Master volume
    const masterVol = audioManager.getMasterVolume();
    // audioManager.setMasterVolume(newValue);
    // audioManager.setMasterMuted(true/false);
}

Without AudioManager

When audioChannels is not provided (or is empty), the audio system is not initialized:

  • engine.engines.audioEngine is undefined
  • engine.managers.audioManager is undefined
  • SoundBehavior logs a warning and creates no sounds
  • The Blender sound handler logs a warning and creates no sounds

To enable audio, pass audioChannels to createGameEngine().
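Game code that should run with or without audio can guard on the manager being present. A minimal sketch (playClick and the SoundCreator interface are hypothetical; the createSound shape follows the signature documented above):

```typescript
// Illustrative guard: only create sounds when the audio system
// was initialized; otherwise skip silently.
interface SoundCreator
{
    createSound(name : string, url : string, channel ?: string) : Promise<unknown>;
}

async function playClick(audioManager ?: SoundCreator) : Promise<boolean>
{
    // Audio was not configured; do nothing.
    if(!audioManager) { return false; }

    await audioManager.createSound('click', 'audio/click.ogg', 'sfx');
    return true;
}
```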

Released under the MIT License.