SDK Overview
An SDK is provided in the `packages/sdk` directory of the repository. It includes API definition files written in TypeScript, abstract classes written in JavaScript that conform to the API, and various implementations and utilities that can be used in WAM projects.
The following command installs a local distribution of the SDK into the current project under the alias `wamsdk`:
npm i wamsdk@file:../webaudiomodule/packages/sdk -D
The WAM API is the plugin specification that each WAM should implement. All the interfaces and types in the specification are declared in TypeScript in `packages/sdk/api/types.d.ts`.
The API is designed for building Web-based audio plugins (WAMs) and using them in hosts. As with the VST, AudioUnit, or AAX standards on desktop DAWs, an audio plugin typically includes an insertable DSP unit and a UI for the given platform, along with extra features such as parameter automation, MIDI message processing, and state saving and loading. The API standardizes the interfaces to these features for audio plugin and host developers.
VS Code IntelliSense will take the types into account via a JSDoc annotation or a TypeScript import. For example:
// JavaScript
/** @typedef {import('wamsdk/src/api/types').WamEvent} IWamEvent */
// TypeScript
import { WamEvent } from 'wamsdk/src/api/types';
The API supports these primary features:
- Getting the WAM's information by fetching a JSON file.
- Loading the WAM plugin constructor by fetching an ECMAScript module file.
- Getting a WebAudio `AudioNode`-compatible processor that can be inserted into an existing audio graph.
- Saving and restoring the plugin's state.
- Getting parameter information from both the main thread and the audio thread (`AudioWorklet`).
- Scheduling automation events of audio parameters from both threads.
- Scheduling transport, MIDI, and OSC events with the host from both threads.
- Emitting events for downstream WAM plugins from both threads.
- Cleaning up when the plugin instance is destroyed.
The interfaces defined are:
- A `WebAudioModule` interface, the main entry point of a WAM plugin instance.
- A `WamDescriptor` interface, describing the plugin's general information as provided in the descriptor JSON file.
- A `WamNode` interface, which extends the WebAudio `AudioNode` that will be connected to the host's audio graph.
- A `WamProcessor` interface, which extends `AudioWorkletProcessor` and processes signals on the audio thread.
- A `WamParameter` interface, which provides parameter information on both threads.
- A `WamEvent` interface, which can be used to schedule or emit WAM-related events such as automation or MIDI messages.
- A `WamEnv` interface, available on the audio thread to maintain the graph information there.
A WAM distribution should include at least a descriptor in JSON and a JavaScript file whose default export is a WebAudioModule constructor. The constructor should statically provide:

- An `isWebAudioModuleConstructor` getter that returns `true`.
- A `createInstance` method that asynchronously instantiates the WebAudioModule. This method is a shorthand for calling the constructor and then the `initialize` method; it should return a Promise that resolves to the constructed and initialized WebAudioModule.
- The `new` constructor. A WAM instance constructed with the `new` operator is only usable after its `initialize` method has been called.
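On the plugin side, this static contract can be sketched with a plain class. This is an illustrative skeleton only: in practice a WAM would extend the SDK's `WebAudioModule` base class, and the actual `AudioNode` setup is omitted.

```javascript
// Illustrative skeleton of the static contract; a real WAM extends the
// SDK's WebAudioModule base class rather than a plain class.
class MyWam {
  // Lets a host type-check the module's default export.
  static get isWebAudioModuleConstructor() { return true; }

  // Shorthand for calling the constructor and then `initialize`.
  static async createInstance(audioContext, initialState) {
    const instance = new MyWam(audioContext);
    await instance.initialize(initialState);
    return instance;
  }

  constructor(audioContext) {
    this.audioContext = audioContext;
    this._initialized = false;
  }

  get initialized() { return this._initialized; }

  // The instance is only usable after this has resolved.
  async initialize(initialState = {}) {
    // ...create the WamNode and apply `initialState` here...
    this._initialized = true;
    return this;
  }
}
```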
On the host side, once the default export has been imported from the ESM module, the host can first type-check it using the `isWebAudioModuleConstructor` getter, then construct the WAM instance using the `createInstance` method. For example:
/** @typedef {typeof import('wamsdk').WebAudioModule} WebAudioModuleConstructor */
(async () => {
const audioCtx = new AudioContext();
const initialState = {};
const imported = await import('./path_to_wam/index.js');
/** @type {WebAudioModuleConstructor} */
const WAM = imported.default;
const isWAM = typeof WAM === 'function' && WAM.isWebAudioModuleConstructor;
if (!isWAM) return;
const wam = await WAM.createInstance(audioCtx, initialState);
return wam;
})();
Here,
const wam = await WAM.createInstance(audioCtx, initialState);
is equivalent to
const wam = new WAM(audioCtx);
await wam.initialize(initialState);
The following properties and methods should also be implemented.
- An `isWebAudioModule` getter that returns `true`.
- An `audioContext` getter that returns the current `BaseAudioContext` the WAM belongs to.
- An `audioNode` getter that returns the `AudioNode` to be attached to an audio graph.
- An `initialized` getter that returns `false` before initialization and `true` after.
- A `moduleId` getter that returns an identifier of the current WAM, usually composed of its vendor name and its own name.
- An `instanceId` getter that returns the unique identifier of the current WAM instance.
- A `descriptor` getter that returns a `WamDescriptor`, containing the same information as the WAM's descriptor JSON file.
- A `name` getter that returns the WAM's name.
- A `vendor` getter that returns the WAM vendor's name.
- An `initialize` method that asynchronously initializes the newly constructed WAM and its `AudioNode`, accepting one optional argument to set the initial state and returning a Promise that resolves to a `WamNode`. Once initialized, the WAM's `AudioNode` can be connected to the host's audio graph.
- A `createGui` method that asynchronously creates an `Element` that can be attached to the HTML document as the WAM's GUI, returning a Promise that resolves to an `Element`. There can be multiple GUIs controlling the same WAM; make sure every GUI can control the WAM and responds to any state change.
- A `destroyGui` method, used to clean up a created GUI. It accepts an argument of type `Element`, an existing but no longer needed GUI, and returns `void`.
For example, a host can create the WAM's GUI and append it to the document as follows:
(async () => {
const container = document.getElementById('wam-container');
const wamGui = await wam.createGui();
container.appendChild(wamGui);
})();
and remove it by:
wamGui.remove();
wam.destroyGui(wamGui);
To connect an initialized WAM to an audio graph:
(async () => {
const defaultConstraints = {
audio: {
echoCancellation: false,
mozNoiseSuppression: false,
mozAutoGainControl: false,
},
};
const stream = await navigator.mediaDevices.getUserMedia(defaultConstraints);
const inputNode = audioCtx.createMediaStreamSource(stream);
const { audioNode } = wam;
inputNode.connect(audioNode);
audioNode.connect(audioCtx.destination);
})();
The WAM descriptor contains information the host can use to properly categorize, display, and load the WAM according to its features. The `WamDescriptor` interface describes the object used in the WAM's descriptor JSON file and returned by the instance's `descriptor` getter. It has the following properties:
- `name`: the WAM's name.
- `vendor`: the WAM vendor's name.
- `version`: the current version (a string).
- `sdkVersion`: the WAM SDK (API) version used.
- `thumbnail`: a URL pointing to an image used as the WAM's thumbnail.
- `keywords`: an array of keyword strings.
- `isInstrument`: a boolean, `true` if the WAM is a MIDI instrument.
- `website`: the URL of the WAM's development website.
The descriptor also has a set of boolean properties indicating the I/O support of the WAM. They are optional in the descriptor JSON, but mandatory in the object returned by the `descriptor` getter of the `WebAudioModule` interface. These properties affect the WAM's behavior in the host when it receives audio or events from upstream WAMs:
- `hasAudioInput`, `hasAudioOutput`
- `hasMidiInput`, `hasMidiOutput`
- `hasAutomationInput`, `hasAutomationOutput`
- `hasMpeInput`, `hasMpeOutput`
- `hasOscInput`, `hasOscOutput`
- `hasSysexInput`, `hasSysexOutput`
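For illustration, a descriptor JSON for a hypothetical effect plugin could look like this (all names and values are made up):

```json
{
  "name": "MyDelay",
  "vendor": "MyCompany",
  "version": "1.0.0",
  "sdkVersion": "2.0.0",
  "thumbnail": "./thumbnail.png",
  "keywords": ["delay", "effect"],
  "isInstrument": false,
  "website": "https://example.com/mydelay",
  "hasAudioInput": true,
  "hasAudioOutput": true,
  "hasMidiInput": false,
  "hasMidiOutput": false,
  "hasAutomationInput": true,
  "hasAutomationOutput": false,
  "hasMpeInput": false,
  "hasMpeOutput": false,
  "hasOscInput": false,
  "hasOscOutput": false,
  "hasSysexInput": false,
  "hasSysexOutput": false
}
```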
`WamNode` is an extended WebAudio `AudioNode`, available through the `audioNode` getter of the `WebAudioModule` interface.
A WAM host will use its native (or overridden) `connect` and `disconnect` methods to run its underlying DSP in an audio graph. The `WamNode` can also be the destination node of any `AudioNode` connection.
From the `WamNode` interface, the related `WebAudioModule` can be found using the `module` getter.
It has the following methods.

Lifecycle related:

- `destroy`: this method should be called by the host before removing the `WamNode`. The WAM developer can perform cleanup by overriding it, for example to remove event listeners or close the AudioWorklet port.
State related:

- `getState`
- `setState`
(async () => {
const currentState = await wamNode.getState();
await wamNode.setState(currentState);
})();
A state can be of any serializable type and is used to save or restore the state of a WAM.
Parameters related:

- `getParameterInfo`
- `getParameterValues`
- `setParameterValues`
Note that a WAM parameter differs from a WebAudio `AudioParam` in order to support manipulation from the audio thread. To schedule automation of WAM parameters, the host can use `scheduleEvents`.
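The call pattern looks like the following. This is a sketch: the stub `wamNode` stands in for a real `WamNode` and exposes a single made-up `gain` parameter; real parameter ids and the exact info/value fields come from the plugin and the API's type definitions.

```javascript
// Stub standing in for a real WamNode, with one hypothetical 'gain' parameter.
const wamNode = {
  _values: { gain: { id: 'gain', value: 0.5, normalized: false } },
  // Called with no ids, a WamNode reports all of its parameters.
  async getParameterInfo(...parameterIds) {
    return { gain: { id: 'gain', label: 'Gain', minValue: 0, maxValue: 1 } };
  },
  async getParameterValues(normalized, ...parameterIds) {
    return this._values;
  },
  async setParameterValues(parameterValues) {
    Object.assign(this._values, parameterValues);
  },
};

const demo = (async () => {
  // Discover the parameters, then write and read back a value.
  const info = await wamNode.getParameterInfo();
  console.log(Object.keys(info)); // the parameter ids exposed by the plugin
  await wamNode.setParameterValues({
    gain: { id: 'gain', value: 0.8, normalized: false },
  });
  const values = await wamNode.getParameterValues(false, 'gain');
  return values.gain.value;
})();
```

Whether values are reported normalized (0 to 1) or in the parameter's own range depends on the `normalized` flag; see the parameter types declared in `packages/sdk/api/types.d.ts`.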
Event related:

- `scheduleEvents`: schedules a WAM event with a timestamp.
- `clearEvents`: removes all future events.
A WAM event can contain parameter changes, MIDI events, etc. To allow a WAM to send events to other WAMs, the host can call the following methods:

- `connectEvents`
- `disconnectEvents`

The connection should be made on the audio thread by calling `webAudioModules.connectEvents` or `webAudioModules.disconnectEvents`.
These events are dispatched when sent or processed at the scheduled time. The host can capture them with `addEventListener`.
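As a sketch of scheduling and clearing events: the stub `wamNode` below just records what it receives, the event type strings and data shapes follow the `WamEvent` definitions in the API types, and `gain` is a made-up parameter id.

```javascript
// Stub standing in for a real WamNode: it just records scheduled events.
const wamNode = {
  _events: [],
  scheduleEvents(...events) { this._events.push(...events); },
  clearEvents() { this._events = []; },
};

// A host would use audioContext.currentTime as the reference time.
const now = 0;

// Schedule a parameter automation and a MIDI note-on one second from now.
wamNode.scheduleEvents(
  { type: 'wam-automation', data: { id: 'gain', value: 0.8, normalized: false }, time: now + 1 },
  { type: 'wam-midi', data: { bytes: [0x90, 60, 100] }, time: now + 1 },
);
console.log(wamNode._events.length); // 2

// Drop any events that have not been processed yet.
wamNode.clearEvents();
```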
- `getCompensationDelay`: lets the host get a compensation delay hint value in samples. The value is not measured by the host but provided by the WAM developer.
Each WAM plugin should provide a `WamProcessor` interface on the `AudioWorklet` thread. The interface extends `AudioWorkletProcessor` and is created by the `WamNode`. On the audio thread, the processor can access a `WamEnv` interface under `globalThis.webAudioModules`. When the processor is created, it should call `webAudioModules.create(this);` to register itself with the `WamEnv`.
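The registration handshake can be sketched as follows. The stubs stand in for the real audio-thread globals so the sketch is self-contained; the actual `WamEnv` API is richer than shown here.

```javascript
// Stubs standing in for the audio-thread globals (illustration only).
globalThis.AudioWorkletProcessor = globalThis.AudioWorkletProcessor ?? class {};
globalThis.webAudioModules = globalThis.webAudioModules ?? {
  _processors: new Set(),
  create(processor) { this._processors.add(processor); },
  destroy(processor) { this._processors.delete(processor); },
};

class MyWamProcessor extends globalThis.AudioWorkletProcessor {
  constructor(options) {
    super();
    // Register with the WamEnv as soon as the processor is created.
    globalThis.webAudioModules.create(this);
  }

  process(inputs, outputs, parameters) {
    // ...DSP would go here...
    return true;
  }

  destroy() {
    // Unregister from the event graph when the WAM is destroyed.
    globalThis.webAudioModules.destroy(this);
  }
}

const processor = new MyWamProcessor({});
console.log(globalThis.webAudioModules._processors.has(processor)); // true
processor.destroy();
```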
`WamProcessor` has the following getters and methods:
- A `moduleId` getter that returns an identifier of the current WAM, the same as in the `WebAudioModule` interface.
- An `instanceId` getter that returns the unique identifier of the current WAM instance, the same as in the `WebAudioModule` interface.
- `getCompensationDelay`
- `scheduleEvents`
- `clearEvents`
- `destroy`: this method should disconnect the processor from its event graph by calling `webAudioModules.destroy(this);`.

`getCompensationDelay`, `scheduleEvents`, `clearEvents`, and `destroy` behave the same as in the `WamNode` interface.
- `emitEvents`: can be used by the WAM to pass any event to downstream WAMs in the event graph.