The AudioNode interface is a generic interface for representing an audio processing module like an audio source (e.g. an HTML <audio> or <video> element, an OscillatorNode, etc.), the audio destination, an intermediate processing module (e.g. a filter like BiquadFilterNode or ConvolverNode), or a volume control (like GainNode).
Inheritance: EventTarget ← AudioNode
An AudioNode has inputs and outputs, each with a given number of channels. An AudioNode with zero inputs and one or more outputs is called a source node. The exact processing done varies from one AudioNode to another but, in general, a node reads its inputs, does some audio-related processing, and generates new values for its outputs, or simply lets the audio pass through (for example in the AnalyserNode, where the result of the processing is accessed separately).
Different nodes can be linked together to build a processing graph. Such a graph is contained in an AudioContext. Each AudioNode participates in exactly one such context. In general, processing nodes inherit the properties and methods of AudioNode, but also define their own functionality on top. See the individual node pages for more details, as listed on the Web Audio API homepage.
Note: An AudioNode can be the target of events, and therefore it implements the EventTarget interface.
Properties
AudioNode.context Read only
- Returns the associated AudioContext, that is, the object representing the processing graph the node is participating in.

AudioNode.numberOfInputs Read only
- Returns the number of inputs feeding the node. Source nodes are defined as nodes having a numberOfInputs property with a value of 0.

AudioNode.numberOfOutputs Read only
- Returns the number of outputs coming out of the node. Destination nodes, like AudioDestinationNode, have a value of 0 for this attribute.

AudioNode.channelCount
- Represents an integer used to determine how many channels are used when up-mixing and down-mixing connections to any inputs of the node. Its usage and precise definition depend on the value of AudioNode.channelCountMode.

AudioNode.channelCountMode
- Represents an enumerated value describing the way channels must be matched between the node's inputs and outputs.

AudioNode.channelInterpretation
- Represents an enumerated value describing the meaning of the channels. This interpretation defines how audio up-mixing and down-mixing will happen. The possible values are "speakers" or "discrete".
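To illustrate how channelCount and channelCountMode interact, the sketch below computes the number of channels an input ends up using for each mode, following the up-mixing and down-mixing rules described above. Note that computedChannelCount is a hypothetical helper written for illustration, not part of the Web Audio API:

```javascript
// Sketch (assumption): derive an input's computed channel count from the
// node's channelCount, its channelCountMode, and the channel counts of
// the connections feeding that input.
function computedChannelCount(mode, channelCount, connectionChannels) {
  const maxConnection = Math.max(...connectionChannels);
  switch (mode) {
    case "max":         // channelCount is ignored; use the largest connection
      return maxConnection;
    case "clamped-max": // like "max", but capped at channelCount
      return Math.min(maxConnection, channelCount);
    case "explicit":    // always exactly channelCount
      return channelCount;
  }
}

// A node with channelCount = 2 receiving a mono and a 5.1 connection:
computedChannelCount("max", 2, [1, 6]);         // 6
computedChannelCount("clamped-max", 2, [1, 6]); // 2
computedChannelCount("explicit", 2, [1, 6]);    // 2
```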
Methods
Also implements methods from the interface EventTarget.

AudioNode.connect()
- Allows us to connect the output of this node to be input into another node, either as audio data or as the value of an AudioParam.

AudioNode.disconnect()
- Allows us to disconnect the current node from another one it is already connected to.
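The AudioParam form of connect() is what makes audio-rate modulation possible: one node's output drives a parameter of another node instead of an input. The browser-only sketch below uses a low-frequency oscillator to modulate a GainNode's gain, producing a simple tremolo; the node names and frequency values are illustrative choices, not requirements:

```javascript
const audioCtx = new AudioContext();

const oscillator = audioCtx.createOscillator(); // audible tone
const gainNode = audioCtx.createGain();
const lfo = audioCtx.createOscillator();        // low-frequency modulator
lfo.frequency.value = 2;                        // 2 Hz wobble

// Audio-data connections: node output -> node input
oscillator.connect(gainNode);
gainNode.connect(audioCtx.destination);

// Parameter connection: node output -> AudioParam
lfo.connect(gainNode.gain);

oscillator.start();
lfo.start();

// Later, tear the graph down again:
// gainNode.disconnect();
```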
Example
This simple snippet of code shows the creation of some audio nodes and how the AudioNode properties and methods can be used. You can find examples of such usage in any of the examples linked to on the Web Audio API landing page (for example Violent Theremin).
var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioContext();

// Create some source and processing nodes
var oscillator = audioCtx.createOscillator();
var gainNode = audioCtx.createGain();

// Link them into a graph ending at the context's destination
oscillator.connect(gainNode);
gainNode.connect(audioCtx.destination);

// Inspect some AudioNode properties on the oscillator
oscillator.context;
oscillator.numberOfInputs;
oscillator.numberOfOutputs;
oscillator.channelCount;
Specifications
Specification | Status | Comment |
---|---|---|
Web Audio API, The definition of 'AudioNode' in that specification. | Working Draft | |
Browser compatibility
Feature | Chrome | Edge | Firefox (Gecko) | Internet Explorer | Opera | Safari (WebKit) |
---|---|---|---|---|---|---|
Basic support | 10.0 webkit | (Yes) | 25.0 (25.0) | No support | 15.0 webkit, 22 (unprefixed) | (Yes) |
channelCount, channelCountMode | (Yes) webkit | (Yes) | (Yes) | No support | (Yes) | No support |
connect (AudioParam) | (Yes) webkit | (Yes) | (Yes) | No support | (Yes) | No support |
Feature | Android | Edge | Firefox Mobile (Gecko) | Firefox OS (Gecko) | IE Phone | Opera Mobile | Safari Mobile |
---|---|---|---|---|---|---|---|
Basic support | ? | (Yes) | 25.0 | 1.2 | ? | ? | ? |
channelCount, channelCountMode | No support | (Yes) | (Yes) | (Yes) | No support | No support | No support |
connect (AudioParam) | No support | (Yes) | (Yes) | (Yes) | No support | No support | No support |