tone
- Version 15.0.4
- Published
- 5.4 MB
- 2 dependencies
- MIT license
Install
npm i tone
yarn add tone
pnpm add tone
Overview
A Web Audio framework for making interactive music in the browser.
Index
Variables
Functions
- connect()
- connectSeries()
- connectSignal()
- dbToGain()
- disconnect()
- fanIn()
- Frequency()
- ftom()
- gainToDb()
- getContext()
- getDestination()
- getDraw()
- getListener()
- getTransport()
- immediate()
- intervalToFrequencyRatio()
- isArray()
- isBoolean()
- isDefined()
- isFunction()
- isNote()
- isNumber()
- isObject()
- isString()
- isUndef()
- loaded()
- Midi()
- mtof()
- now()
- Offline()
- setContext()
- start()
- Ticks()
- Time()
- TransportTime()
Classes
BaseContext
- addAudioWorkletModule()
- clearInterval()
- clearTimeout()
- createAnalyser()
- createAudioWorkletNode()
- createBiquadFilter()
- createBuffer()
- createBufferSource()
- createChannelMerger()
- createChannelSplitter()
- createConstantSource()
- createConvolver()
- createDelay()
- createDynamicsCompressor()
- createGain()
- createIIRFilter()
- createMediaElementSource()
- createMediaStreamDestination()
- createMediaStreamSource()
- createOscillator()
- createPanner()
- createPeriodicWave()
- createStereoPanner()
- createWaveShaper()
- currentTime
- decodeAudioData()
- destination
- draw
- getConstant()
- immediate()
- isOffline
- latencyHint
- listener
- lookAhead
- now()
- rawContext
- resume()
- sampleRate
- setInterval()
- setTimeout()
- state
- toJSON()
- transport
Context
- addAudioWorkletModule()
- clearInterval()
- clearTimeout()
- clockSource
- close()
- createAnalyser()
- createAudioWorkletNode()
- createBiquadFilter()
- createBuffer()
- createBufferSource()
- createChannelMerger()
- createChannelSplitter()
- createConstantSource()
- createConvolver()
- createDelay()
- createDynamicsCompressor()
- createGain()
- createIIRFilter()
- createMediaElementSource()
- createMediaStreamDestination()
- createMediaStreamSource()
- createOscillator()
- createPanner()
- createPeriodicWave()
- createStereoPanner()
- createWaveShaper()
- currentTime
- decodeAudioData()
- destination
- dispose()
- draw
- getConstant()
- getDefaults()
- immediate()
- isOffline
- latencyHint
- listener
- lookAhead
- name
- now()
- rawContext
- resume()
- sampleRate
- setInterval()
- setTimeout()
- state
- transport
- updateInterval
- workletsAreReady()
Param
- apply()
- cancelAndHoldAtTime()
- cancelScheduledValues()
- convert
- defaultValue
- dispose()
- exponentialApproachValueAtTime()
- exponentialRampTo()
- exponentialRampToValueAtTime()
- getDefaults()
- getValueAtTime()
- input
- linearRampTo()
- linearRampToValueAtTime()
- maxValue
- minValue
- name
- overridden
- rampTo()
- setParam()
- setRampPoint()
- setTargetAtTime()
- setValueAtTime()
- setValueCurveAtTime()
- targetRampTo()
- units
- value
Signal
- apply()
- cancelAndHoldAtTime()
- cancelScheduledValues()
- connect()
- convert
- dispose()
- exponentialApproachValueAtTime()
- exponentialRampTo()
- exponentialRampToValueAtTime()
- getDefaults()
- getValueAtTime()
- input
- linearRampTo()
- linearRampToValueAtTime()
- maxValue
- minValue
- name
- output
- overridden
- override
- rampTo()
- setRampPoint()
- setTargetAtTime()
- setValueAtTime()
- setValueCurveAtTime()
- targetRampTo()
- units
- value
Interfaces
Type Aliases
- AMSynthOptions
- AnalyserType
- AutomationEvent
- BaseAudioContextSubset
- BasicPlaybackState
- ContextLatencyHint
- DCMeterOptions
- EnvelopeCurve
- ExcludedFromBaseAudioContext
- FilterOptions
- FilterRollOff
- FrequencyUnit
- GreaterThanOptions
- GreaterThanZeroOptions
- InputNode
- LFOOptions
- MidSideMergeOptions
- MidSideSplitOptions
- MonoOptions
- NoiseType
- OmniOscillatorOptions
- OmniOscSourceType
- OnePoleFilterType
- OutputNode
- PlaybackState
- ToneAudioNodeOptions
- ToneBufferSourceCurve
- ToneEventCallback
- ToneOscillatorType
- WaveShaperMappingFn
Namespaces
Variables
variable Buffer
const Buffer: typeof ToneAudioBuffer;
Deprecated
Use ToneAudioBuffer
variable Buffers
const Buffers: typeof ToneAudioBuffers;
Deprecated
Use ToneAudioBuffers
variable BufferSource
const BufferSource: typeof ToneBufferSource;
Deprecated
Use ToneBufferSource
variable context
const context: BaseContext;
variable Destination
const Destination: DestinationClass;
The Destination (output) belonging to the global Tone.js Context.
See Also
DestinationClass Core
Deprecated
Use getDestination instead
variable Draw
const Draw: DrawClass;
variable Listener
const Listener: ListenerClass;
The ListenerClass belonging to the global Tone.js Context. Core
Deprecated
Use getListener instead
variable Master
const Master: DestinationClass;
Deprecated
Use getDestination instead
variable Transport
const Transport: TransportClass;
The Transport object belonging to the global Tone.js Context.
See Also
TransportClass Core
Deprecated
Use getTransport instead
variable version
const version: string;
Functions
function connect
connect: ( srcNode: OutputNode, dstNode: InputNode, outputNumber?: number, inputNumber?: number) => void;
Connect two nodes together so that signal flows from the first node to the second. Optionally specify the input and output channels.
Parameter srcNode
The source node
Parameter dstNode
The destination node
Parameter outputNumber
The output channel of the srcNode
Parameter inputNumber
The input channel of the dstNode
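This entry has no example; a minimal sketch of typical usage (the routing and node choices are illustrative, not from the source):

```javascript
// route an oscillator through a gain stage using the standalone
// Tone.connect() helper instead of the chained node.connect() method
const osc = new Tone.Oscillator(440, "sine");
const gain = new Tone.Gain(0.5).toDestination();
Tone.connect(osc, gain); // signal now flows osc -> gain -> destination
osc.start();
```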
function connectSeries
connectSeries: (...nodes: InputNode[]) => void;
Connect all of the arguments together in series.
Parameter nodes
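For illustration, a sketch chaining a source through two processing nodes (the particular nodes are hypothetical choices):

```javascript
// equivalent to: source.connect(filter); filter.connect(volume); volume.connect(destination)
const source = new Tone.Oscillator().start();
const filter = new Tone.Filter(800, "lowpass");
const volume = new Tone.Volume(-12);
Tone.connectSeries(source, filter, volume, Tone.getDestination());
```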
function connectSignal
connectSignal: ( signal: OutputNode, destination: InputNode, outputNum?: number, inputNum?: number) => void;
When connecting from a signal, it's necessary to zero out the destination node if that node is also a signal. If the destination is not 0, the values will be summed. This method ensures that the output of the destination signal will be the same as the source signal, making the destination signal a pass-through node.
Parameter signal
The output signal to connect from
Parameter destination
the destination to connect to
Parameter outputNum
the optional output number
Parameter inputNum
the input number
function dbToGain
dbToGain: (db: Decibels) => GainFactor;
Convert decibels into gain.
function disconnect
disconnect: ( srcNode: OutputNode, dstNode?: InputNode, outputNumber?: number, inputNumber?: number) => void;
Disconnect a node from all nodes or optionally include a destination node and input/output channels.
Parameter srcNode
The source node
Parameter dstNode
The destination node
Parameter outputNumber
The output channel of the srcNode
Parameter inputNumber
The input channel of the dstNode
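A sketch of both forms, with and without a destination argument (the routing is illustrative):

```javascript
const osc = new Tone.Oscillator();
const gainA = new Tone.Gain().toDestination();
const gainB = new Tone.Gain().toDestination();
Tone.connect(osc, gainA);
Tone.connect(osc, gainB);
// detach from one destination only...
Tone.disconnect(osc, gainA);
// ...or from everything at once
Tone.disconnect(osc);
```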
function fanIn
fanIn: (...nodes: OutputNode[]) => void;
Connect the output of one or more source nodes to a single destination node.
Parameter nodes
One or more source nodes followed by one destination node
Example 1
const player = new Tone.Player("https://tonejs.github.io/audio/drum-samples/conga-rhythm.mp3");
const player1 = new Tone.Player("https://tonejs.github.io/audio/drum-samples/conga-rhythm.mp3");
const filter = new Tone.Filter("G5").toDestination();
// connect nodes to a common destination
Tone.fanIn(player, player1, filter);
function Frequency
Frequency: ( value?: TimeValue | Frequency, units?: FrequencyUnit) => FrequencyClass;
Convert a value into a FrequencyClass object. Unit
Example 1
const midi = Tone.Frequency("C3").toMidi(); console.log(midi);
Example 2
const hertz = Tone.Frequency(38, "midi").toFrequency(); console.log(hertz);
function ftom
ftom: (frequency: Hertz) => MidiNote;
Convert a frequency value to a MIDI note.
Parameter frequency
The frequency value to convert.
Example 1
Tone.ftom(440); // returns 69
function gainToDb
gainToDb: (gain: GainFactor) => Decibels;
Convert gain to decibels.
function getContext
getContext: () => BaseContext;
Returns the default system-wide Context Core
function getDestination
getDestination: () => DestinationClass;
The Destination (output) belonging to the global Tone.js Context.
See Also
DestinationClass Core
function getDraw
getDraw: () => DrawClass;
Get the singleton attached to the global context. Draw is used to synchronize the draw frame with the Transport's callbacks.
See Also
DrawClass Core
function getListener
getListener: () => ListenerClass;
The ListenerClass belonging to the global Tone.js Context. Core
function getTransport
getTransport: () => TransportClass;
The Transport object belonging to the global Tone.js Context.
See Also
TransportClass Core
function immediate
immediate: () => Seconds;
The current audio context time of the global Context without the Context.lookAhead
See Also
Context.immediate Core
function intervalToFrequencyRatio
intervalToFrequencyRatio: (interval: Interval) => number;
Convert an interval (in semitones) to a frequency ratio.
Parameter interval
the number of semitones above the base note
Example 1
Tone.intervalToFrequencyRatio(0); // 1
Tone.intervalToFrequencyRatio(12); // 2
Tone.intervalToFrequencyRatio(-12); // 0.5
function isArray
isArray: (arg: any) => arg is any[];
Test if the argument is an Array
function isBoolean
isBoolean: (arg: any) => arg is boolean;
Test if the argument is a boolean.
function isDefined
isDefined: <T>(arg: T | undefined) => arg is T;
Test if the arg is not undefined
function isFunction
isFunction: (arg: any) => arg is (a: any) => any;
Test if the arg is a function
function isNote
isNote: ( arg: any) => arg is | 'C0' | 'C1' | 'C2' | 'C3' | 'C4' | 'C5' | 'C6' | 'C7' | 'C8' | 'C9' | 'C10' | 'C11' | 'C-4' | 'C-3' | 'C-2' | 'C-1' | 'Cbb0' | 'Cbb1' | 'Cbb2' | 'Cbb3' | 'Cbb4' | 'Cbb5' | 'Cbb6' | 'Cbb7' | 'Cbb8' | 'Cbb9' | 'Cbb10' | 'Cbb11' | 'Cbb-4' | 'Cbb-3' | 'Cbb-2' | 'Cbb-1' | 'Cb0' | 'Cb1' | 'Cb2' | 'Cb3' | 'Cb4' | 'Cb5' | 'Cb6' | 'Cb7' | 'Cb8' | 'Cb9' | 'Cb10' | 'Cb11' | 'Cb-4' | 'Cb-3' | 'Cb-2' | 'Cb-1' | 'C#0' | 'C#1' | 'C#2' | 'C#3' | 'C#4' | 'C#5' | 'C#6' | 'C#7' | 'C#8' | 'C#9' | 'C#10' | 'C#11' | 'C#-4' | 'C#-3' | 'C#-2' | 'C#-1' | 'Cx0' | 'Cx1' | 'Cx2' | 'Cx3' | 'Cx4' | 'Cx5' | 'Cx6' | 'Cx7' | 'Cx8' | 'Cx9' | 'Cx10' | 'Cx11' | 'Cx-4' | 'Cx-3' | 'Cx-2' | 'Cx-1' | 'D0' | 'D1' | 'D2' | 'D3' | 'D4' | 'D5' | 'D6' | 'D7' | 'D8' | 'D9' | 'D10' | 'D11' | 'D-4' | 'D-3' | 'D-2' | 'D-1' | 'Dbb0' | 'Dbb1' | 'Dbb2' | 'Dbb3' | 'Dbb4' | 'Dbb5' | 'Dbb6' | 'Dbb7' | 'Dbb8' | 'Dbb9' | 'Dbb10' | 'Dbb11' | 'Dbb-4' | 'Dbb-3' | 'Dbb-2' | 'Dbb-1' | 'Db0' | 'Db1' | 'Db2' | 'Db3' | 'Db4' | 'Db5' | 'Db6' | 'Db7' | 'Db8' | 'Db9' | 'Db10' | 'Db11' | 'Db-4' | 'Db-3' | 'Db-2' | 'Db-1' | 'D#0' | 'D#1' | 'D#2' | 'D#3' | 'D#4' | 'D#5' | 'D#6' | 'D#7' | 'D#8' | 'D#9' | 'D#10' | 'D#11' | 'D#-4' | 'D#-3' | 'D#-2' | 'D#-1' | 'Dx0' | 'Dx1' | 'Dx2' | 'Dx3' | 'Dx4' | 'Dx5' | 'Dx6' | 'Dx7' | 'Dx8' | 'Dx9' | 'Dx10' | 'Dx11' | 'Dx-4' | 'Dx-3' | 'Dx-2' | 'Dx-1' | 'E0' | 'E1' | 'E2' | 'E3' | 'E4' | 'E5' | 'E6' | 'E7' | 'E8' | 'E9' | 'E10' | 'E11' | 'E-4' | 'E-3' | 'E-2' | 'E-1' | 'Ebb0' | 'Ebb1' | 'Ebb2' | 'Ebb3' | 'Ebb4' | 'Ebb5' | 'Ebb6' | 'Ebb7' | 'Ebb8' | 'Ebb9' | 'Ebb10' | 'Ebb11' | 'Ebb-4' | 'Ebb-3' | 'Ebb-2' | 'Ebb-1' | 'Eb0' | 'Eb1' | 'Eb2' | 'Eb3' | 'Eb4' | 'Eb5' | 'Eb6' | 'Eb7' | 'Eb8' | 'Eb9' | 'Eb10' | 'Eb11' | 'Eb-4' | 'Eb-3' | 'Eb-2' | 'Eb-1' | 'E#0' | 'E#1' | 'E#2' | 'E#3' | 'E#4' | 'E#5' | 'E#6' | 'E#7' | 'E#8' | 'E#9' | 'E#10' | 'E#11' | 'E#-4' | 'E#-3' | 'E#-2' | 'E#-1' | 'Ex0' | 'Ex1' | 'Ex2' | 'Ex3' | 'Ex4' | 'Ex5' | 'Ex6' | 'Ex7' | 'Ex8' | 'Ex9' | 'Ex10' | 
'Ex11' | 'Ex-4' | 'Ex-3' | 'Ex-2' | 'Ex-1' | 'F0' | 'F1' | 'F2' | 'F3' | 'F4' | 'F5' | 'F6' | 'F7' | 'F8' | 'F9' | 'F10' | 'F11' | 'F-4' | 'F-3' | 'F-2' | 'F-1' | 'Fbb0' | 'Fbb1' | 'Fbb2' | 'Fbb3' | 'Fbb4' | 'Fbb5' | 'Fbb6' | 'Fbb7' | 'Fbb8' | 'Fbb9' | 'Fbb10' | 'Fbb11' | 'Fbb-4' | 'Fbb-3' | 'Fbb-2' | 'Fbb-1' | 'Fb0' | 'Fb1' | 'Fb2' | 'Fb3' | 'Fb4' | 'Fb5' | 'Fb6' | 'Fb7' | 'Fb8' | 'Fb9' | 'Fb10' | 'Fb11' | 'Fb-4' | 'Fb-3' | 'Fb-2' | 'Fb-1' | 'F#0' | 'F#1' | 'F#2' | 'F#3' | 'F#4' | 'F#5' | 'F#6' | 'F#7' | 'F#8' | 'F#9' | 'F#10' | 'F#11' | 'F#-4' | 'F#-3' | 'F#-2' | 'F#-1' | 'Fx0' | 'Fx1' | 'Fx2' | 'Fx3' | 'Fx4' | 'Fx5' | 'Fx6' | 'Fx7' | 'Fx8' | 'Fx9' | 'Fx10' | 'Fx11' | 'Fx-4' | 'Fx-3' | 'Fx-2' | 'Fx-1' | 'G0' | 'G1' | 'G2' | 'G3' | 'G4' | 'G5' | 'G6' | 'G7' | 'G8' | 'G9' | 'G10' | 'G11' | 'G-4' | 'G-3' | 'G-2' | 'G-1' | 'Gbb0' | 'Gbb1' | 'Gbb2' | 'Gbb3' | 'Gbb4' | 'Gbb5' | 'Gbb6' | 'Gbb7' | 'Gbb8' | 'Gbb9' | 'Gbb10' | 'Gbb11' | 'Gbb-4' | 'Gbb-3' | 'Gbb-2' | 'Gbb-1' | 'Gb0' | 'Gb1' | 'Gb2' | 'Gb3' | 'Gb4' | 'Gb5' | 'Gb6' | 'Gb7' | 'Gb8' | 'Gb9' | 'Gb10' | 'Gb11' | 'Gb-4' | 'Gb-3' | 'Gb-2' | 'Gb-1' | 'G#0' | 'G#1' | 'G#2' | 'G#3' | 'G#4' | 'G#5' | 'G#6' | 'G#7' | 'G#8' | 'G#9' | 'G#10' | 'G#11' | 'G#-4' | 'G#-3' | 'G#-2' | 'G#-1' | 'Gx0' | 'Gx1' | 'Gx2' | 'Gx3' | 'Gx4' | 'Gx5' | 'Gx6' | 'Gx7' | 'Gx8' | 'Gx9' | 'Gx10' | 'Gx11' | 'Gx-4' | 'Gx-3' | 'Gx-2' | 'Gx-1' | 'A0' | 'A1' | 'A2' | 'A3' | 'A4' | 'A5' | 'A6' | 'A7' | 'A8' | 'A9' | 'A10' | 'A11' | 'A-4' | 'A-3' | 'A-2' | 'A-1' | 'Abb0' | 'Abb1' | 'Abb2' | 'Abb3' | 'Abb4' | 'Abb5' | 'Abb6' | 'Abb7' | 'Abb8' | 'Abb9' | 'Abb10' | 'Abb11' | 'Abb-4' | 'Abb-3' | 'Abb-2' | 'Abb-1' | 'Ab0' | 'Ab1' | 'Ab2' | 'Ab3' | 'Ab4' | 'Ab5' | 'Ab6' | 'Ab7' | 'Ab8' | 'Ab9' | 'Ab10' | 'Ab11' | 'Ab-4' | 'Ab-3' | 'Ab-2' | 'Ab-1' | 'A#0' | 'A#1' | 'A#2' | 'A#3' | 'A#4' | 'A#5' | 'A#6' | 'A#7' | 'A#8' | 'A#9' | 'A#10' | 'A#11' | 'A#-4' | 'A#-3' | 'A#-2' | 'A#-1' | 'Ax0' | 'Ax1' | 'Ax2' | 'Ax3' | 'Ax4' | 'Ax5' | 'Ax6' | 'Ax7' | 'Ax8' | 'Ax9' 
| 'Ax10' | 'Ax11' | 'Ax-4' | 'Ax-3' | 'Ax-2' | 'Ax-1' | 'B0' | 'B1' | 'B2' | 'B3' | 'B4' | 'B5' | 'B6' | 'B7' | 'B8' | 'B9' | 'B10' | 'B11' | 'B-4' | 'B-3' | 'B-2' | 'B-1' | 'Bbb0' | 'Bbb1' | 'Bbb2' | 'Bbb3' | 'Bbb4' | 'Bbb5' | 'Bbb6' | 'Bbb7' | 'Bbb8' | 'Bbb9' | 'Bbb10' | 'Bbb11' | 'Bbb-4' | 'Bbb-3' | 'Bbb-2' | 'Bbb-1' | 'Bb0' | 'Bb1' | 'Bb2' | 'Bb3' | 'Bb4' | 'Bb5' | 'Bb6' | 'Bb7' | 'Bb8' | 'Bb9' | 'Bb10' | 'Bb11' | 'Bb-4' | 'Bb-3' | 'Bb-2' | 'Bb-1' | 'B#0' | 'B#1' | 'B#2' | 'B#3' | 'B#4' | 'B#5' | 'B#6' | 'B#7' | 'B#8' | 'B#9' | 'B#10' | 'B#11' | 'B#-4' | 'B#-3' | 'B#-2' | 'B#-1' | 'Bx0' | 'Bx1' | 'Bx2' | 'Bx3' | 'Bx4' | 'Bx5' | 'Bx6' | 'Bx7' | 'Bx8' | 'Bx9' | 'Bx10' | 'Bx11' | 'Bx-4' | 'Bx-3' | 'Bx-2' | 'Bx-1';
Test if the argument is in the form of a note in scientific pitch notation. e.g. "C4"
function isNumber
isNumber: (arg: any) => arg is number;
Test if the argument is a number.
function isObject
isObject: (arg: any) => arg is object;
Test if the given argument is an object literal (i.e. {}).
function isString
isString: (arg: any) => arg is string;
Test if the argument is a string.
function isUndef
isUndef: (arg: any) => arg is undefined;
Test if the arg is undefined
function loaded
loaded: () => Promise<void>;
Promise which resolves when all of the loading promises are resolved. Alias for static ToneAudioBuffer.loaded method. Core
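A sketch of typical usage, reusing the sample URL from the fanIn example above:

```javascript
const player = new Tone.Player("https://tonejs.github.io/audio/drum-samples/conga-rhythm.mp3").toDestination();
// resolves once every ToneAudioBuffer created so far has finished loading
Tone.loaded().then(() => {
	player.start();
});
```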
function Midi
Midi: (value?: TimeValue, units?: FrequencyUnit) => MidiClass;
Convert a value into a MidiClass object. Unit
function mtof
mtof: (midi: MidiNote) => Hertz;
Convert a MIDI note to frequency value.
Parameter midi
The midi number to convert.
Returns
The corresponding frequency value.
Example 1
Tone.mtof(69); // 440
function now
now: () => Seconds;
The current audio context time of the global BaseContext.
See Also
Context.now Core
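A sketch of a common pattern: capture the time once so several events are scheduled against the same reference point (the notes and offsets here are illustrative):

```javascript
const synth = new Tone.Synth().toDestination();
// one look-ahead adjusted timestamp for all scheduling below
const time = Tone.now();
synth.triggerAttackRelease("C4", "8n", time);
synth.triggerAttackRelease("E4", "8n", time + 0.5);
```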
function Offline
Offline: ( callback: (context: OfflineContext) => Promise<void> | void, duration: Seconds, channels?: number, sampleRate?: number) => Promise<ToneAudioBuffer>;
Generate a buffer by rendering all of the Tone.js code within the callback using the OfflineAudioContext. The OfflineAudioContext is capable of rendering much faster than real time in many cases. The callback function also passes in an offline instance of Context which can be used to schedule events along the Transport.
Parameter callback
All Tone.js nodes which are created and scheduled within this callback are recorded into the output Buffer.
Parameter duration
The amount of time to record for.
Returns
The promise which is invoked with the ToneAudioBuffer of the recorded output.
Example 1
// render 2 seconds of the oscillator
Tone.Offline(() => {
	// only nodes created in this callback will be recorded
	const oscillator = new Tone.Oscillator().toDestination().start(0);
}, 2).then((buffer) => {
	// do something with the output buffer
	console.log(buffer);
});
Example 2
// can also schedule events along the Transport
// using the passed in Offline Transport
Tone.Offline(({ transport }) => {
	const osc = new Tone.Oscillator().toDestination();
	transport.schedule(time => {
		osc.start(time).stop(time + 0.1);
	}, 1);
	// make sure to start the transport
	transport.start(0.2);
}, 4).then((buffer) => {
	// do something with the output buffer
	console.log(buffer);
});
Core
function setContext
setContext: ( context: BaseContext | AnyAudioContext, disposeOld?: boolean) => void;
Set the default audio context
Parameter context
Parameter disposeOld
Pass true to dispose the old context when setting the new one. Core
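A minimal sketch (the latencyHint choice is an illustrative assumption):

```javascript
// create a context tuned for playback (higher latency, fewer glitches)
// and dispose the old one
const context = new Tone.Context({ latencyHint: "playback" });
Tone.setContext(context, true);
```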
function start
start: () => Promise<void>;
Most browsers will not play _any_ audio until a user clicks something (like a play button). Invoke this method on a click or keypress event handler to start the audio context. More about the Autoplay policy [here](https://developers.google.com/web/updates/2017/09/autoplay-policy-changes#webaudio)
Example 1
document.querySelector("button").addEventListener("click", async () => {
	await Tone.start();
	console.log("context started");
});
Core
function Ticks
Ticks: (value?: TimeValue, units?: TimeBaseUnit) => TicksClass;
Convert a time representation to ticks. Unit
function Time
Time: (value?: TimeValue, units?: TimeBaseUnit) => TimeClass<Seconds>;
Create a TimeClass from a time string or number. The time is computed against the global Tone.Context. To use a specific context, use TimeClass.
Parameter value
A value which represents time
Parameter units
The value's units if they can't be inferred by the value. Unit
Example 1
const time = Tone.Time("4n").toSeconds(); console.log(time);
Example 2
const note = Tone.Time(1).toNotation(); console.log(note);
Example 3
const freq = Tone.Time(0.5).toFrequency(); console.log(freq);
function TransportTime
TransportTime: (value?: TimeValue, units?: TimeBaseUnit) => TransportTimeClass;
TransportTime is a time along the Transport's timeline. It is similar to Tone.Time, but instead of evaluating against the AudioContext's clock, it is evaluated against the Transport's position. See [TransportTime wiki](https://github.com/Tonejs/Tone.js/wiki/TransportTime). Unit
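A sketch of the two most common forms (values depend on the Transport's tempo; at the default 120 BPM a quarter note is 0.5 seconds):

```javascript
// evaluated against the Transport's position, not the AudioContext clock
const quarter = Tone.TransportTime("4n").toSeconds();
// "@1m" quantizes to the start of the next measure of the Transport's position
const nextMeasure = Tone.TransportTime("@1m").toSeconds();
```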
Classes
class Abs
class Abs extends SignalOperator<ToneAudioNodeOptions> {}
Return the absolute value of an incoming signal.
Example 1
return Tone.Offline(() => {
	const abs = new Tone.Abs().toDestination();
	const signal = new Tone.Signal(1);
	signal.rampTo(-1, 0.5);
	signal.connect(abs);
}, 0.5, 1);
Signal
class Add
class Add extends Signal {}
Add a signal and a number, or two signals. When no value is passed into the constructor, Tone.Add will sum the input and addend signals. If a value is passed into the constructor, it will be added to the input.
Example 1
return Tone.Offline(() => {
	const add = new Tone.Add(2).toDestination();
	add.addend.setValueAtTime(1, 0.2);
	const signal = new Tone.Signal(2);
	// add a signal and a scalar
	signal.connect(add);
	signal.setValueAtTime(1, 0.1);
}, 0.5, 1);
Signal
constructor
constructor(value?: number);
Parameter value
If no value is provided, will sum the input and addend.
constructor
constructor(options?: Partial<SignalOptions<'number'>>);
property addend
readonly addend: Param<'number'>;
The value which is added to the input signal
property input
readonly input: Gain<'gain'>;
property name
readonly name: string;
property output
readonly output: Gain<'gain'>;
property override
override: boolean;
method dispose
dispose: () => this;
method getDefaults
static getDefaults: () => SignalOptions<'number'>;
class AMOscillator
class AMOscillator extends Source<AMOscillatorOptions> implements ToneOscillatorInterface {}
An amplitude modulated oscillator node. It is implemented with two oscillators, one which modulates the other's amplitude through a gain node.
+-------------+       +----------+
| Carrier Osc +>------> GainNode |
+-------------+       |          +--->Output
                 +---> gain      |
+---------------+|    +----------+
| Modulator Osc +>---+
+---------------+
Example 1
return Tone.Offline(() => {
	const amOsc = new Tone.AMOscillator(30, "sine", "square").toDestination().start();
}, 0.2, 1);
Source
constructor
constructor( frequency?: Frequency, type?: ToneOscillatorType, modulationType?: ToneOscillatorType);
Parameter frequency
The starting frequency of the oscillator.
Parameter type
The type of the carrier oscillator.
Parameter modulationType
The type of the modulator oscillator.
constructor
constructor(options?: Partial<AMConstructorOptions>);
property baseType
baseType: OscillatorType;
property detune
readonly detune: Signal<'cents'>;
property frequency
readonly frequency: Signal<'frequency'>;
property harmonicity
readonly harmonicity: Signal<'positive'>;
Harmonicity is the frequency ratio between the carrier and the modulator oscillators. A harmonicity of 1 gives both oscillators the same frequency. Harmonicity = 2 means a change of an octave.
Example 1
const amOsc = new Tone.AMOscillator("D2").toDestination().start();
Tone.Transport.scheduleRepeat(time => {
	amOsc.harmonicity.setValueAtTime(1, time);
	amOsc.harmonicity.setValueAtTime(0.5, time + 0.5);
	amOsc.harmonicity.setValueAtTime(1.5, time + 1);
	amOsc.harmonicity.setValueAtTime(1, time + 2);
	amOsc.harmonicity.linearRampToValueAtTime(2, time + 4);
}, 4);
Tone.Transport.start();
property modulationType
modulationType: ToneOscillatorType;
The type of the modulator oscillator
property name
readonly name: string;
property partialCount
partialCount: number;
property partials
partials: number[];
property phase
phase: number;
property type
type: ToneOscillatorType;
The type of the carrier oscillator
method asArray
asArray: (length?: number) => Promise<Float32Array>;
method dispose
dispose: () => this;
Clean up.
method getDefaults
static getDefaults: () => AMOscillatorOptions;
class AmplitudeEnvelope
class AmplitudeEnvelope extends Envelope {}
AmplitudeEnvelope is a Tone.Envelope connected to a gain node. Unlike Tone.Envelope, which outputs the envelope's value, AmplitudeEnvelope accepts an audio signal as the input and will apply the envelope to the amplitude of the signal. Read more about ADSR Envelopes on [Wikipedia](https://en.wikipedia.org/wiki/Synthesizer#ADSR_envelope).
Example 1
return Tone.Offline(() => {
	const ampEnv = new Tone.AmplitudeEnvelope({
		attack: 0.1,
		decay: 0.2,
		sustain: 1.0,
		release: 0.8
	}).toDestination();
	// create an oscillator and connect it
	const osc = new Tone.Oscillator().connect(ampEnv).start();
	// trigger the envelope's attack and release "8t" apart
	ampEnv.triggerAttackRelease("8t");
}, 1.5, 1);
Component
constructor
constructor(attack?: Time, decay?: Time, sustain?: number, release?: Time);
Parameter attack
The amount of time it takes for the envelope to go from 0 to its maximum value.
Parameter decay
The period of time after the attack that it takes for the envelope to fall to the sustain value. Value must be greater than 0.
Parameter sustain
The percent of the maximum value that the envelope rests at until the release is triggered.
Parameter release
The amount of time after the release is triggered it takes to reach 0. Value must be greater than 0.
constructor
constructor(options?: Partial<EnvelopeOptions>);
property input
input: Gain<'gain'>;
property name
readonly name: string;
property output
output: Gain<'gain'>;
method dispose
dispose: () => this;
Clean up
class AMSynth
class AMSynth extends ModulationSynth<AMSynthOptions> {}
AMSynth uses the output of one Tone.Synth to modulate the amplitude of another Tone.Synth. The harmonicity (the ratio between the two signals) affects the timbre of the output signal greatly. Read more about Amplitude Modulation Synthesis on [SoundOnSound](https://web.archive.org/web/20160404103653/http://www.soundonsound.com:80/sos/mar00/articles/synthsecrets.htm).
Example 1
const synth = new Tone.AMSynth().toDestination();
synth.triggerAttackRelease("C4", "4n");
Instrument
constructor
constructor(options?: RecursivePartial<ModulationSynthOptions>);
property name
readonly name: string;
method dispose
dispose: () => this;
class Analyser
class Analyser extends ToneAudioNode<AnalyserOptions> {}
Wrapper around the native Web Audio's [AnalyserNode](http://webaudio.github.io/web-audio-api/#idl-def-AnalyserNode). Extracts FFT or Waveform data from the incoming signal. Component
constructor
constructor(type?: AnalyserType, size?: number);
Parameter type
The return type of the analysis, either "fft", or "waveform".
Parameter size
The size of the FFT. This must be a power of two in the range 16 to 16384.
constructor
constructor(options?: Partial<AnalyserOptions>);
property channels
readonly channels: number;
The number of channels the analyser does the analysis on. Channel separation is done using Split
property input
readonly input: InputNode;
property name
readonly name: string;
property output
readonly output: OutputNode;
property size
size: number;
The size of analysis. This must be a power of two in the range 16 to 16384.
property smoothing
smoothing: number;
0 represents no time averaging with the last analysis frame.
property type
type: AnalyserType;
The analysis function returned by analyser.getValue(), either "fft" or "waveform".
method dispose
dispose: () => this;
Clean up.
method getDefaults
static getDefaults: () => AnalyserOptions;
method getValue
getValue: () => Float32Array | Float32Array[];
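This class has no example; a sketch of typical waveform inspection (the polling loop is an illustrative assumption):

```javascript
// inspect the waveform of a running oscillator
const analyser = new Tone.Analyser("waveform", 256);
const osc = new Tone.Oscillator().connect(analyser).start();
// poll in an animation loop; each call returns the most recent frame
function draw() {
	const values = analyser.getValue(); // Float32Array of 256 samples
	requestAnimationFrame(draw);
}
draw();
```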
class AudioToGain
class AudioToGain extends SignalOperator<ToneAudioNodeOptions> {}
AudioToGain converts an input in AudioRange [-1,1] to NormalRange [0,1].
See Also
GainToAudio. Signal
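A sketch of a typical use: building a simple tremolo by rescaling an LFO-rate oscillator into a gain control signal (the rates and routing are illustrative):

```javascript
// rescale a 4 Hz oscillator from AudioRange [-1, 1] to NormalRange [0, 1]
// and drive a gain node's gain parameter with it
const modulator = new Tone.Oscillator(4).start();
const audioToGain = new Tone.AudioToGain();
const gain = new Tone.Gain(0).toDestination();
modulator.connect(audioToGain);
audioToGain.connect(gain.gain); // gain now swings between 0 and 1
const source = new Tone.Oscillator(220).connect(gain).start();
```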
class AutoFilter
class AutoFilter extends LFOEffect<AutoFilterOptions> {}
AutoFilter is a Tone.Filter with a Tone.LFO connected to the filter cutoff frequency. Setting the LFO rate and depth allows for control over the filter modulation rate and depth.
Example 1
// create an autofilter and start its LFO
const autoFilter = new Tone.AutoFilter("4n").toDestination().start();
// route an oscillator through the filter and start it
const oscillator = new Tone.Oscillator().connect(autoFilter).start();
Effect
constructor
constructor(frequency?: Frequency, baseFrequency?: Frequency, octaves?: number);
Parameter frequency
The rate of the LFO.
Parameter baseFrequency
The lower value of the LFO's oscillation
Parameter octaves
The number of octaves above the baseFrequency
constructor
constructor(options?: Partial<AutoFilterOptions>);
property baseFrequency
baseFrequency: Frequency;
The minimum value of the filter's cutoff frequency.
property filter
readonly filter: Filter;
The filter node
property name
readonly name: string;
property octaves
octaves: number;
The maximum value of the filter's cutoff frequency.
method dispose
dispose: () => this;
method getDefaults
static getDefaults: () => AutoFilterOptions;
class AutoPanner
class AutoPanner extends LFOEffect<AutoPannerOptions> {}
AutoPanner is a Panner with an LFO connected to the pan amount. [Related Reading](https://www.ableton.com/en/blog/autopan-chopper-effect-and-more-liveschool/).
Example 1
// create an autopanner and start it
const autoPanner = new Tone.AutoPanner("4n").toDestination().start();
// route an oscillator through the panner and start it
const oscillator = new Tone.Oscillator().connect(autoPanner).start();
Effect
constructor
constructor(frequency?: Frequency);
Parameter frequency
Rate of left-right oscillation.
constructor
constructor(options?: Partial<AutoPannerOptions>);
property name
readonly name: string;
method dispose
dispose: () => this;
method getDefaults
static getDefaults: () => AutoPannerOptions;
class AutoWah
class AutoWah extends Effect<AutoWahOptions> {}
AutoWah connects a Follower to a Filter. The frequency of the filter follows the input amplitude curve. Inspiration from [Tuna.js](https://github.com/Dinahmoe/tuna).
Example 1
const autoWah = new Tone.AutoWah(50, 6, -30).toDestination();
// initialize the synth and connect to autowah
const synth = new Tone.Synth().connect(autoWah);
// Q value influences the effect of the wah - default is 2
autoWah.Q.value = 6;
// more audible on higher notes
synth.triggerAttackRelease("C4", "8n");
Effect
constructor
constructor(baseFrequency?: Frequency, octaves?: number, sensitivity?: number);
Parameter baseFrequency
The frequency the filter is set to at the low point of the wah
Parameter octaves
The number of octaves above the baseFrequency the filter will sweep to when fully open.
Parameter sensitivity
The decibel threshold sensitivity for the incoming signal. Normal range of -40 to 0.
constructor
constructor(options?: Partial<AutoWahOptions>);
property baseFrequency
baseFrequency: Frequency;
The base frequency from which the sweep will start.
property follower
follower: Time;
The follower's smoothing time
property gain
readonly gain: Signal<'decibels'>;
The gain of the filter.
property name
readonly name: string;
property octaves
octaves: number;
The number of octaves that the filter will sweep above the baseFrequency.
property Q
readonly Q: Signal<'positive'>;
The quality of the filter.
property sensitivity
sensitivity: number;
The sensitivity controlling how responsive the filter is to the input signal.
method dispose
dispose: () => this;
method getDefaults
static getDefaults: () => AutoWahOptions;
class BaseContext
abstract class BaseContext extends Emitter<'statechange' | 'tick'> implements BaseAudioContextSubset {}
property currentTime
readonly currentTime: number;
property destination
readonly destination: Destination;
property draw
readonly draw: Draw;
property isOffline
readonly isOffline: boolean;
property latencyHint
abstract latencyHint: number | AudioContextLatencyCategory;
property listener
readonly listener: Listener;
property lookAhead
abstract lookAhead: number;
property rawContext
readonly rawContext: AnyAudioContext;
property sampleRate
readonly sampleRate: number;
property state
readonly state: AudioContextState;
property transport
readonly transport: Transport;
method addAudioWorkletModule
abstract addAudioWorkletModule: (_url: string) => Promise<void>;
method clearInterval
abstract clearInterval: (_id: number) => this;
method clearTimeout
abstract clearTimeout: (_id: number) => this;
method createAnalyser
abstract createAnalyser: () => AnalyserNode;
method createAudioWorkletNode
abstract createAudioWorkletNode: ( _name: string, _options?: Partial<AudioWorkletNodeOptions>) => AudioWorkletNode;
method createBiquadFilter
abstract createBiquadFilter: () => BiquadFilterNode;
method createBuffer
abstract createBuffer: ( _numberOfChannels: number, _length: number, _sampleRate: number) => AudioBuffer;
method createBufferSource
abstract createBufferSource: () => AudioBufferSourceNode;
method createChannelMerger
abstract createChannelMerger: ( _numberOfInputs?: number | undefined) => ChannelMergerNode;
method createChannelSplitter
abstract createChannelSplitter: ( _numberOfOutputs?: number | undefined) => ChannelSplitterNode;
method createConstantSource
abstract createConstantSource: () => ConstantSourceNode;
method createConvolver
abstract createConvolver: () => ConvolverNode;
method createDelay
abstract createDelay: (_maxDelayTime?: number | undefined) => DelayNode;
method createDynamicsCompressor
abstract createDynamicsCompressor: () => DynamicsCompressorNode;
method createGain
abstract createGain: () => GainNode;
method createIIRFilter
abstract createIIRFilter: ( _feedForward: number[] | Float32Array, _feedback: number[] | Float32Array) => IIRFilterNode;
method createMediaElementSource
abstract createMediaElementSource: ( _element: HTMLMediaElement) => MediaElementAudioSourceNode;
method createMediaStreamDestination
abstract createMediaStreamDestination: () => MediaStreamAudioDestinationNode;
method createMediaStreamSource
abstract createMediaStreamSource: ( _stream: MediaStream) => MediaStreamAudioSourceNode;
method createOscillator
abstract createOscillator: () => OscillatorNode;
method createPanner
abstract createPanner: () => PannerNode;
method createPeriodicWave
abstract createPeriodicWave: ( _real: number[] | Float32Array, _imag: number[] | Float32Array, _constraints?: PeriodicWaveConstraints | undefined) => PeriodicWave;
method createStereoPanner
abstract createStereoPanner: () => StereoPannerNode;
method createWaveShaper
abstract createWaveShaper: () => WaveShaperNode;
method decodeAudioData
abstract decodeAudioData: (_audioData: ArrayBuffer) => Promise<AudioBuffer>;
method getConstant
abstract getConstant: (_val: number) => AudioBufferSourceNode;
method immediate
abstract immediate: () => Seconds;
method now
abstract now: () => Seconds;
method resume
abstract resume: () => Promise<void>;
method setInterval
abstract setInterval: ( _fn: (...args: any[]) => void, _interval: Seconds) => number;
method setTimeout
abstract setTimeout: ( _fn: (...args: any[]) => void, _timeout: Seconds) => number;
method toJSON
toJSON: () => Record<string, any>;
class BiquadFilter
class BiquadFilter extends ToneAudioNode<BiquadFilterOptions> {}
Thin wrapper around the native Web Audio [BiquadFilterNode](https://webaudio.github.io/web-audio-api/#biquadfilternode). BiquadFilter is similar to Filter but doesn't have the option to set the "rolloff" value.
Component
constructor
constructor(frequency?: Frequency, type?: BiquadFilterType);
Parameter frequency
The cutoff frequency of the filter.
Parameter type
The type of filter.
constructor
constructor(options?: Partial<BiquadFilterOptions>);
property detune
readonly detune: Param<'cents'>;
A detune value, in cents, for the frequency.
property frequency
readonly frequency: Param<'frequency'>;
The frequency of the filter
property gain
readonly gain: Param<'decibels'>;
The gain of the filter. Its value is in dB units. The gain is only used for lowshelf, highshelf, and peaking filters.
property input
readonly input: BiquadFilterNode;
property name
readonly name: string;
property output
readonly output: BiquadFilterNode;
property Q
readonly Q: Param<'number'>;
The Q factor of the filter. For lowpass and highpass filters the Q value is interpreted to be in dB. For these filters the nominal range is [-Q_lim, Q_lim], where Q_lim is the largest value for which 10^(Q/20) does not overflow; this is approximately 770.63678. For the bandpass, notch, allpass, and peaking filters, this value is linear. It is related to the bandwidth of the filter and hence should be a positive value. The nominal range is [0, 3.4028235e38], the upper limit being the most positive single-precision float. Q is not used for the lowshelf and highshelf filters.
property type
type: BiquadFilterType;
The type of this BiquadFilterNode. For a complete list of types and their attributes, see the [Web Audio API](https://webaudio.github.io/web-audio-api/#dom-biquadfiltertype-lowpass)
method dispose
dispose: () => this;
method getDefaults
static getDefaults: () => BiquadFilterOptions;
method getFrequencyResponse
getFrequencyResponse: (len?: number) => Float32Array;
Get the frequency response curve. This curve represents how the filter responds to frequencies between 20 Hz and 20 kHz.
Parameter len
The number of values to return
Returns
The frequency response curve between 20 Hz and 20 kHz
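Although not part of Tone's API, the span of the sampled curve can be illustrated in plain JavaScript: len logarithmically spaced frequencies between 20 Hz and 20 kHz (responseFrequencies is a hypothetical helper for illustration only):

```javascript
// Generate `len` logarithmically spaced frequencies between 20 Hz and 20 kHz,
// the span over which getFrequencyResponse samples the filter.
// Hypothetical helper, not part of Tone's API.
function responseFrequencies(len = 128) {
  const minHz = 20;
  const maxHz = 20000;
  const out = new Float32Array(len);
  for (let i = 0; i < len; i++) {
    // exponential interpolation gives each octave equal weight
    out[i] = minHz * Math.pow(maxHz / minHz, i / (len - 1));
  }
  return out;
}
```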
class BitCrusher
class BitCrusher extends Effect<BitCrusherOptions> {}
BitCrusher down-samples the incoming signal to a different bit depth. Lowering the bit depth of the signal creates distortion. Read more about BitCrushing on [Wikipedia](https://en.wikipedia.org/wiki/Bitcrusher).
Example 1
// initialize crusher and route a synth through it
const crusher = new Tone.BitCrusher(4).toDestination();
const synth = new Tone.Synth().connect(crusher);
synth.triggerAttackRelease("C2", 2);
Effect
constructor
constructor(bits?: number);
constructor
constructor(options?: Partial<BitCrusherWorkletOptions>);
property bits
readonly bits: Param<'positive'>;
The bit depth of the effect. Range: 1-16.
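Bit-depth reduction is, at its core, quantization of each sample to the grid implied by the bit count. A plain-JS sketch (crush is a hypothetical helper, not Tone's AudioWorklet code):

```javascript
// Quantize a sample in [-1, 1] to the grid implied by the bit depth.
// Hypothetical illustration; Tone's BitCrusher runs in an AudioWorklet.
function crush(sample, bits) {
  const step = Math.pow(0.5, bits - 1); // quantization step size
  return step * Math.floor(sample / step + 0.5);
}
```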
property name
readonly name: string;
method dispose
dispose: () => this;
method getDefaults
static getDefaults: () => BitCrusherOptions;
class Channel
class Channel extends ToneAudioNode<ChannelOptions> {}
constructor
constructor(volume?: number, pan?: number);
Parameter volume
The output volume.
Parameter pan
the initial pan
constructor
constructor(options?: Partial<ChannelOptions>);
property input
readonly input: InputNode;
property mute
mute: boolean;
Mute/unmute the volume
property muted
readonly muted: boolean;
If the current instance is muted, i.e. another instance is soloed, or the channel is muted
property name
readonly name: string;
property output
readonly output: OutputNode;
property pan
readonly pan: Param<'audioRange'>;
The L/R panning control. -1 = hard left, 1 = hard right. Range: -1 to 1.
property solo
solo: boolean;
property volume
readonly volume: Param<'decibels'>;
The volume control in decibels.
method dispose
dispose: () => this;
method getDefaults
static getDefaults: () => ChannelOptions;
method receive
receive: (name: string) => this;
Receive audio from a channel which was connected with send.
Parameter name
The channel name to receive audio from.
method send
send: (name: string, volume?: Decibels) => Gain<'decibels'>;
Send audio to another channel using a string.
send is a lot like connect, except it uses a string instead of an object. This can be useful in large applications to decouple sections, since send and receive can be invoked separately in order to connect an object.
Parameter name
The channel name to send the audio to.
Parameter volume
The amount of the signal to send. Defaults to 0 dB, i.e. the entire signal is sent.
Returns
Returns the gain node of this connection.
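The send/receive pair amounts to a registry of named buses that lets distant parts of an app connect without sharing object references. A minimal plain-JS sketch of the pattern (not Tone's implementation; names are hypothetical):

```javascript
// A named-bus registry: send() routes a source onto a bus created on demand,
// receive() returns everything routed to that bus name.
const buses = new Map();

function send(name, source) {
  if (!buses.has(name)) buses.set(name, []); // lazily create the bus
  buses.get(name).push(source);
  return source;
}

function receive(name) {
  return buses.get(name) ?? [];
}
```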
class Chebyshev
class Chebyshev extends Effect<ChebyshevOptions> {}
Chebyshev is a waveshaper which is good for making different types of distortion sounds. Note that odd orders sound very different from even ones, and order = 1 is no change. Read more at [music.columbia.edu](http://music.columbia.edu/cmc/musicandcomputers/chapter4/04_06.php).
Example 1
// create a new cheby
const cheby = new Tone.Chebyshev(50).toDestination();
// create a monosynth connected to our cheby
const synth = new Tone.MonoSynth().connect(cheby);
synth.triggerAttackRelease("C2", 0.4);
Effect
constructor
constructor(order?: number);
Parameter order
The order of the Chebyshev polynomial. Normal range is between 1 and 100.
constructor
constructor(options?: Partial<ChebyshevOptions>);
property name
readonly name: string;
property order
order: number;
The order of the Chebyshev polynomial which creates the equation which is applied to the incoming signal through a Tone.WaveShaper. Must be an integer. The equations are in the form:
order 2: 2x^2 + 1
order 3: 4x^3 + 3x
Range: 1-100
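For intuition, the classical Chebyshev polynomials can be evaluated with the recurrence T_0 = 1, T_1 = x, T_n = 2x·T_(n-1) - T_(n-2). Note the textbook T_2(x) = 2x^2 - 1 differs in a sign from the constant printed above; this is an illustration, not Tone's internal WaveShaper code:

```javascript
// Evaluate the classical Chebyshev polynomial T_order(x) by recurrence.
function chebyshev(order, x) {
  let prev = 1; // T_0
  let curr = x; // T_1
  if (order === 0) return prev;
  for (let n = 2; n <= order; n++) {
    const next = 2 * x * curr - prev;
    prev = curr;
    curr = next;
  }
  return curr;
}
```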
property oversample
oversample: OverSampleType;
The oversampling of the effect. Can either be "none", "2x" or "4x".
method dispose
dispose: () => this;
method getDefaults
static getDefaults: () => ChebyshevOptions;
class Chorus
class Chorus extends StereoFeedbackEffect<ChorusOptions> {}
Chorus is a stereo chorus effect composed of a left and right delay with an LFO applied to the delayTime of each channel. When feedback is set to a value larger than 0, you also get Flanger-type effects. Inspiration from [Tuna.js](https://github.com/Dinahmoe/tuna/blob/master/tuna.js). Read more on the chorus effect on [Sound On Sound](http://www.soundonsound.com/sos/jun04/articles/synthsecrets.htm).
Example 1
const chorus = new Tone.Chorus(4, 2.5, 0.5).toDestination().start();
const synth = new Tone.PolySynth().connect(chorus);
synth.triggerAttackRelease(["C3", "E3", "G3"], "8n");
Effect
constructor
constructor(frequency?: Frequency, delayTime?: number, depth?: number);
Parameter frequency
The frequency of the LFO.
Parameter delayTime
The delay of the chorus effect in ms.
Parameter depth
The depth of the chorus.
constructor
constructor(options?: Partial<ChorusOptions>);
property delayTime
delayTime: number;
The delayTime in milliseconds of the chorus. A larger delayTime will give a more pronounced effect. The nominal range of delayTime is between 2 and 20 ms.
property depth
depth: number;
The depth of the effect. A depth of 1 makes the delayTime modulate between 0 and 2*delayTime (centered around the delayTime).
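Put differently, with an LFO value swinging between -1 and 1, the instantaneous delay is delayTime · (1 + depth · lfo). A hypothetical sketch of that relationship:

```javascript
// Instantaneous delay for a given LFO position in [-1, 1]:
// depth = 1 swings the delay between 0 and 2 * delayTime.
function modulatedDelay(delayTime, depth, lfo) {
  return delayTime * (1 + depth * lfo);
}
```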
property frequency
readonly frequency: Signal<'frequency'>;
The frequency of the LFO which modulates the delayTime.
property name
readonly name: string;
property spread
spread: number;
Amount of stereo spread. When set to 0, both LFOs are panned centrally. When set to 180, the LFOs are panned hard left and hard right, respectively.
property type
type: ToneOscillatorType;
The oscillator type of the LFO.
method dispose
dispose: () => this;
method getDefaults
static getDefaults: () => ChorusOptions;
method start
start: (time?: Time) => this;
Start the effect.
method stop
stop: (time?: Time) => this;
Stop the LFO.
method sync
sync: () => this;
Sync the effect to the transport.
method unsync
unsync: () => this;
Unsync the effect from the transport.
class Clock
class Clock<TypeName extends 'bpm' | 'hertz' = 'hertz'> extends ToneWithContext<ClockOptions> implements Emitter<ClockEvent> {}
A sample-accurate clock which provides a callback at the given rate. While the callback is not sample-accurate (it is still susceptible to imprecise JS timing), the time passed in as the argument to the callback is precise. For most applications, it is better to use Tone.Transport instead of the Clock by itself, since the Transport can synchronize multiple callbacks.
Example 1
// the callback will be invoked approximately once a second
// and will print times exactly one second apart.
const clock = new Tone.Clock((time) => {
  console.log(time);
}, 1);
clock.start();
Core
constructor
constructor(callback?: ClockCallback, frequency?: Frequency);
Parameter callback
The callback to be invoked with the time of the audio event
Parameter frequency
The rate of the callback
constructor
constructor(options: Partial<ClockOptions>);
property callback
callback: ClockCallback;
The callback function to invoke at the scheduled tick.
property emit
emit: (event: any, ...args: any[]) => this;
property frequency
frequency: TickSignal<TypeName>;
The rate the callback function should be invoked.
property name
readonly name: string;
property off
off: (event: ClockEvent, callback?: (...args: any[]) => void) => this;
property on
on: (event: ClockEvent, callback: (...args: any[]) => void) => this;
property once
once: (event: ClockEvent, callback: (...args: any[]) => void) => this;
property seconds
seconds: number;
The time since ticks=0 that the Clock has been running. Accounts for tempo curves
property state
readonly state: PlaybackState;
Returns the playback state of the source, either "started", "stopped" or "paused".
property ticks
ticks: number;
The number of times the callback was invoked. Starts counting at 0 and increments after the callback was invoked.
method dispose
dispose: () => this;
Clean up
method getDefaults
static getDefaults: () => ClockOptions;
method getSecondsAtTime
getSecondsAtTime: (time: Time) => Seconds;
Return the elapsed seconds at the given time.
Parameter time
When to get the elapsed seconds
Returns
The number of elapsed seconds
method getStateAtTime
getStateAtTime: (time: Time) => PlaybackState;
Returns the scheduled state at the given time.
Parameter time
The time to query
Returns
The name of the state input in setStateAtTime
Example 1
const clock = new Tone.Clock();
clock.start("+0.1");
clock.getStateAtTime("+0.1"); // returns "started"
method getTicksAtTime
getTicksAtTime: (time?: Time) => Ticks;
Get the clock's ticks at the given time.
Parameter time
When to get the tick value
Returns
The tick value at the given time
method getTimeOfTick
getTimeOfTick: (tick: Ticks, before?: number) => Seconds;
Get the time of the given tick. The second argument is when to test before. Since ticks can be set (with setTicksAtTime) there may be multiple times for a given tick value.
Parameter tick
The tick number.
Parameter before
When to measure the tick value from
Returns
The time of the tick
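When the frequency is constant (no tempo curves or setTicksAtTime calls), the tick-to-time relationship is simply linear. A hypothetical sketch of that special case:

```javascript
// Time of a tick at a constant clock frequency: time = tick / frequency.
// Tempo curves make the real computation integral-based instead.
function timeOfTickConstant(tick, frequencyHz) {
  return tick / frequencyHz;
}
```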
method nextTickTime
nextTickTime: (offset: Ticks, when: Time) => Seconds;
Get the time of the next tick
Parameter offset
The tick number.
method pause
pause: (time?: Time) => this;
Pause the clock. Pausing does not reset the tick counter.
Parameter time
The time when the clock should stop.
method setTicksAtTime
setTicksAtTime: (ticks: Ticks, time: Time) => this;
Set the clock's ticks at the given time.
Parameter ticks
The tick value to set
Parameter time
When to set the tick value
method start
start: (time?: Time, offset?: Ticks) => this;
Start the clock at the given time. Optionally pass in an offset of where to start the tick counter from.
Parameter time
The time the clock should start
Parameter offset
Where the tick counter starts counting from.
method stop
stop: (time?: Time) => this;
Stop the clock. Stopping the clock resets the tick counter to 0.
Parameter time
The time when the clock should stop.
Example 1
const clock = new Tone.Clock((time) => {
  console.log(time);
}, 1);
clock.start();
// stop the clock after 10 seconds
clock.stop("+10");
class Compressor
class Compressor extends ToneAudioNode<CompressorOptions> {}
Compressor is a thin wrapper around the Web Audio [DynamicsCompressorNode](http://webaudio.github.io/web-audio-api/#the-dynamicscompressornode-interface). Compression reduces the volume of loud sounds or amplifies quiet sounds by narrowing or "compressing" an audio signal's dynamic range. Read more on [Wikipedia](https://en.wikipedia.org/wiki/Dynamic_range_compression).
Example 1
const comp = new Tone.Compressor(-30, 3);
Component
constructor
constructor(threshold?: number, ratio?: number);
Parameter threshold
The value above which the compression starts to be applied.
Parameter ratio
The gain reduction ratio.
constructor
constructor(options?: Partial<CompressorOptions>);
property attack
readonly attack: Param<'time'>;
The amount of time (in seconds) to reduce the gain by 10 dB. Range: 0-1.
property input
readonly input: DynamicsCompressorNode;
property knee
readonly knee: Param<'decibels'>;
A decibel value representing the range above the threshold where the curve smoothly transitions to the "ratio" portion. Range: 0-40.
property name
readonly name: string;
property output
readonly output: DynamicsCompressorNode;
property ratio
readonly ratio: Param<'positive'>;
The amount of dB change in input for a 1 dB change in output. Range: 1-20.
property reduction
readonly reduction: number;
A read-only decibel value for metering purposes, representing the current amount of gain reduction that the compressor is applying to the signal. If fed no signal the value will be 0 (no gain reduction).
property release
readonly release: Param<'time'>;
The amount of time (in seconds) to increase the gain by 10 dB. Range: 0-1.
property threshold
readonly threshold: Param<'decibels'>;
The decibel value above which the compression will start taking effect. Range: -100 to 0.
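Threshold and ratio combine into the standard static compression curve: below the threshold the signal passes unchanged; above it, the overshoot is divided by the ratio (ignoring the knee). A plain-JS sketch, not the browser's internal DynamicsCompressorNode code:

```javascript
// Static compression curve in dB, with a hard knee.
function compressDb(inputDb, thresholdDb, ratio) {
  if (inputDb <= thresholdDb) return inputDb; // below threshold: unchanged
  return thresholdDb + (inputDb - thresholdDb) / ratio;
}
```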
method dispose
dispose: () => this;
method getDefaults
static getDefaults: () => CompressorOptions;
class Context
class Context extends BaseContext {}
Wrapper around the native AudioContext.
Core
constructor
constructor(context?: AnyAudioContext);
constructor
constructor(options?: Partial<ContextOptions>);
property clockSource
clockSource: TickerClockSource;
What the source of the clock is, either "worker" (default), "timeout", or "offline" (none).
property currentTime
readonly currentTime: number;
The current time in seconds of the AudioContext.
property destination
destination: Destination;
A reference to the Context's destination node.
property draw
draw: Draw;
This is the Draw object for the context which is useful for synchronizing the draw frame with the Tone.js clock.
property isOffline
readonly isOffline: boolean;
Indicates if the context is an OfflineAudioContext or an AudioContext
property latencyHint
readonly latencyHint: number | AudioContextLatencyCategory;
The type of playback, which affects tradeoffs between audio output latency and responsiveness. In addition to setting the value in seconds, the latencyHint also accepts the strings "interactive" (prioritizes low latency), "playback" (prioritizes sustained playback), and "balanced" (balances latency and performance).
Example 1
// prioritize sustained playback
const context = new Tone.Context({ latencyHint: "playback" });
// set this context as the global Context
Tone.setContext(context);
// the global context is gettable with Tone.getContext()
console.log(Tone.getContext().latencyHint);
property listener
listener: Listener;
The listener
property lookAhead
lookAhead: number;
The amount of time into the future events are scheduled. Giving Web Audio a short amount of time into the future to schedule events can reduce clicks and improve performance. This value can be set to 0 to get the lowest latency. Adjusting this value also affects the updateInterval.
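The lookAhead mechanism can be pictured as a scheduler that, on every update tick, fires all events whose time falls inside the window [now, now + lookAhead]. A hypothetical sketch (dueEvents is not Tone API):

```javascript
// Collect events due within the lookAhead window starting at `now`.
function dueEvents(events, now, lookAhead) {
  return events.filter((e) => e.time >= now && e.time <= now + lookAhead);
}
```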
property name
readonly name: string;
property rawContext
readonly rawContext: AnyAudioContext;
The unwrapped AudioContext or OfflineAudioContext
property sampleRate
readonly sampleRate: number;
The sample rate in samples per second of the AudioContext.
property state
readonly state: AudioContextState;
The current state of the AudioContext.
property transport
transport: Transport;
There is only one Transport per Context. It is created on initialization.
property updateInterval
updateInterval: number;
How often the interval callback is invoked. This number corresponds to how responsive the scheduling can be. Setting to 0 will result in the lowest practical interval based on context properties. context.updateInterval + context.lookAhead gives you the total latency between scheduling an event and hearing it.
method addAudioWorkletModule
addAudioWorkletModule: (url: string) => Promise<void>;
Add an AudioWorkletProcessor module
Parameter url
The url of the module
method clearInterval
clearInterval: (id: number) => this;
Clear the function scheduled by setInterval
method clearTimeout
clearTimeout: (id: number) => this;
Clears a previously scheduled timeout with Tone.context.setTimeout
Parameter id
The ID returned from setTimeout
method close
close: () => Promise<void>;
Close the context. Once closed, the context can no longer be used and any AudioNodes created from the context will be silent.
method createAnalyser
createAnalyser: () => AnalyserNode;
method createAudioWorkletNode
createAudioWorkletNode: ( name: string, options?: Partial<AudioWorkletNodeOptions>) => AudioWorkletNode;
Create an audio worklet node from a name and options. The module must first be loaded using addAudioWorkletModule.
method createBiquadFilter
createBiquadFilter: () => BiquadFilterNode;
method createBuffer
createBuffer: ( numberOfChannels: number, length: number, sampleRate: number) => AudioBuffer;
method createBufferSource
createBufferSource: () => AudioBufferSourceNode;
method createChannelMerger
createChannelMerger: (numberOfInputs?: number | undefined) => ChannelMergerNode;
method createChannelSplitter
createChannelSplitter: ( numberOfOutputs?: number | undefined) => ChannelSplitterNode;
method createConstantSource
createConstantSource: () => ConstantSourceNode;
method createConvolver
createConvolver: () => ConvolverNode;
method createDelay
createDelay: (maxDelayTime?: number | undefined) => DelayNode;
method createDynamicsCompressor
createDynamicsCompressor: () => DynamicsCompressorNode;
method createGain
createGain: () => GainNode;
method createIIRFilter
createIIRFilter: ( feedForward: number[] | Float32Array, feedback: number[] | Float32Array) => IIRFilterNode;
method createMediaElementSource
createMediaElementSource: ( element: HTMLMediaElement) => MediaElementAudioSourceNode;
method createMediaStreamDestination
createMediaStreamDestination: () => MediaStreamAudioDestinationNode;
method createMediaStreamSource
createMediaStreamSource: (stream: MediaStream) => MediaStreamAudioSourceNode;
method createOscillator
createOscillator: () => OscillatorNode;
method createPanner
createPanner: () => PannerNode;
method createPeriodicWave
createPeriodicWave: ( real: number[] | Float32Array, imag: number[] | Float32Array, constraints?: PeriodicWaveConstraints | undefined) => PeriodicWave;
method createStereoPanner
createStereoPanner: () => StereoPannerNode;
method createWaveShaper
createWaveShaper: () => WaveShaperNode;
method decodeAudioData
decodeAudioData: (audioData: ArrayBuffer) => Promise<AudioBuffer>;
method dispose
dispose: () => this;
Clean up. Also closes the audio context.
method getConstant
getConstant: (val: number) => AudioBufferSourceNode;
**Internal** Generate a looped buffer at some constant value.
method getDefaults
static getDefaults: () => ContextOptions;
method immediate
immediate: () => Seconds;
The current audio context time without the lookAhead. In most cases it is better to use now instead of immediate, since with now the lookAhead is applied equally to _all_ components, including internal ones, making sure that everything is scheduled in sync. Mixing now and immediate can cause some timing issues. If no lookAhead is desired, you can set the lookAhead to 0.
method now
now: () => Seconds;
The current audio context time plus a short lookAhead.
Example 1
setInterval(() => {
  console.log("now", Tone.now());
}, 100);
method resume
resume: () => Promise<void>;
Starts the audio context from a suspended state. This is required to initially start the AudioContext.
method setInterval
setInterval: (fn: (...args: any[]) => void, interval: Seconds) => number;
Adds a repeating event to the context's callback clock
method setTimeout
setTimeout: (fn: (...args: any[]) => void, timeout: Seconds) => number;
A setTimeout which is guaranteed by the clock source. Also runs in the offline context.
Parameter fn
The callback to invoke
Parameter timeout
The timeout in seconds
Returns
ID to use when invoking Context.clearTimeout
method workletsAreReady
protected workletsAreReady: () => Promise<void>;
Returns a promise which resolves when all of the worklets have been loaded on this context
class Convolver
class Convolver extends ToneAudioNode<ConvolverOptions> {}
Convolver is a wrapper around the Native Web Audio [ConvolverNode](http://webaudio.github.io/web-audio-api/#the-convolvernode-interface). Convolution is useful for reverb and filter emulation. Read more about convolution reverb on [Wikipedia](https://en.wikipedia.org/wiki/Convolution_reverb).
Example 1
// initializing the convolver with an impulse response
const convolver = new Tone.Convolver("./path/to/ir.wav").toDestination();
Component
constructor
constructor(url?: string | ToneAudioBuffer | AudioBuffer, onload?: () => void);
Parameter url
The URL of the impulse response or the ToneAudioBuffer containing the impulse response.
Parameter onload
The callback to invoke when the url is loaded.
constructor
constructor(options?: Partial<ConvolverOptions>);
property buffer
buffer: ToneAudioBuffer;
The convolver's buffer
property input
readonly input: Gain<'gain'>;
property name
readonly name: string;
property normalize
normalize: boolean;
The normalize property of the ConvolverNode interface is a boolean that controls whether the impulse response from the buffer is scaled by an equal-power normalization when the buffer attribute is set.
property output
readonly output: Gain<'gain'>;
method dispose
dispose: () => this;
method getDefaults
static getDefaults: () => ConvolverOptions;
method load
load: (url: string) => Promise<void>;
Load an impulse response url as an audio buffer. Decodes the audio asynchronously and invokes the callback once the audio buffer loads.
Parameter url
The url of the buffer to load. Filetype support depends on the browser.
class CrossFade
class CrossFade extends ToneAudioNode<CrossFadeOptions> {}
Tone.Crossfade provides equal power fading between two inputs. More on crossfading technique [here](https://en.wikipedia.org/wiki/Fade_(audio_engineering)#Crossfading).
[Signal-flow diagram omitted: the fade signal drives an equal-power gain pair, mixing input a and input b into the single output.]
Example 1
const crossFade = new Tone.CrossFade().toDestination();
// connect two inputs to a/b
const inputA = new Tone.Oscillator(440, "square").connect(crossFade.a).start();
const inputB = new Tone.Oscillator(440, "sine").connect(crossFade.b).start();
// use the fade to control the mix between the two
crossFade.fade.value = 0.5;
Component
constructor
constructor(fade?: number);
Parameter fade
The initial fade value [0, 1].
constructor
constructor(options?: Partial<CrossFadeOptions>);
property a
readonly a: Gain<'gain'>;
The input which is at full level when fade = 0
property b
readonly b: Gain<'gain'>;
The input which is at full level when fade = 1
property fade
readonly fade: Signal<'normalRange'>;
The mix between the two inputs. A fade value of 0 will output 100% crossFade.a and a value of 1 will output 100% crossFade.b.
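Equal-power fading keeps a^2 + b^2 constant across the fade, so the perceived loudness stays even. A common cosine/sine gain law as a sketch (Tone's exact curve may differ):

```javascript
// Equal-power gain pair for a fade position in [0, 1].
function equalPowerGains(fade) {
  return {
    a: Math.cos((fade * Math.PI) / 2), // full at fade = 0
    b: Math.sin((fade * Math.PI) / 2), // full at fade = 1
  };
}
```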
property input
readonly input: undefined;
CrossFade has no input; you must connect to either a or b.
property name
readonly name: string;
property output
readonly output: Gain<'gain'>;
The output is a mix between a and b at the ratio of fade.
method dispose
dispose: () => this;
method getDefaults
static getDefaults: () => CrossFadeOptions;
class DCMeter
class DCMeter extends MeterBase<DCMeterOptions> {}
DCMeter gets the raw value of the input signal at the current time.
Example 1
const meter = new Tone.DCMeter();
const mic = new Tone.UserMedia();
mic.open();
// connect mic to the meter
mic.connect(meter);
// the current level of the mic
const level = meter.getValue();
Component
constructor
constructor(options?: Partial<ToneWithContextOptions>);
property name
readonly name: string;
method getValue
getValue: () => number;
Get the signal value of the incoming signal
class Delay
class Delay extends ToneAudioNode<DelayOptions> {}
Wrapper around Web Audio's native [DelayNode](http://webaudio.github.io/web-audio-api/#the-delaynode-interface).
Core
Example 1
return Tone.Offline(() => {
  const delay = new Tone.Delay(0.1).toDestination();
  // connect the signal to both the delay and the destination
  const pulse = new Tone.PulseOscillator().connect(delay).toDestination();
  // start and stop the pulse
  pulse.start(0).stop(0.01);
}, 0.5, 1);
constructor
constructor(delayTime?: Time, maxDelay?: Time);
Parameter delayTime
The delay applied to the incoming signal.
Parameter maxDelay
The maximum delay time.
constructor
constructor(options?: Partial<DelayOptions>);
property delayTime
readonly delayTime: Param<'time'>;
The amount of time the incoming signal is delayed.
Example 1
const delay = new Tone.Delay().toDestination();
// modulate the delayTime between 0.1 and 1 seconds
const delayLFO = new Tone.LFO(0.5, 0.1, 1).start().connect(delay.delayTime);
const pulse = new Tone.PulseOscillator().connect(delay).start();
// the change in delayTime causes the pitch to go up and down
property input
readonly input: DelayNode;
property maxDelay
readonly maxDelay: number;
The maximum delay time. This cannot be changed after the value is passed into the constructor.
property name
readonly name: string;
property output
readonly output: DelayNode;
method dispose
dispose: () => this;
Clean up.
method getDefaults
static getDefaults: () => DelayOptions;
class Distortion
class Distortion extends Effect<DistortionOptions> {}
A simple distortion effect using Tone.WaveShaper. Algorithm from [this stackoverflow answer](http://stackoverflow.com/a/22313408). Read more about distortion on [Wikipedia](https://en.wikipedia.org/wiki/Distortion_(music)).
Example 1
const dist = new Tone.Distortion(0.8).toDestination();
const fm = new Tone.FMSynth().connect(dist);
fm.triggerAttackRelease("A1", "8n");
Effect
constructor
constructor(distortion?: number);
Parameter distortion
The amount of distortion (nominal range of 0-1)
constructor
constructor(options?: Partial<DistortionOptions>);
property distortion
distortion: number;
The amount of distortion. Nominal range is between 0 and 1.
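The linked stackoverflow answer builds a WaveShaper transfer curve along these lines (a sketch; Tone's exact scaling and table length may differ):

```javascript
// Distortion transfer curve per the stackoverflow algorithm: larger
// `amount` (0-1) bends the curve harder toward a square shape.
function distortionCurve(amount, len = 8192) {
  const k = amount * 100;
  const deg = Math.PI / 180;
  const curve = new Float32Array(len);
  for (let i = 0; i < len; i++) {
    const x = (i * 2) / len - 1; // map table index to [-1, 1)
    curve[i] = ((3 + k) * x * 20 * deg) / (Math.PI + k * Math.abs(x));
  }
  return curve;
}
```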
property name
readonly name: string;
property oversample
oversample: OverSampleType;
The oversampling of the effect. Can either be "none", "2x" or "4x".
method dispose
dispose: () => this;
method getDefaults
static getDefaults: () => DistortionOptions;
class DuoSynth
class DuoSynth extends Monophonic<DuoSynthOptions> {}
DuoSynth is a monophonic synth composed of two MonoSynths run in parallel with control over the frequency ratio between the two voices and vibrato effect.
Example 1
const duoSynth = new Tone.DuoSynth().toDestination();
duoSynth.triggerAttackRelease("C4", "2n");
Instrument
constructor
constructor(options?: RecursivePartial<DuoSynthOptions>);
property detune
readonly detune: Signal<'cents'>;
property frequency
readonly frequency: Signal<'frequency'>;
property harmonicity
harmonicity: Signal<'positive'>;
Harmonicity is the ratio between the two voices. A harmonicity of 1 is no change. Harmonicity = 2 means a change of an octave.
Example 1
const duoSynth = new Tone.DuoSynth().toDestination();
duoSynth.triggerAttackRelease("C4", "2n");
// pitch voice1 an octave below voice0
duoSynth.harmonicity.value = 0.5;
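Since harmonicity is a plain frequency ratio, the second voice's pitch follows directly from the first's (voice1Freq is a hypothetical helper, not Tone API):

```javascript
// voice1 frequency = voice0 frequency * harmonicity.
function voice1Freq(voice0Hz, harmonicity) {
  return voice0Hz * harmonicity;
}
```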
property name
readonly name: string;
property vibratoAmount
vibratoAmount: Param<'normalRange'>;
The amount of vibrato
property vibratoRate
vibratoRate: Signal<'frequency'>;
The vibrato frequency.
property voice0
readonly voice0: MonoSynth;
The first voice.
property voice1
readonly voice1: MonoSynth;
The second voice.
method dispose
dispose: () => this;
method getDefaults
static getDefaults: () => DuoSynthOptions;
method getLevelAtTime
getLevelAtTime: (time: Time) => NormalRange;
class Emitter
class Emitter<EventType extends string = string> extends Tone {}
Emitter gives classes which extend it the ability to listen for and emit events. Inspiration and reference from Jerome Etienne's [MicroEvent](https://github.com/jeromeetienne/microevent.js). MIT (c) 2011 Jerome Etienne.
Core