AudioProcessingEvent
Summary
This interface is a type of Event which is passed to the onaudioprocess event handler of a ScriptProcessorNode. The event handler processes audio from the input (if any) by reading the audio data from the inputBuffer attribute. The result of the processing (or the synthesized data, if there are no inputs) is then written into the outputBuffer.
Deprecated: this feature is a deletion candidate in the Web Audio specification. See http://webaudio.github.io/web-audio-api/.
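The following is a minimal sketch of a processing handler, assuming a browser AudioContext; the 4096-frame buffer size and the fixed 0.5 gain are illustrative choices, not values required by the API. Because this interface is deprecated, new code should generally prefer AudioWorkletNode.

const audioCtx = new AudioContext();
// 4096-frame buffers, 2 input channels, 2 output channels.
const processor = audioCtx.createScriptProcessor(4096, 2, 2);

processor.onaudioprocess = (event) => {
  const input = event.inputBuffer;   // audio arriving at the node
  const output = event.outputBuffer; // audio the node will emit

  for (let channel = 0; channel < output.numberOfChannels; channel++) {
    const inputData = input.getChannelData(channel);
    const outputData = output.getChannelData(channel);
    for (let i = 0; i < inputData.length; i++) {
      outputData[i] = inputData[i] * 0.5; // halve the volume
    }
  }
};

// A source node (for example, a MediaElementAudioSourceNode) would be
// connected to the processor, and the processor to the destination:
//   source.connect(processor);
//   processor.connect(audioCtx.destination);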
Properties
- inputBuffer
An AudioBuffer containing the input audio data. It will have a number of channels equal to the numberOfInputChannels parameter of the createScriptProcessor() method. This AudioBuffer is only valid while in the scope of the onaudioprocess function. Its values will be meaningless outside of this scope.
- node
The ScriptProcessorNode associated with this processing event.
- outputBuffer
An AudioBuffer where the output audio data should be written. It will have a number of channels equal to the numberOfOutputChannels parameter of the createScriptProcessor() method. Script code within the scope of the onaudioprocess function is expected to modify the Float32Array arrays representing channel data in this AudioBuffer. Any script modifications to this AudioBuffer outside of this scope will not produce any audible effects.
- playbackTime
The time when the audio will be played, in the same time coordinate system as AudioContext.currentTime. playbackTime allows for very tight synchronization between audio processed directly in JavaScript and the other events in the context's rendering graph; see the synthesis sketch after this list.
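As a second sketch, a node created with no input channels can synthesize its output directly, and playbackTime keeps the generated sine wave phase-continuous across callbacks. This assumes an implementation that accepts zero input channels; the 440 Hz frequency and 0.2 amplitude are arbitrary illustrative values.

const audioCtx = new AudioContext();
// 0 input channels: the node acts as a pure generator.
const synth = audioCtx.createScriptProcessor(4096, 0, 1);
const frequency = 440; // Hz

synth.onaudioprocess = (event) => {
  const output = event.outputBuffer.getChannelData(0);
  const sampleRate = event.outputBuffer.sampleRate;
  // playbackTime is when this block will be heard, in the same
  // coordinate system as AudioContext.currentTime.
  const startTime = event.playbackTime;
  for (let i = 0; i < output.length; i++) {
    const t = startTime + i / sampleRate;
    output[i] = 0.2 * Math.sin(2 * Math.PI * frequency * t);
  }
};

synth.connect(audioCtx.destination);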
Methods
No methods.
Events
No events.
Related specifications
- W3C Web Audio API
- W3C Editor’s Draft