📕 Web Audio Synthesis & Visualization → Snippets
Here you will find some 'recipes' and patterns that we'll be using during the workshop.
- Playing an Audio Tag
- Loading an Audio Buffer
- Playing an Audio Buffer
- Analysing Audio Waveform
- Analysing Audio Frequency
- Root Mean Squared Metering
- Indexing into the Frequency Array
- Disabling Builtin Play/Pause Controls
// Create <audio> tag
const audio = document.createElement("audio");
// set URL to the MP3 within your Glitch.com assets
audio.src = "path/to/music.mp3";
// To play audio through Glitch.com CDN
audio.crossOrigin = "Anonymous";
// Optional: enable looping so the audio never stops
audio.loop = true;
// Play audio
audio.play();
// If the tag is routed through an AudioContext (see the sketch below)
// and the context isn't already running, resume it
audioContext.resume();
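The audioContext referenced above is not created by the <audio> tag itself. A minimal sketch of one way to set it up, assuming you want to route the tag through the Web Audio graph (the variable names here are illustrative):
// Create (or re-use) a single AudioContext for the page
const audioContext = new AudioContext();
// Route the <audio> tag into the Web Audio graph
const source = audioContext.createMediaElementSource(audio);
// Connect it onward to the speakers
source.connect(audioContext.destination);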
let audioContext;
let audioBuffer;
async function loadSound() {
  // Re-use the same context if it exists
  if (!audioContext) {
    audioContext = new AudioContext();
  }
  // Re-use the audio buffer as a source
  if (!audioBuffer) {
    // Fetch MP3 from URL
    const resp = await fetch("path/to/music.mp3");
    // Turn into an array buffer of raw binary data
    const buf = await resp.arrayBuffer();
    // Decode the entire binary MP3 into an AudioBuffer
    audioBuffer = await audioContext.decodeAudioData(buf);
  }
}
This relies on the loadSound function described previously, since you can only play an audio buffer once it has been loaded and decoded asynchronously.
async function playSound() {
  // Ensure the buffer is loaded and decoded
  await loadSound();
  // Ensure the context is in a resumed state
  await audioContext.resume();
  // Now create a new "Buffer Source" node for playing AudioBuffers
  const source = audioContext.createBufferSource();
  // Connect the source to the destination (i.e. the speakers)
  source.connect(audioContext.destination);
  // Assign the loaded buffer
  source.buffer = audioBuffer;
  // Start playback (zero = play immediately)
  source.start(0);
}
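Note that most browsers only allow audio to start after a user gesture, so a typical pattern (a sketch, not part of the snippet above) is to trigger playback from a click:
window.addEventListener("click", () => {
  playSound();
});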
Browsers, by default, will play/pause <audio> elements in response to hardware media keys, and sometimes when you connect or disconnect Bluetooth headphones. In many apps, you may want to override this.
// just ignore this event
navigator.mediaSession.setActionHandler("pause", () => {});
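If you also want to swallow the matching "play" action (whether you need this depends on your app), the same pattern applies:
// Also ignore "play", so media keys can't restart playback
navigator.mediaSession.setActionHandler("play", () => {});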
let analyserNode;
let signalData;

function setupAudio() {
  /* ... create an AudioContext and an audio 'source' node ... */
  analyserNode = audioContext.createAnalyser();
  signalData = new Float32Array(analyserNode.fftSize);
  source.connect(analyserNode);
}

function draw() {
  analyserNode.getFloatTimeDomainData(signalData);
  /* now visualize ... */
}
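As a rough illustration (not part of the snippet above, and assuming a canvas element already exists on the page), the time-domain samples sit roughly in the -1..1 range and can be mapped directly to pixel coordinates:
const canvas = document.querySelector("canvas");
const context2d = canvas.getContext("2d");

function drawWaveform() {
  const { width, height } = canvas;
  context2d.clearRect(0, 0, width, height);
  context2d.beginPath();
  for (let i = 0; i < signalData.length; i++) {
    const x = (i / (signalData.length - 1)) * width;
    // Map each sample (roughly -1..1) to a vertical pixel position
    const y = (0.5 + signalData[i] * 0.5) * height;
    if (i === 0) context2d.moveTo(x, y);
    else context2d.lineTo(x, y);
  }
  context2d.stroke();
}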
let analyserNode;
let frequencyData;

function setupAudio() {
  /* ... create an AudioContext and an audio 'source' node ... */
  analyserNode = audioContext.createAnalyser();
  frequencyData = new Float32Array(analyserNode.frequencyBinCount);
  source.connect(analyserNode);
}

function draw() {
  analyserNode.getFloatFrequencyData(frequencyData);
  /* now visualize ... */
}
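Unlike the time-domain data, getFloatFrequencyData() fills the array with decibel values, which by default range between analyserNode.minDecibels (-100) and analyserNode.maxDecibels (-30). A small sketch for normalizing a value to 0..1 before drawing:
function normalizeDecibels(db) {
  const min = analyserNode.minDecibels;
  const max = analyserNode.maxDecibels;
  return Math.max(0, Math.min(1, (db - min) / (max - min)));
}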
Start with the Analysing Audio Waveform snippet, then pass its time-domain data into the following function to get a signal between 0 and 1.
function rootMeanSquaredSignal(data) {
  let rms = 0;
  for (let i = 0; i < data.length; i++) {
    rms += data[i] * data[i];
  }
  return Math.sqrt(rms / data.length);
}
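For example, combined with the waveform snippet above, you could compute a single 0..1 "loudness" value on each frame (a sketch; scaling a shape by it is just one idea):
function draw() {
  analyserNode.getFloatTimeDomainData(signalData);
  // One overall 0..1 level for this frame
  const level = rootMeanSquaredSignal(signalData);
  /* e.g. scale a shape by `level` ... */
}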
If you have an array that represents a list of frequency bins (i.e. where each index corresponds to a frequency band in Hz and each element holds its signal in dB), you can convert from Hz to an index and back like so:
// Convert a frequency in Hz to an index in the array
function frequencyToIndex(frequencyHz, sampleRate, frequencyBinCount) {
  const nyquist = sampleRate / 2;
  const index = Math.round((frequencyHz / nyquist) * frequencyBinCount);
  return Math.min(frequencyBinCount - 1, Math.max(0, index));
}
// Convert an index in the array to a frequency in Hz
function indexToFrequency(index, sampleRate, frequencyBinCount) {
  return (index * sampleRate) / (frequencyBinCount * 2);
}
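For example, combined with the frequency analysis snippet above (440 Hz is just an arbitrary choice), you can look up the signal of a single band:
// Find the bin closest to 440 Hz and read its decibel value
const index = frequencyToIndex(
  440,
  audioContext.sampleRate,
  analyserNode.frequencyBinCount
);
const decibels = frequencyData[index];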