Introduction
The Web Audio API is one of the most powerful browser APIs. It lets you analyze, process, and generate audio in real time. Combined with Canvas, it opens the door to spectacular audio visualizations directly in the browser.
In this tutorial, we will build three different types of visualizations: a frequency-bar spectrum, an oscilloscope waveform, and a reactive radial circle. Each example comes with complete code and an interactive demo.
The Web Audio API requires a user interaction (click) before it can start, in accordance with the autoplay policies of modern browsers.
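In practice, the browser creates the AudioContext in the "suspended" state until a user gesture occurs. Here is a minimal sketch of resuming it on the first click; the helper name attachResumeOnClick is our own, and the context and button are passed in as parameters so the logic stays easy to test:

```javascript
// Resume a suspended AudioContext on the user's first click.
// attachResumeOnClick is our own helper name, not a Web Audio API call.
function attachResumeOnClick(audioContext, clickTarget) {
  clickTarget.addEventListener('click', () => {
    // resume() is a no-op if the context is already running
    if (audioContext.state === 'suspended') {
      audioContext.resume();
    }
  }, { once: true });
}
```

In a real page you would call it with your AudioContext and, for example, a "Start" button element.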
1. Basic setup
Before creating our visualizations, we need to configure the audio context and analyzer. The AnalyserNode is the heart of any audio visualization: it extracts frequency and waveform data in real time.
// Create the audio context and analyser
const audioContext = new AudioContext();
const analyser = audioContext.createAnalyser();

// fftSize determines the analysis resolution:
// a higher value gives a finer frequency breakdown
analyser.fftSize = 256;

// Connect the microphone
navigator.mediaDevices.getUserMedia({ audio: true })
  .then(stream => {
    const source = audioContext.createMediaStreamSource(stream);
    source.connect(analyser);
    visualize(); // one of the draw functions shown below
  })
  .catch(err => console.error('Microphone access denied:', err));
Key AnalyserNode parameters
- fftSize: FFT (Fast Fourier Transform) window size. Must be a power of two between 32 and 32768. Determines the number of frequency bars (fftSize / 2)
- frequencyBinCount: Number of available data points (= fftSize / 2)
- smoothingTimeConstant: Temporal smoothing between 0 and 1 (default: 0.8). The higher the value, the smoother the transitions between frames
- minDecibels / maxDecibels: Decibel range for the visualization
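Assigning an invalid fftSize throws an error, so it can be worth validating the value before applying it. A small sketch; the helper isValidFftSize is our own, not part of the Web Audio API:

```javascript
// Check that a candidate fftSize is acceptable for an AnalyserNode:
// it must be a power of two between 32 and 32768 inclusive.
// isValidFftSize is our own helper name.
function isValidFftSize(n) {
  return Number.isInteger(n)
    && n >= 32
    && n <= 32768
    && (n & (n - 1)) === 0; // power-of-two test
}
```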
2. Frequency bars
The frequency spectrum is the most classic audio visualization. Each bar represents the amplitude of a specific frequency band, from bass (left) to treble (right).
function drawBars(analyser, canvas) {
  const ctx = canvas.getContext('2d');
  const bufferLength = analyser.frequencyBinCount;
  const dataArray = new Uint8Array(bufferLength);

  function draw() {
    // Get frequency data (0–255 per bin)
    analyser.getByteFrequencyData(dataArray);

    // Clear the canvas
    ctx.fillStyle = '#0a0a0f';
    ctx.fillRect(0, 0, canvas.width, canvas.height);

    const barWidth = canvas.width / bufferLength;

    dataArray.forEach((value, i) => {
      const barHeight = value / 1.5;

      // Color gradient based on frequency
      const hue = (i / bufferLength) * 270 + 240;
      ctx.fillStyle = `hsl(${hue}, 70%, 50%)`;

      ctx.fillRect(
        i * barWidth,
        canvas.height - barHeight,
        barWidth - 1,
        barHeight
      );
    });

    requestAnimationFrame(draw);
  }

  draw();
}
How it works
The getByteFrequencyData() method fills an array with the amplitude of each frequency band, on a scale of 0 to 255. Here are the details:
- requestAnimationFrame: Synchronizes rendering with the screen refresh rate (~60fps)
- barWidth: Each bar occupies an equal fraction of the canvas width
- HSL color: The hue varies with position to create a natural color gradient
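If you need to know which frequency a given bar represents, each of the fftSize / 2 bins covers sampleRate / fftSize Hz. A quick sketch; binToFrequency is our own helper name, and sampleRate would come from audioContext.sampleRate in a real page:

```javascript
// Map a frequency-bin index to the frequency (in Hz) it represents.
// Each of the fftSize / 2 bins spans sampleRate / fftSize Hz.
// binToFrequency is our own helper name.
function binToFrequency(binIndex, sampleRate, fftSize) {
  return binIndex * (sampleRate / fftSize);
}
// With a 48000 Hz sample rate and fftSize = 256,
// bin 10 corresponds to 10 * (48000 / 256) = 1875 Hz
```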
3. Waveform (Oscilloscope)
The waveform visualization displays the raw audio signal, like an oscilloscope. It's ideal for seeing the temporal structure of sound.
function drawWaveform(analyser, canvas) {
  const ctx = canvas.getContext('2d');
  const bufferLength = analyser.frequencyBinCount;
  const dataArray = new Uint8Array(bufferLength);

  function draw() {
    // Waveform data (time domain, not frequency)
    analyser.getByteTimeDomainData(dataArray);

    ctx.fillStyle = '#0a0a0f';
    ctx.fillRect(0, 0, canvas.width, canvas.height);

    // Draw the curve
    ctx.lineWidth = 2;
    ctx.strokeStyle = '#6366f1';
    ctx.beginPath();

    const sliceWidth = canvas.width / bufferLength;

    dataArray.forEach((value, i) => {
      const x = i * sliceWidth;
      // 128 is the zero line: silence draws a flat line at mid-height
      const y = (value / 128) * (canvas.height / 2);
      i === 0 ? ctx.moveTo(x, y) : ctx.lineTo(x, y);
    });

    ctx.stroke();
    requestAnimationFrame(draw);
  }

  draw();
}
getByteFrequencyData() returns the frequency spectrum (frequency domain), while getByteTimeDomainData() returns the raw waveform (time domain). Both use the same AnalyserNode.
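The time-domain bytes are centered on 128, which represents silence; that is why the drawing code divides by 128. A tiny helper (our own naming) that converts one byte into a normalized sample in [-1, 1]:

```javascript
// Convert a byte from getByteTimeDomainData (0–255, silence at 128)
// into a normalized audio sample in the range [-1, 1].
// byteToSample is our own helper name.
function byteToSample(value) {
  return value / 128 - 1;
}
// byteToSample(128) → 0 (silence)
// byteToSample(0)   → -1 (maximum negative amplitude)
```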
4. Radial circle
The radial circle is a more artistic visualization where frequency bars are arranged in a circle. It's a very popular effect in music players and interactive experiences.
function drawRadial(analyser, canvas) {
  const ctx = canvas.getContext('2d');
  const bufferLength = analyser.frequencyBinCount;
  const dataArray = new Uint8Array(bufferLength);
  const centerX = canvas.width / 2;
  const centerY = canvas.height / 2;
  const radius = 60;

  function draw() {
    analyser.getByteFrequencyData(dataArray);

    // Translucent clear leaves a subtle motion trail
    ctx.fillStyle = 'rgba(10, 10, 15, 0.2)';
    ctx.fillRect(0, 0, canvas.width, canvas.height);

    for (let i = 0; i < bufferLength; i++) {
      const angle = (i / bufferLength) * Math.PI * 2;
      const barHeight = dataArray[i] / 3;
      const hue = (i / bufferLength) * 360;

      // Bar start (on the circle) and end (pushed outward by amplitude)
      const x1 = centerX + Math.cos(angle) * radius;
      const y1 = centerY + Math.sin(angle) * radius;
      const x2 = centerX + Math.cos(angle) * (radius + barHeight);
      const y2 = centerY + Math.sin(angle) * (radius + barHeight);

      ctx.strokeStyle = `hsl(${hue}, 70%, 60%)`;
      ctx.lineWidth = 2;
      ctx.beginPath();
      ctx.moveTo(x1, y1);
      ctx.lineTo(x2, y2);
      ctx.stroke();
    }

    requestAnimationFrame(draw);
  }

  draw();
}
The radial visualization is more costly to render. On mobile, reduce the fftSize to 128 and limit the number of drawn bars to maintain a smooth framerate.
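One simple way to limit the number of drawn bars is to skip bins: draw every Nth bin so the total never exceeds a budget. A sketch with our own helper name barStep; in the loop above you would then write for (let i = 0; i < bufferLength; i += step):

```javascript
// Compute a stride so that drawing every Nth bin renders
// at most maxBars bars. barStep is our own helper name.
function barStep(bufferLength, maxBars) {
  return Math.max(1, Math.ceil(bufferLength / maxBars));
}
// barStep(128, 64) → 2: draw every second bin
// barStep(64, 64)  → 1: draw every bin
```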
5. Using an audio file
Rather than the microphone, you can also visualize an audio file or an <audio> element. This is often more convenient for sites with an integrated music player.
// Use an existing <audio> element
const audio = document.querySelector('audio');
const audioContext = new AudioContext();
const analyser = audioContext.createAnalyser();

// Create a source from the audio element
const source = audioContext.createMediaElementSource(audio);
source.connect(analyser);

// Important: also connect to the destination to hear the audio
analyser.connect(audioContext.destination);

// Start the visualization when the audio plays
audio.addEventListener('play', () => {
  audioContext.resume();
  drawBars(analyser, canvas);
});
When using createMediaElementSource(), don't forget to connect the analyzer to audioContext.destination, otherwise the audio will be muted! With the microphone, this connection is not necessary because we don't want audio feedback.
Best practices
Here are the essential recommendations for creating performant and accessible audio visualizers:
Performance
- Reduce fftSize on mobile: 128 or 64 are sufficient for smooth rendering
- Use requestAnimationFrame rather than setInterval to synchronize with the screen
- Avoid allocations in the loop: create your arrays outside the draw function
- Clean up resources: close the AudioContext and stop the microphone stream when the user leaves the page
Accessibility and UX
- Request permission: microphone access must be triggered by a user click
- Handle errors: plan for the case where the user refuses microphone access
- Respect prefers-reduced-motion: offer a static version for sensitive users
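The three recommendations above can be combined into a single entry point. This is only a sketch with our own names (startMicVisualizer, onReady, onError); the navigator and matchMedia objects are passed in as parameters so the logic can be tested outside a browser:

```javascript
// Start the microphone with error handling and a reduced-motion check.
// startMicVisualizer, onReady, and onError are our own names.
// In a browser, call it as:
//   startMicVisualizer(navigator, window.matchMedia, drawBarsSetup, showError)
async function startMicVisualizer(nav, matchMediaFn, onReady, onError) {
  // Respect prefers-reduced-motion: show a static fallback instead
  if (matchMediaFn('(prefers-reduced-motion: reduce)').matches) {
    return 'static';
  }
  try {
    // In a real page, this must be triggered by a user click
    const stream = await nav.mediaDevices.getUserMedia({ audio: true });
    onReady(stream);
    return 'running';
  } catch (err) {
    onError(err); // e.g. the user refused microphone access
    return 'error';
  }
}
```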
Resource cleanup
// Properly stop the visualizer
function stopVisualizer(stream, audioContext, animationId) {
  // Stop the microphone
  stream.getTracks().forEach(track => track.stop());

  // Close the audio context
  audioContext.close();

  // Stop the animation loop
  cancelAnimationFrame(animationId);
}
Conclusion
The Web Audio API opens a world of creative possibilities. By combining frequency analysis with Canvas, you can create unique visualizations that react to sound in real time.
The three types of visualization we've seen (bars, waveform, radial circle) are just the beginning. You can combine them, add particle effects, or even use WebGL for 3D renders.
Discover our interactive audio-based effects in our effects library, with one-click copyable code.