You probably reached this documentation because you're developing an app
that uses the Web Audio API and have experienced unexpected glitches, such
as popping noises, in the output. You might already be involved in a
crbug.com discussion, and a Chrome engineer has asked you to upload
“tracing data”. This guide shows you how to obtain that data so you can
help engineers triage, and eventually fix, the issue.
There are two tools that will help you profile Web Audio:
chrome://tracing and the WebAudio tab in Chrome DevTools.
When do you use chrome://tracing?
When mysterious “glitches” happen. Profiling the app with the tracing tool
gives you insights into:
- Time slices spent by specific function calls on different threads
- Audio callback timing in the timeline view
The timeline usually reveals missed deadlines or long garbage collection
pauses that can cause unexpected audio glitches. This information is useful
for triaging a bug. Chromium engineers will ask for tracing data when a local
reproduction of the issue is not feasible. See The Trace Event Profiling Tool
for general instructions on how to trace.
When do you use the WebAudio tab?
When you want to get a feel for how the application performs in the real world.
DevTools shows you a running estimate of render capacity, which indicates
how the web audio rendering engine is handling render tasks within a given
render budget (for example, approximately 2.67 ms at 48 kHz). If the capacity
approaches 100%, your app is likely to produce glitches because the
renderer is failing to finish its work within the render budget.
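That render budget follows from the render quantum: the engine processes audio in blocks of 128 frames, so at a 48 kHz sample rate each callback must finish within 128 / 48000 seconds. A quick sketch of the arithmetic:

```javascript
// The web audio renderer processes audio in render quanta of 128 frames.
const renderQuantumFrames = 128;
const sampleRateHz = 48000;

// The render budget is the wall-clock time one quantum represents.
const renderBudgetMs = (renderQuantumFrames / sampleRateHz) * 1000;
console.log(renderBudgetMs.toFixed(2)); // "2.67"
```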
How to capture tracing data
The instructions below are written for Chrome 80 and later.
For best results, close all other tabs and windows, and disable extensions.
Alternatively, you can launch a new instance of Chrome, or use builds from
other release channels (e.g. Beta or Canary). Once you have the browser
ready, follow these steps:
1. Open your application (web page) in a tab.
2. Open another tab and go to chrome://tracing.
3. Press the Record button and select Manually select settings.
4. Press the None buttons on both the Record Categories and
   Disabled by Default Categories sections.
5. In the Record Categories section, select the following:
   - v8.execute (if you're interested in AudioWorklet JS code performance)
6. In the Disabled by Default Categories section, select the following:
   - audio-worklet (if you're interested in where the AudioWorklet thread starts)
   - webaudio.audionode (if you need a detailed trace for each AudioNode)
7. Press the Record button at the bottom.
8. Go back to your application tab and repeat the steps that triggered the issue.
9. When you have enough trace data, go back to the tracing tab and press Stop.
10. The tracing tab will visualize the result.
11. Press Save to save the tracing data.
How to analyze tracing data
The tracing data visualizes how Chrome’s web audio engine renders the audio.
The renderer has two different render modes: Native mode and
Worklet mode. Each mode uses a different threading model, so the tracing
results also differ.
In Native mode, the AudioOutputDevice thread runs all the web audio code.
The AudioOutputDevice is a real-time priority thread originating from the
browser's Audio Service, driven by the audio hardware clock. If you see
irregularity in the trace data in this lane, the callback timing from the
device may be jittery. The combination of Linux and Pulse Audio is known to
have this problem. See Chromium issue #825823 for more details.
In Worklet mode, which is characterized by a thread jump from the
AudioOutputDevice thread to the AudioWorklet thread, you should see
well-aligned traces in the two thread lanes as shown below. When the worklet
is activated, all the web audio operations are rendered by the AudioWorklet
thread. This thread currently does not have real-time priority. The most
common irregularity here is a large block caused by garbage collection or by
missed render deadlines. Both lead to glitches in the audio stream.
In both cases, the ideal tracing data is characterized by well-aligned audio
device callback invocations and render tasks completed well within the
given render budget. The two screenshots above are great examples of the
ideal tracing result.
Learning from real-world examples
Example 1: Render tasks going beyond render budget
The screenshot below (Chromium issue #796330) is a typical example of code
in AudioWorkletProcessor taking too long and going beyond the given render
budget. The callback timing is well behaved, but the audio processing
function call of the Web Audio API fails to complete the work before the
next device callback.
Possible solutions for this problem:
- Reduce the workload of the audio graph by using fewer AudioNode objects.
- Reduce the workload of your code in the AudioWorkletProcessor.
- Increase the base latency of the AudioContext.
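To see why raising the base latency helps, compare how many render quanta of headroom different latencies buy. The latency values below are hypothetical (actual baseLatency figures are platform dependent; the latencyHint option of the AudioContext constructor is how you would request them in a page):

```javascript
const sampleRateHz = 48000;
const renderQuantumFrames = 128;

// Hypothetical base latencies; actual values are platform dependent.
const interactiveLatencyS = 0.01; // typical of { latencyHint: 'interactive' }
const playbackLatencyS = 0.1;     // e.g. new AudioContext({ latencyHint: 'playback' })

// How many 128-frame render quanta fit in the latency buffer.
const quantaOfHeadroom = (latencyS) =>
  Math.floor((latencyS * sampleRateHz) / renderQuantumFrames);

console.log(quantaOfHeadroom(interactiveLatencyS)); // 3
console.log(quantaOfHeadroom(playbackLatencyS));    // 37
```

A larger buffer means a single slow callback can be absorbed without an audible dropout, at the cost of higher output latency.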
Example 2: Significant garbage collection on the worklet thread
Unlike on the native audio rendering thread, garbage collection is managed
on the worklet thread. That means that if your code allocates or frees
memory (e.g. creates new arrays), it eventually triggers a garbage
collection pass that synchronously blocks the thread. If the combined
workload of the web audio operations and the garbage collection is bigger
than the given render budget, the result is glitches in the audio stream.
The following screenshot is an extreme example of this problem: the
AudioWorkletProcessor implementation generates new Float32Array instances
for the input and output buffers on every audio processing callback. This
also slowly builds up memory usage over time. The team has a plan to improve
the design once the related specification is finalized.
Possible solutions:
- Allocate memory up front and reuse it whenever possible.
- Use design patterns based on SharedArrayBuffer. Although this is not a
  perfect solution, several web audio apps use a similar pattern with
  SharedArrayBuffer to run intensive audio code.
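The “allocate up front” advice can be sketched with plain Float32Arrays. In a real AudioWorkletProcessor the preallocated buffer would be created once in the constructor and reused from every process() call; the function names here are hypothetical:

```javascript
// Anti-pattern: allocating in the per-callback hot path creates GC garbage.
function processAllocating(input) {
  const scratch = new Float32Array(input.length); // fresh allocation every call
  for (let i = 0; i < input.length; i++) scratch[i] = input[i] * 0.5;
  return scratch;
}

// Preferred: allocate once (render quantum size) and reuse on every callback.
const reusedScratch = new Float32Array(128);
function processReusing(input) {
  for (let i = 0; i < input.length; i++) reusedScratch[i] = input[i] * 0.5;
  return reusedScratch.subarray(0, input.length); // a view, not a new buffer
}
```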
Example 3: Jittery audio device callback from Linux Pulse Audio
Precise audio callback timing is the most important thing for web audio;
the callback should be driven by the most precise clock in your system. If
the operating system or its audio subsystem cannot guarantee solid callback
timing, all subsequent operations will be affected. The following image is
an example of jittery audio callbacks: compared to the previous two images,
the interval between each callback varies significantly. This is a known
issue on Linux, which uses Pulse Audio as its audio backend, and it is
still under investigation (Chromium issue #825823).
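Jitter like this shows up as the variance of successive callback intervals. A small sketch of the statistic, using made-up timestamps (in a steady stream every interval is close to the 2.67 ms budget):

```javascript
// Callback timestamps in ms; a steady device fires roughly every 2.67 ms.
const steadyTimestamps = [0, 2.67, 5.33, 8.0, 10.67];
const jitteryTimestamps = [0, 1.9, 6.1, 7.8, 11.2];

function intervalStats(timestamps) {
  const intervals = [];
  for (let i = 1; i < timestamps.length; i++) {
    intervals.push(timestamps[i] - timestamps[i - 1]);
  }
  const mean = intervals.reduce((sum, x) => sum + x, 0) / intervals.length;
  const variance =
    intervals.reduce((sum, x) => sum + (x - mean) ** 2, 0) / intervals.length;
  return { mean, variance };
}

// A jittery device has a similar mean but a much larger variance.
console.log(intervalStats(steadyTimestamps).variance.toFixed(4));
console.log(intervalStats(jitteryTimestamps).variance.toFixed(4));
```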
How to use the WebAudio tab in DevTools
You can also use the DevTools tab specifically designed for web audio. It
is less comprehensive than the tracing tool, but it is useful when you want
to gauge the running performance of your application. Access the panel by
opening the Main Menu of DevTools, then going to More tools > WebAudio.
This tab shows information about running instances of BaseAudioContext.
Use it to see how the web audio renderer is performing on the page.
Since a page can have multiple BaseAudioContext instances, the Context
Selector (the drop-down menu that says realtime (4e1073) in the last
screenshot) allows you to choose the one you want to inspect. The inspector
view shows the properties (e.g. sample rate, buffer size, channel count, and
context state) of the BaseAudioContext instance that you select, and it
updates dynamically when the properties change.
The most useful thing in this view is the status bar at the bottom. It is
only active when the selected BaseAudioContext is an AudioContext, which
runs in real time. This bar shows the instantaneous audio stream quality of
the AudioContext, is updated every second, and provides the following
information:
- Callback interval (ms): Displays the weighted mean and variance of the
  callback interval. Ideally the mean should be stable and the variance
  should be close to zero. Otherwise the operating system's audio
  infrastructure might have problems in deeper layers.
- Render Capacity (percent): Follows this formula: (time spent in actual
  rendering / instantaneous callback interval) × 100. When the capacity
  gets close to 100 percent, the renderer is doing too much work for the
  given render budget, so you should consider doing less in your web audio
  code.
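The render capacity formula above translates directly to code (the numbers are illustrative):

```javascript
// Render capacity: time spent rendering divided by the callback interval.
function renderCapacityPercent(renderTimeMs, callbackIntervalMs) {
  return (renderTimeMs / callbackIntervalMs) * 100;
}

// 2.4 ms of work in a 2.67 ms callback interval is ~90%: close to glitching.
console.log(renderCapacityPercent(2.4, 2.67).toFixed(0)); // "90"
```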
You can manually trigger garbage collection by clicking the trash can icon.
Debugging audio is hard. Debugging audio in the browser is even harder.
However, these tools can ease the pain by providing useful insights into
how your web audio code performs. In some cases you may find that web audio
does not behave as it should; do not be afraid to file a bug on the
Chromium Bug Tracker. While filling out the information, follow the
guidelines above and submit the tracing data you captured along with a
reproducible test case. With this data, Chrome engineers will be able to
fix your bug much faster.