How I got WebGL working in LibreWolf on an RTX 5070 Ti running CachyOS. What started as "why are my fans so loud watching YouTube" turned into a multi-session investigation that hit multiple layers of GPU configuration, a Wayland vs X11 decision, and separate WebGL blockers that all needed fixing.
Environment
| Component | Value |
|---|---|
| Distro | CachyOS (Arch-based, rolling) |
| Kernel | 7.0.2-2-cachyos |
| GPU | NVIDIA GeForce RTX 5070 Ti |
| Driver | 595.71 |
| Display server | Wayland (KDE Plasma) |
| Browser | LibreWolf 150.0-1 |
The symptom: fans screaming at 4K
I was watching 4K YouTube videos and the PC fans ramped up way more than they should. Fired up CoolerControl and confirmed it — the CPU was getting hammered. My RTX 5070 Ti was sitting there doing almost nothing while the CPU was decoding video in software.
My first instinct was that the browser had hardware acceleration off. That would explain everything — video decode piling onto the CPU, no GPU compositing, the works.
Opened about:support in LibreWolf and scrolled to the Graphics section. The
picture was bad. The browser was barely engaging the GPU. WebRender was
technically listed but hardware video decoding wasn't meaningfully active. The
GPU process existed but wasn't doing the heavy lifting. For all practical
purposes, the browser was running in software mode on a machine with a perfectly
good GPU.
Why "just turn on hardware acceleration" isn't simple
This is the part that makes the whole problem non-obvious. Getting a Firefox-based browser to actually use an NVIDIA GPU on Linux isn't one toggle. It's a chain of things, and if any link breaks, the browser silently falls back to software rendering.
Here's what the chain looks like:
1. VA-API driver
NVIDIA doesn't ship a VA-API driver. Firefox (and LibreWolf) use VA-API for
hardware video decoding on Linux, so a bridge is needed. That bridge is
libva-nvidia-driver (in the Arch/CachyOS repos — this is the same package as
nvidia-vaapi-driver, just renamed). Without it, hardware video decoding is
simply unavailable.
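A quick way to confirm the bridge is actually on disk is to check for the shared object libva loads: with LIBVA_DRIVER_NAME=nvidia, libva looks for nvidia_drv_video.so in its driver directory. This is a sketch, not part of my setup — the helper name is mine, and /usr/lib/dri is the usual Arch location (other distros put it elsewhere):

```shell
#!/usr/bin/env bash
# Sketch: check whether the NVIDIA VA-API bridge is installed where libva
# will look for it. libva loads "${LIBVA_DRIVER_NAME}_drv_video.so" from
# its driver directory (/usr/lib/dri on Arch; adjust for other distros).
check_nvd_driver() {
    local dri_dir="${1:-/usr/lib/dri}"
    if [ -e "$dri_dir/nvidia_drv_video.so" ]; then
        echo "nvidia VA-API driver present"
    else
        echo "missing - install libva-nvidia-driver"
    fi
}

check_nvd_driver
```

With the package installed, running vainfo (from libva-utils) with the environment variables from the next step should list NVDEC-backed decode profiles instead of erroring out.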
2. Environment variables
The VA-API driver needs environment variables set before the browser launches.
These aren't toggleable in about:config. They're system-level:
| Variable | Purpose |
|---|---|
| `LIBVA_DRIVER_NAME=nvidia` | Tells libva to use the NVIDIA backend instead of looking for a default driver |
| `NVD_BACKEND=direct` | Tells nvidia-vaapi-driver to use direct rendering (bypasses VDPAU, which has its own issues) |
| `MOZ_DISABLE_RDD_SANDBOX=1` | Firefox's RDD (Remote Data Decoder) sandbox isolates the video decoder in a way that can block VA-API access. This disables that sandbox. |
If these aren't set, the browser finds no VA-API driver and falls back to software decode. No error, no warning. It just uses the CPU.
I set these via systemd user environment in
~/.config/environment.d/librewolf-hwdec.conf. This ensures they're available
to every process in the session — browser, terminal, launcher script,
everything. See
systemd.environment-generator(7).
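For reference, a minimal version of that file — the filename is arbitrary; any `*.conf` under ~/.config/environment.d/ is picked up by the systemd user manager at session start:

```ini
# ~/.config/environment.d/librewolf-hwdec.conf
LIBVA_DRIVER_NAME=nvidia
NVD_BACKEND=direct
MOZ_DISABLE_RDD_SANDBOX=1
```

Log out and back in for it to take effect; afterwards, `systemctl --user show-environment` should list all three variables.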
3. Firefox preferences
Even with the driver and env vars, Firefox has its own set of prefs that need to be enabled:
```js
pref('media.ffmpeg.vaapi.enabled', true);                  // enable VA-API
pref('media.hardware-video-decoding.enabled', true);       // enable HW video decoding
pref('media.hardware-video-decoding.force-enabled', true); // force it on
pref('media.gpu-process-decoder', true);                   // use GPU process for decoding
```
4. GPU process / WebRender
The GPU compositor itself needs to be active:
```js
pref('gfx.webrender.all', true);
pref('gfx.webrender.software', false);
pref('layers.acceleration.force-enabled', true);
pref('layers.gpu-process.enabled', true);
```
5. No GPU process block
On Wayland, the GPU process can be blocked by a FEATURE_FAILURE_WAYLAND entry
in Firefox's internal blocklist. On X11, this particular block doesn't apply.
6. WebGL (separate problem)
Even if everything above works, WebGL is independently blocked by two more systems (covered in the root cause section below).
The point is: there are at least six independent things that all need to be right. If any one of them is wrong, the browser falls back to software rendering. And it does this silently. No dialog, no warning. The fans just spin up.
The dual-mode launcher
To figure out which link in the chain was broken, I built a launcher script that could start LibreWolf in two modes — full GPU acceleration or pure CPU software rendering — sharing the same profile, bookmarks, and extensions. The idea was to toggle everything at once and compare behavior.
If GPU mode and CPU mode looked the same, the GPU path wasn't being reached. If GPU mode was better, I could start narrowing down which pref or env var made the difference.
Launcher script (~/.local/bin/librewolf-launcher)
```bash
#!/usr/bin/env bash

# 1. Dynamically find the default profile
CONF_DIR="$HOME/.config/librewolf/librewolf"
if [ ! -d "$CONF_DIR" ]; then
    CONF_DIR="$HOME/.librewolf"
fi

# Find the default profile folder (installs.ini is the most reliable source)
DEFAULT_PROFILE=$(awk -F= '/^Default=/{print $2}' "$CONF_DIR/installs.ini" 2>/dev/null | head -n 1)
if [ -z "$DEFAULT_PROFILE" ]; then
    DEFAULT_PROFILE=$(grep -B 3 "Default=1" "$CONF_DIR/profiles.ini" | grep "Path=" | cut -d= -f2 | head -n 1)
fi

PROFILE_DIR="$CONF_DIR/$DEFAULT_PROFILE"
USER_JS="$PROFILE_DIR/user.js"
FLAG_FILE="/tmp/librewolf-runtime-mode-$USER"

REQUESTED_MODE=$1
shift

# 2. Check if already running
if pgrep -x "librewolf" > /dev/null; then
    CURRENT_MODE=$(cat "$FLAG_FILE" 2>/dev/null)
    if [ "$REQUESTED_MODE" != "$CURRENT_MODE" ] && [ -n "$CURRENT_MODE" ]; then
        if command -v notify-send >/dev/null; then
            notify-send "LibreWolf" "Already running in ${CURRENT_MODE} mode. Opening tab."
        fi
    fi
    exec librewolf "$@"
fi

# 3. Apply settings and launch
echo "$REQUESTED_MODE" > "$FLAG_FILE"
TEMP_JS=$(mktemp)

if [ "$REQUESTED_MODE" = "gpu" ]; then
    cat << 'innerEOF' > "$TEMP_JS"
// LibreWolf GPU Max Profile
user_pref("gfx.webrender.all", true);
user_pref("gfx.webrender.software", false);
user_pref("layers.acceleration.disabled", false);
user_pref("layers.acceleration.force-enabled", true);
user_pref("media.ffmpeg.vaapi.enabled", true);
user_pref("media.hardware-video-decoding.force-enabled", true);
user_pref("webgl.disabled", false);
innerEOF
    # Env vars (LIBVA_DRIVER_NAME, NVD_BACKEND, MOZ_DISABLE_RDD_SANDBOX)
    # are set system-wide via ~/.config/environment.d/librewolf-hwdec.conf
else
    # CPU Only Mode
    cat << 'innerEOF' > "$TEMP_JS"
// LibreWolf CPU Only Profile
user_pref("gfx.webrender.software", true);
user_pref("gfx.webrender.force-disabled", true);
user_pref("layers.acceleration.disabled", true);
user_pref("webgl.disabled", true);
user_pref("dom.webgpu.enabled", false);
user_pref("layers.gpu-process.enabled", false);
innerEOF
    export MOZ_DISABLE_GPU_PROCESS=1
    export LIBGL_ALWAYS_SOFTWARE=1
    export GALLIUM_DRIVER=llvmpipe
fi

mv "$TEMP_JS" "$USER_JS"
exec librewolf "$@"
```
The script does three things:
- Finds the active LibreWolf profile dynamically
- Writes a `user.js` with the appropriate prefs for the requested mode
- Exports the environment variables the browser needs before launching it (CPU mode only — GPU mode relies on environment.d, see below)
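The profile lookup in the first step can be sketched in isolation. Given an installs.ini of the shape LibreWolf writes — a section header followed by a `Default=` line naming the profile directory — the awk one-liner pulls out that directory name (the function name and example contents here are illustrative):

```shell
#!/usr/bin/env bash
# Sketch of the launcher's profile detection, isolated for clarity.
# installs.ini shape (example content is illustrative):
#   [F00DB4BE12345678]
#   Default=abc123.default-default
#   Locked=1
find_default_profile() {
    awk -F= '/^Default=/{print $2}' "$1" 2>/dev/null | head -n 1
}
```

The profiles.ini fallback in the script exists because installs.ini only appears once the browser has registered an install; on a fresh profile directory the awk lookup comes back empty.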
GPU mode's env vars (LIBVA_DRIVER_NAME=nvidia, NVD_BACKEND=direct,
MOZ_DISABLE_RDD_SANDBOX=1) are set system-wide via
~/.config/environment.d/librewolf-hwdec.conf rather than exported by the
script. The GPU block in the launcher only writes user.js prefs — the env vars
are already in the environment by the time the script runs. CPU mode sets
MOZ_DISABLE_GPU_PROCESS=1, LIBGL_ALWAYS_SOFTWARE=1, and
GALLIUM_DRIVER=llvmpipe directly in the script to force software rendering
through LLVMpipe, since those are only needed for the CPU mode path.
If the browser is already running in a different mode, it just opens a new tab in the existing instance (can't hot-swap the GPU path while the browser is running).
Desktop launchers (~/.local/share/applications/)
Two .desktop files so I could launch each mode from the app menu:
GPU launcher (librewolf-gpu.desktop):
```ini
[Desktop Entry]
Name=LibreWolf (Max GPU)
Comment=Full GPU acceleration -- WebRender + VA-API
Exec=/home/indra/.local/bin/librewolf-launcher gpu %u
Icon=librewolf
Terminal=false
Type=Application
StartupWMClass=LibreWolf
```
CPU launcher (librewolf-cpu.desktop):
```ini
[Desktop Entry]
Name=LibreWolf (CPU Only)
Comment=No GPU -- software WebRender + CPU video decode
Exec=/home/indra/.local/bin/librewolf-launcher cpu %u
Icon=librewolf
Terminal=false
Type=Application
StartupWMClass=LibreWolf
```
What the launcher showed
GPU mode and CPU mode looked essentially the same. The fans spun up the same
way, htop showed the same CPU load, and about:support didn't show a
meaningful difference in GPU engagement. The browser wasn't reaching the GPU
path at all, even with every preference set correctly and the right environment
variables exported.
That was the clue that the problem wasn't a missing pref or a wrong env var. Something else was blocking the GPU path entirely.
Trying X11
I'd been using Hyprland (Wayland) on my Omarchy setup and liked the ergonomics, but I was already considering switching to an i3-based window manager on X11. Before going that far, I decided to try KDE Plasma on X11 first since that's what this machine was already running. Wayland support for NVIDIA has been improving, but it's still where a lot of GPU-related weirdness lives. X11 is the more battle-tested path for NVIDIA on Linux.
I installed the required packages (plasma-x11-session, kwin-x11,
xorg-server), logged out of Wayland, and switched to the X11 session in SDDM.
What happened on X11
After switching to X11, some things improved immediately. WebRender kicked in properly. VA-API video decoding started working. The GPU process was active. For the first time, the browser was actually offloading work to the GPU.
But WebGL was still dead.
I dug into the configuration across three layers:
- `prefs.js` (runtime-generated, changes when toggling about:config settings)
- `librewolf.overrides.cfg` (user config, survives updates)
- `librewolf.cfg` (system-level defaults at /usr/lib/librewolf/)
prefs.js also had some stale prefs from the Wayland setups that needed cleanup, but the real question was why WebGL specifically was dead.
The overrides.cfg at this stage
The overrides.cfg had GPU and VA-API sections built during earlier troubleshooting, but no WebGL-specific prefs:
```js
// VA-API hardware video decoding
pref('media.ffmpeg.vaapi.enabled', true);
pref('media.hardware-video-decoding.enabled', true);
pref('media.hardware-video-decoding.force-enabled', true);
pref('media.gpu-process-decoder', true);

// GPU acceleration / WebRender
pref('gfx.webrender.all', true);
pref('gfx.webrender.software', false);
pref('gfx.webrender.compositor', true);
pref('layers.acceleration.disabled', false);
pref('layers.acceleration.force-enabled', true);
pref('layers.gpu-process.enabled', true);

// Only had these two — no blocklist bypass, no prompt control
pref('webgl.disabled', false);
pref('dom.webgpu.enabled', true);
```
WebRender and VA-API were working. Hardware video decoding was available. GPU
process was active. But about:support still listed WebGL as "disabled."
That was the next puzzle.
Root cause: two independent blockers
WebGL wasn't off because of one setting. It was off because of two separate systems that each independently block it.
Blocker 1: LibreWolf's site permission system
Around LibreWolf 148.0.2, the project added a per-site WebGL permission system
controlled by librewolf.webgl.prompt. When this is true (the default), the
browser treats WebGL like a sensitive site permission — similar to camera or
microphone access — and prompts before allowing it on each site.
This sounds reasonable in theory. In practice, it caused widespread breakage. The NixOS community hit it first (nixpkgs#503889): their LibreWolf build was missing the translation files for the permission dialog, so the prompt couldn't render and WebGL just silently stayed off everywhere. Even on distros where the dialog does render, if it's never interacted with (or suppressed by some other config), the effect is the same — WebGL is disabled.
The fix is to set librewolf.webgl.prompt to false, which tells LibreWolf to
allow WebGL everywhere without asking.
This does weaken a fingerprinting protection. LibreWolf's features page explicitly lists WebGL as "a strong fingerprinting vector." The per-site permission system exists so sites can't silently use WebGL to fingerprint the GPU. By disabling the prompt, any site gets that access without asking. LibreWolf's Resist Fingerprinting (RFP), which is on by default, mitigates some of this by spoofing the WebGL renderer string and other parameters. But RFP doesn't block every WebGL fingerprinting vector. Anyone who wants per-site control can keep the prompt enabled and grant WebGL access manually per site — but then the prompt actually has to be interacted with.
Blocker 2: Firefox's internal GPU blocklist
Firefox (and therefore LibreWolf) ships a blocklist of GPU + driver combinations that Mozilla has flagged as problematic for certain features. NVIDIA on Linux has entries in this blocklist. The blocklist doesn't check the actual driver version and decide "yeah, this one's fine" — it blocks the combination categorically.
My driver (595.71) runs WebGL perfectly. The blocklist doesn't care. The pref
webgl.ignore-blocklist exists to bypass it, but it's not set by default, and
the overrides.cfg didn't have it.
That's why this was annoying. Fixing only the blocklist bypass leaves the site permission system blocking WebGL. Fixing only the site permission system leaves the blocklist blocking it. Each one looks like the answer if you only investigate one layer.
The fix
Added these prefs to the overrides.cfg:
```js
// WebGL / WebGPU
pref('webgl.disabled', false);
defaultPref('webgl.force-enabled', true);
pref('webgl.ignore-blocklist', true);
defaultPref('librewolf.webgl.prompt', false);
pref('dom.webgpu.enabled', true);
```
This overrides.cfg was not pre-existing. It was built up over the course of this investigation — the VA-API and WebRender sections came from the initial GPU troubleshooting, the WebGL section came from discovering the two blockers. It's managed by chezmoi now so it persists across updates.
Preference reference
| Pref | Type | What it does | Why it's needed |
|---|---|---|---|
| `webgl.disabled` | `pref()` | Master switch. `false` = WebGL on. | LibreWolf's librewolf.cfg already sets this to false, but I include it explicitly so the intent is clear in one place. |
| `webgl.force-enabled` | `defaultPref()` | Forces WebGL on even for GPUs the blocklist would block. | Belt-and-suspenders with `webgl.ignore-blocklist`. The blocklist bypass is the targeted fix; this is the override that doesn't let the blocklist win even if it somehow still applies. |
| `webgl.ignore-blocklist` | `pref()` | Skips the internal GPU blocklist check entirely. | This is the actual fix for the Firefox blocklist. Without it, NVIDIA on Linux gets blocked regardless of driver version. Uses `pref()` so it can't be silently overridden. |
| `librewolf.webgl.prompt` | `defaultPref()` | Controls the per-site WebGL permission prompt. `false` = allow everywhere, no prompt. | Disables the site permission system (blocker 1). Caveat: LibreWolf considers WebGL a strong fingerprinting vector and the prompt is their mitigation — disabling it lets any site use WebGL without asking. RFP mitigates some of this. Uses `defaultPref()` so the prompt can be re-enabled later for per-site control. |
| `dom.webgpu.enabled` | `pref()` | Enables WebGPU (the WebGL successor). | LibreWolf's librewolf.cfg disables this by default. Not part of the WebGL fix, but I wanted it on. |
The difference between pref() and defaultPref():
- `pref()` enforces the value. It overrides any about:config value. Use this for settings that must not change silently.
- `defaultPref()` sets a default that can be overridden in about:config. Use this where user preference should win.
I used pref() for webgl.ignore-blocklist because if the blocklist kicks in,
WebGL silently dies and I'd spend another afternoon wondering why. I used
defaultPref() for librewolf.webgl.prompt because someone might actually want
per-site control later.
Full config file
The complete librewolf.overrides.cfg:
```js
// LibreWolf GPU / Hardware Decoding Overrides
// Managed by chezmoi — do not edit manually

// VA-API hardware video decoding
pref('media.ffmpeg.vaapi.enabled', true);
pref('media.hardware-video-decoding.enabled', true);
pref('media.hardware-video-decoding.force-enabled', true);
pref('media.gpu-process-decoder', true);

// GPU acceleration / WebRender
pref('gfx.webrender.all', true);
pref('gfx.webrender.software', false);
pref('gfx.webrender.compositor', true);
pref('layers.acceleration.disabled', false);
pref('layers.acceleration.force-enabled', true);
pref('layers.gpu-process.enabled', true);

// WebGL / WebGPU
pref('webgl.disabled', false);
defaultPref('webgl.force-enabled', true);
pref('webgl.ignore-blocklist', true);
defaultPref('librewolf.webgl.prompt', false);
pref('dom.webgpu.enabled', true);
```
Verification
After applying the fix, I restarted LibreWolf and checked about:support. Under
Graphics, these should appear:
| Field | Expected value | Before fix |
|---|---|---|
| GPU Process | Active, no FEATURE_FAILURE_* errors | Active (if on X11) |
| WebGL Renderer | NVIDIA GeForce RTX 5070 Ti (or similar) | — (disabled) |
| WebGL2 Renderer | NVIDIA GeForce RTX 5070 Ti (or similar) | — (disabled) |
| Compositing | WebRender | WebRender (was already working) |
| Hardware Video Decoding | Available | Depends on earlier fixes |
Test URLs:
- https://get.webgl.org/ — simple yes/no check
- https://webglsamples.org/aquarium/aquarium.html — actual WebGL rendering
If WebGL still shows as disabled after restarting, check:
- You edited the right file (chezmoi has a source copy and a deployed copy)
- No conflicting prefs in `prefs.js` from manual about:config changes (check about:config for any `webgl.*` or `librewolf.webgl.*` prefs set to unexpected values)
- The browser actually restarted (not just a new window — fully quit and relaunch)
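For the second check, a grep over prefs.js surfaces any lurking WebGL-related user prefs in one shot. This is a sketch — the helper name is mine, and the commented-out path assumes the ~/.librewolf layout; adjust it if your profile lives under ~/.config/librewolf:

```shell
#!/usr/bin/env bash
# Sketch: list WebGL-related user_pref() lines in a prefs.js so stale
# about:config edits that fight the overrides.cfg are easy to spot.
scan_webgl_prefs() {
    grep -E '^user_pref\("(webgl\.|librewolf\.webgl\.|dom\.webgpu\.)' "$1" \
        || echo "no webgl-related user prefs found"
}

# Typical invocation (path is an assumption, adjust to your profile):
# scan_webgl_prefs ~/.librewolf/*.default/prefs.js
```

Anything it prints was set at runtime and wins over `defaultPref()` values from the overrides file, so those lines are the first suspects.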
Known remaining issue: DMABUF_WEBGL
There's a separate blocklist entry for DMABUF_WEBGL, which is the DMA-BUF
based compositor path. This is a Wayland-specific optimization for GPU buffer
sharing between the compositor and the browser. NVIDIA on Linux is also
blocklisted for this feature, and webgl.ignore-blocklist doesn't cover it.
This doesn't affect WebGL functionality itself — it affects how efficiently buffers are shared between the compositor and the GPU process. On X11 this is irrelevant. On Wayland it might matter for performance, but since I moved to X11, I haven't dug into it further.
On Wayland, check about:support for DMABUF_WEBGL status. Fixing it would
likely require patching the blocklist entry in the Firefox source or waiting for
NVIDIA to improve their DMABUF support.
References
- LibreWolf Settings and overrides docs — official docs for `librewolf.overrides.cfg`, including the WebGL permission section
- NixOS nixpkgs#503889 — LibreWolf WebGL breakage from the per-site permission system's missing translation files
- Firefox Bug 1563854 — WebGL accidentally blacklisted for NVIDIA proprietary driver; shows how the blocklist works in practice
- Mozilla Wiki: Blocklisting/Blocked Graphics Drivers — background on Firefox's GPU blocklist system
- SitePoint: Enable WebGL for Blocked Graphics Cards — general guide to `webgl.force-enabled` and the blocklist bypass