How To Detect a WebKit-Based Browser Using JavaScript
Published 13/12/2024
Contents
1 Introduction
Any web developer who likes to make use of modern web standards and who has tested their website on Safari or any other WebKit-based browser[1] will have encountered WebKit-specific bugs. Workarounds for such bugs can often be implemented with JavaScript, but usually involve compromising the quality of the page's layout or its performance. These JS-based workarounds should therefore only be applied when the user is actually impaired by a WebKit-based browser, which raises the question of how to detect whether this is the case. While a quick internet search reveals some answers, they all appear to be outdated and no longer work on modern browsers, or rely on jQuery.

[1]: Such as any iOS or iPadOS browser. Apple mandates that any iOS browser must use WebKit as its browser engine. Europe’s Digital Markets Act is now forcing Apple to allow other browser engines on iOS and iPadOS; however, as usual, Apple is testing the limits of malicious compliance.
2 Detecting WebKit
Before we get into how to detect a WebKit-based browser, it should be noted that browser detection should only be used to work around browser-specific bugs, not to check for the existence of features, which should be done using feature detection instead. Note also that providing different HTML depending on the visitor’s browser is usually considered bad practice. See this MDN article on user agent detection for more information.
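To illustrate the difference, a feature check queries the capability directly rather than guessing from the browser's identity. A minimal sketch (the specific feature queried here, animation-timeline, is just an illustrative choice of a property with uneven browser support):

```javascript
// Feature detection: ask the browser whether it supports the feature itself,
// instead of inferring support from the browser's name.
function supportsScrollDrivenAnimations() {
  // CSS.supports only exists in browser environments, so guard for it first.
  return typeof CSS !== "undefined" &&
         CSS.supports("animation-timeline", "scroll()");
}

if (supportsScrollDrivenAnimations()) {
  // Use the feature.
} else {
  // Provide a fallback.
}
```

This keeps working correctly even when a browser that previously lacked the feature gains it, which is exactly what userAgent sniffing cannot guarantee.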
Detecting WebKit is easier said than done. In an ideal world, each
browser would be honest about what it is, and a simple query of navigator.userAgent
would provide all the information we need. Unfortunately, browsers are
highly dishonest pieces of software and routinely pretend to be one another.
Take Chromium’s userAgent on Linux, for example:
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36
A reasonable person might ask themselves why Chromium’s
userAgent starts with Mozilla and contains the
strings AppleWebKit, KHTML,
Gecko, and Safari. The answer to that question
is a rather long and
convoluted story, for those interested.
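For comparison, here is the Chromium string above next to a typical Safari-on-macOS userAgent (a sketch; the exact version numbers vary between releases). Both contain AppleWebKit, but only Chromium's contains Chrome:

```javascript
// Chromium on Linux (from above) and a representative Safari-on-macOS
// userAgent; version numbers are illustrative and vary between releases.
const chromiumUA =
  "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 " +
  "(KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36";
const safariUA =
  "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 " +
  "(KHTML, like Gecko) Version/17.0 Safari/605.1.15";

// Both claim AppleWebKit...
console.log(/AppleWebKit/.test(chromiumUA)); // true
console.log(/AppleWebKit/.test(safariUA));   // true

// ...but only Chromium mentions Chrome.
console.log(/Chrome/.test(chromiumUA)); // true
console.log(/Chrome/.test(safariUA));   // false
```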
To detect WebKit, we need to find out what makes the
userAgents of WebKit-based browsers unique. As a first
step, we could check for the presence of AppleWebKit in the
userAgent. Unfortunately, this also returns true for
Chromium-based browsers, as the above userAgent shows.
Therefore, we need an additional check to make sure the
userAgent does not contain Chrome. However, since
Chrome on iOS also uses WebKit, checking for the presence of
AppleWebKit and the absence of Chrome may not
suffice, as iOS and iPadOS browsers could slip past this
detection. While the current Chrome userAgents for iOS
don’t seem to contain the string Chrome (they identify
themselves as CriOS instead), it is still worth making sure we
return true for all iOS and iPadOS browsers. This leaves us
with the following JS code:
function isWebKit() {
  const ua = navigator.userAgent;
  // As far as I can tell, Chromium-based desktop browsers are the only browsers
  // that pretend to be WebKit-based but aren't.
  return (/AppleWebKit/.test(ua) && !/Chrome/.test(ua)) ||
         /\b(iPad|iPhone|iPod)\b/.test(ua);
}

if (isWebKit()) {
  // Apply workarounds for WebKit bugs.
}
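To exercise this logic outside a browser, one option (my variation, not part of the original snippet) is to factor the userAgent string out as a parameter. The sample strings below follow the well-known formats for Chromium on Linux and Chrome on iOS; the version numbers are illustrative:

```javascript
// Same logic as isWebKit(), but taking the UA string as a parameter so it
// can be tested against sample strings without a browser.
function isWebKitUA(ua) {
  return (/AppleWebKit/.test(ua) && !/Chrome/.test(ua)) ||
         /\b(iPad|iPhone|iPod)\b/.test(ua);
}

// Chromium on Linux: contains AppleWebKit, but also Chrome, and no iOS
// device token, so it is correctly rejected.
const chromiumLinux =
  "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 " +
  "(KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36";
console.log(isWebKitUA(chromiumLinux)); // false

// Chrome on iOS: identifies as CriOS rather than Chrome, and the iPhone
// token makes the second clause match, so it is correctly accepted.
const chromeIOS =
  "Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) AppleWebKit/605.1.15 " +
  "(KHTML, like Gecko) CriOS/131.0.6778.73 Mobile/15E148 Safari/604.1";
console.log(isWebKitUA(chromeIOS)); // true
```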
I have tested this with the userAgents of most modern
browsers and haven’t yet found a case where it fails, but if
you do, please let me know!