# Stealth Mode
The `stealth` feature replaces the default reqwest HTTP client with rquest, which spoofs TLS fingerprints and HTTP/2 settings to mimic real browsers. Combined with `BrowserConfig::stealth()`, it also patches JavaScript APIs (`navigator`, `plugins`, WebGL) that bot-detection systems probe.
## Build Requirements
The `stealth` feature compiles BoringSSL from source. You need `cmake` and `nasm` on your build machine:
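For example, on common platforms (package names below are the usual ones; check your distribution):

```shell
# Debian/Ubuntu
sudo apt-get install cmake nasm

# macOS (Homebrew)
brew install cmake nasm

# Windows: install CMake and NASM, and make sure both are on PATH
```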
## Installation
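A sketch of the `Cargo.toml` entry, assuming the crate and feature names used throughout these docs (adjust the version to match your project):

```toml
[dependencies]
kumo = { version = "0.1", features = ["stealth"] }

# For browser-level stealth as well:
# kumo = { version = "0.1", features = ["stealth", "browser"] }
```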
## HTTP-Level Stealth
`StealthHttpFetcher` sends requests with a realistic TLS ClientHello and HTTP/2 `SETTINGS` frame:
```rust
use kumo::prelude::*;

CrawlEngine::builder()
    .stealth(StealthProfile::Chrome131) // spoof Chrome 131 TLS/H2 fingerprint
    .run(MySpider)
    .await?;
```
Available profiles:
| Profile | Mimics |
|---|---|
| `StealthProfile::Chrome131` | Chrome 131 on Windows 10 |
| `StealthProfile::Firefox128` | Firefox 128 ESR |
| `StealthProfile::Safari18` | Safari 18 on macOS Sequoia |
| `StealthProfile::Edge127` | Microsoft Edge 127 |
## Browser-Level Stealth
When combined with the `browser` feature, `BrowserConfig::stealth()` also patches JavaScript APIs:
```rust
CrawlEngine::builder()
    .browser(
        BrowserConfig::new()
            .stealth() // patch navigator, plugins, WebGL, etc.
    )
    .run(MySpider)
    .await?;
```
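To make the patching concrete, here is an illustrative sketch of the kind of JavaScript such a patch injects before any page script runs. The script content and the `STEALTH_PATCH_JS` name are hypothetical; `BrowserConfig::stealth()` ships its own (more thorough) patches.

```rust
/// Illustrative only: an example of the JS a stealth patch injects on every
/// new document, before the page's own scripts execute.
const STEALTH_PATCH_JS: &str = r#"
// Hide the automation flag that most bot detectors check first.
Object.defineProperty(navigator, 'webdriver', { get: () => undefined });
// Headless browsers ship an empty plugin list; fake a plausible one.
Object.defineProperty(navigator, 'plugins', { get: () => [1, 2, 3] });
"#;

fn main() {
    // In practice a script like this is injected via CDP's
    // Page.addScriptToEvaluateOnNewDocument, so it runs before page JS.
    assert!(STEALTH_PATCH_JS.contains("webdriver"));
    println!("patch script is {} bytes", STEALTH_PATCH_JS.len());
}
```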
## When to Use Stealth
Use stealth when the target site:

- Returns 403 or a CAPTCHA to requests with non-browser TLS fingerprints (common with Cloudflare, Akamai, and PerimeterX)
- Checks `navigator.webdriver` or `navigator.plugins` in JavaScript
For most sites, standard HTTP with a realistic User-Agent header is sufficient. Stealth adds significant build time and is only needed for bot-detection-hardened sites.
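As a concrete illustration of that trade-off, a crawler might start without stealth and only escalate when a response looks like a bot-detection block. The helper below is a hypothetical heuristic, not part of the library's API:

```rust
/// Illustrative heuristic: does this response look like a WAF/bot-detection
/// block worth retrying with a stealth profile?
fn looks_blocked(status: u16, body: &str) -> bool {
    // 403/429 are the usual block statuses; the markers below appear on
    // common challenge pages (e.g. Cloudflare's "Just a moment" interstitial).
    let challenge_markers = ["captcha", "cf-challenge", "Just a moment"];
    matches!(status, 403 | 429)
        || challenge_markers.iter().any(|m| body.contains(m))
}

fn main() {
    assert!(looks_blocked(403, ""));
    assert!(looks_blocked(200, "<title>Just a moment...</title>"));
    assert!(!looks_blocked(200, "<html>normal page</html>"));
    println!("heuristic ok");
}
```

A real crawler would cache this decision per domain so only hardened sites pay the stealth overhead.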