Eight steps from white page to night reading.
No black box, no AI hand-waving. Just deterministic color math executed in milliseconds on every page you visit.
On document_start, LumenShade scans every element and reads its computed color, background, border, fill, stroke, and shadows — including pseudo-elements.
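A sketch of what that scan pass looks like. The property list mirrors the one above; `readColors` is a hypothetical helper, not LumenShade's actual API:

```typescript
// Color-bearing properties the scan reads from each element's computed style.
const COLOR_PROPS = [
  "color", "background-color", "border-top-color", "border-right-color",
  "border-bottom-color", "border-left-color", "fill", "stroke", "box-shadow",
] as const;

type StyleLookup = (prop: string) => string;

// Pull every color-bearing declaration from one element's computed style,
// skipping fully transparent or unset values — they need no remapping.
function readColors(getValue: StyleLookup): Record<string, string> {
  const out: Record<string, string> = {};
  for (const prop of COLOR_PROPS) {
    const v = getValue(prop);
    if (v && v !== "none" && v !== "rgba(0, 0, 0, 0)") out[prop] = v;
  }
  return out;
}

// In the extension this would run over every element and its pseudo-elements:
//   for (const el of document.querySelectorAll("*")) {
//     for (const pseudo of ["", "::before", "::after"]) {
//       const cs = getComputedStyle(el, pseudo || undefined);
//       readColors((p) => cs.getPropertyValue(p));
//     }
//   }
```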
Every color is converted from sRGB into OKLab, then expressed as OKLCH, its polar form — a perceptually uniform color space built to match how human vision actually perceives lightness.
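The conversion itself is a handful of matrix multiplies, using Björn Ottosson's published OKLab coefficients. Inputs are sRGB channels in 0..1:

```typescript
// Undo the sRGB gamma curve.
function srgbToLinear(c: number): number {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

function srgbToOklch(r: number, g: number, b: number) {
  const [lr, lg, lb] = [r, g, b].map(srgbToLinear);
  // Linear sRGB → LMS cone response.
  const l = 0.4122214708 * lr + 0.5363325363 * lg + 0.0514459929 * lb;
  const m = 0.2119034982 * lr + 0.6806995451 * lg + 0.1073969566 * lb;
  const s = 0.0883024619 * lr + 0.2817188376 * lg + 0.6299787005 * lb;
  // Cube-root nonlinearity, then LMS → OKLab.
  const [l_, m_, s_] = [l, m, s].map(Math.cbrt);
  const L = 0.2104542553 * l_ + 0.793617785 * m_ - 0.0040720468 * s_;
  const a = 1.9779984951 * l_ - 2.428592205 * m_ + 0.4505937099 * s_;
  const bb = 0.0259040371 * l_ + 0.7827717662 * m_ - 0.808675766 * s_;
  // OKLab → OKLCH: chroma is the a/b radius, hue is the a/b angle.
  return { L, C: Math.hypot(a, bb), H: (Math.atan2(bb, a) * 180) / Math.PI };
}
```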
Each color is tagged: page background, surface, primary text, muted text, border, brand accent, link, or media. Role determines treatment.
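A simplified version of those role heuristics — the real classifier weighs more signals (element area, link ancestry, stacking context), so treat the thresholds here as illustrative assumptions:

```typescript
type Role = "page-bg" | "surface" | "text" | "muted-text" | "border"
          | "accent" | "link" | "media";

// `chroma` and `lightness` are the OKLCH C and L of the color in question.
function classify(opts: {
  prop: string;            // which CSS property the color came from
  tag: string;             // lowercase tag name
  chroma: number;          // OKLCH C
  lightness: number;       // OKLCH L
  coversViewport: boolean; // does the element span the whole page?
}): Role {
  const { prop, tag, chroma, lightness, coversViewport } = opts;
  if (["img", "video", "canvas", "iframe"].includes(tag)) return "media";
  if (prop.startsWith("border")) return "border";
  if (prop === "background-color") return coversViewport ? "page-bg" : "surface";
  // Foreground colors: links first, then chroma splits brand from text.
  if (tag === "a") return "link";
  if (chroma > 0.07) return "accent";
  return lightness > 0.45 ? "muted-text" : "text";
}
```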
Backgrounds get a deep warm dark (L≈0.16, low chroma). Text moves to soft paper (L≈0.94). Brand colors keep hue and chroma — only lightness shifts to land on the right side of legible.
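Sketched as a per-role lightness table. The L≈0.16 and L≈0.94 targets come from the text above; the surface, muted, and chroma-cap values are illustrative assumptions:

```typescript
type Oklch = { L: number; C: number; H: number };

function remap(role: string, c: Oklch): Oklch {
  switch (role) {
    case "page-bg":    return { L: 0.16, C: Math.min(c.C, 0.02), H: c.H }; // deep warm dark
    case "surface":    return { L: 0.21, C: Math.min(c.C, 0.03), H: c.H };
    case "text":       return { L: 0.94, C: Math.min(c.C, 0.02), H: c.H }; // soft paper
    case "muted-text": return { L: 0.72, C: Math.min(c.C, 0.03), H: c.H };
    default:
      // Brand accents and links: keep hue and chroma, lift L so the
      // color stays readable on a dark background.
      return { L: Math.max(c.L, 0.7), C: c.C, H: c.H };
  }
}
```

Hue is preserved in every branch — a warm page stays warm in the dark theme.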
After remap, each text-on-background pair is scored using APCA — the contrast method developed for WCAG 3. If a pair fails, lightness is bumped until it passes.
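A condensed sketch of the APCA contrast score, using constants from the public APCA-W3 algorithm (production code should use the `apca-w3` package rather than this hand-rolled version). Inputs are linear-ish sRGB channels in 0..1:

```typescript
function apcaLc(txt: [number, number, number], bg: [number, number, number]): number {
  // Screen luminance with APCA's 2.4-exponent transfer curve.
  const Y = ([r, g, b]: [number, number, number]) =>
    0.2126729 * r ** 2.4 + 0.7151522 * g ** 2.4 + 0.072175 * b ** 2.4;
  // Soft clamp near black so very dark colors don't over-report contrast.
  const clamp = (y: number) => (y > 0.022 ? y : y + (0.022 - y) ** 1.414);
  const yTxt = clamp(Y(txt));
  const yBg = clamp(Y(bg));
  const sapc =
    yBg > yTxt
      ? (yBg ** 0.56 - yTxt ** 0.57) * 1.14  // dark text on light background
      : (yBg ** 0.65 - yTxt ** 0.62) * 1.14; // light text on dark background
  if (Math.abs(sapc) < 0.1) return 0;        // below the noise floor
  return (sapc > 0 ? sapc - 0.027 : sapc + 0.027) * 100;
}
```

Black-on-white scores about Lc 106; the remap loop would nudge OKLCH L until the pair clears the threshold for its text size.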
Images, videos, canvas, and iframes are excluded from color remapping. Instead, a soft brightness/contrast filter takes the harsh edge off — your photos still look like your photos.
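The media treatment reduces to one CSS rule. The exact brightness/contrast values here are illustrative, not necessarily the shipped ones:

```typescript
const MEDIA_SELECTOR = "img, video, canvas, iframe";

// Build the softening filter applied to media elements instead of remapping.
function mediaFilterRule(brightness = 0.9, contrast = 0.95): string {
  return `${MEDIA_SELECTOR} { filter: brightness(${brightness}) contrast(${contrast}); }`;
}
```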
A minimal background-color rule injects synchronously so the page never flashes white before remap finishes.
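The anti-flash guard is a tiny stylesheet injected before first paint. The color value is illustrative; the DOM injection runs in a content script registered at document_start:

```typescript
// Minimal CSS that holds the page dark until the full remap lands.
function foucGuardCss(bg = "oklch(0.16 0.02 60)"): string {
  return `:root { background-color: ${bg} !important; color-scheme: dark; }`;
}

// In the content script (manifest: "run_at": "document_start"):
//   const style = document.createElement("style");
//   style.textContent = foucGuardCss();
//   document.documentElement.appendChild(style);
```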
A debounced MutationObserver re-runs classification on dynamically added nodes — single-page apps stay consistent.
HSL is mathematically convenient but perceptually broken — equal changes in lightness don't look equal to your eyes. A pure yellow at HSL lightness 50% looks far brighter than a pure blue at the same value. OKLab, the space behind OKLCH, was published in 2020 by Björn Ottosson to fix this: equal L means equal perceived brightness, regardless of hue. That's why the same algorithm produces uniformly comfortable results across every color on every site.
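The yellow-versus-blue claim is easy to check in code. Both colors sit at HSL lightness 50%, but OKLab's L exposes the real perceived gap (matrices from Ottosson's reference implementation; the gamma step is omitted because every channel here is exactly 0 or 1, where it's a no-op):

```typescript
// OKLab lightness of a *linear* sRGB color.
function oklabL(r: number, g: number, b: number): number {
  const l = Math.cbrt(0.4122214708 * r + 0.5363325363 * g + 0.0514459929 * b);
  const m = Math.cbrt(0.2119034982 * r + 0.6806995451 * g + 0.1073969566 * b);
  const s = Math.cbrt(0.0883024619 * r + 0.2817188376 * g + 0.6299787005 * b);
  return 0.2104542553 * l + 0.793617785 * m - 0.0040720468 * s;
}

const yellow = oklabL(1, 1, 0); // ≈ 0.97 — almost as bright as white
const blue = oklabL(0, 0, 1);   // ≈ 0.45 — barely past mid-gray
```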