Using Browser DevTools for Web Scraping: The Complete Guide
Browser DevTools (Developer Tools) is a built-in set of debugging and inspection tools in web browsers. For web scraping, DevTools lets you inspect page structure, find CSS selectors, monitor network requests, and test selectors before writing code.
Opening DevTools
- Chrome/Edge: F12 or Ctrl+Shift+I (Cmd+Option+I on Mac)
- Firefox: F12 or Ctrl+Shift+I
- Safari: Cmd+Option+I (enable in Preferences → Advanced first)
The 4 Tabs Scrapers Use Most
1. Elements Tab (Inspect HTML)
Right-click any element on the page → "Inspect" to jump to its HTML. Use this to:
- Understand the page structure
- Find class names and IDs for selectors
- See how data is organized in the DOM
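Once you have found a stable class or id in the Elements tab, the same selector works from Python. A minimal sketch with BeautifulSoup; the HTML snippet, id, and class names below are made-up examples standing in for what you would see in the Elements tab:

```python
from bs4 import BeautifulSoup

# Made-up HTML, as it might appear in the Elements tab
html = """
<div id="catalog">
  <div class="product-card"><span class="title">Blue Widget</span></div>
  <div class="product-card"><span class="title">Red Widget</span></div>
</div>
"""

soup = BeautifulSoup(html, "html.parser")

# The id and class names come straight from the Elements tab
catalog = soup.select_one("#catalog")
titles = [t.get_text() for t in catalog.select(".product-card .title")]
print(titles)  # → ['Blue Widget', 'Red Widget']
```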
2. Network Tab (Find APIs)
The most powerful tab for scrapers. Filter by XHR/Fetch to see API calls:
- Click a request to see its URL, headers, and response
- Right-click → "Copy as cURL" to get the exact request
- Find the JSON APIs behind JS-rendered pages
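Once you have found the API, you can often skip HTML parsing entirely and work with the JSON directly. A sketch using a made-up response body of the kind you would see in the request's Response pane (in real code you would fetch it with `requests.get(...).json()`):

```python
import json

# Made-up JSON body, copied from the Network tab's Response pane
body = '{"products": [{"name": "Blue Widget", "price": 19.99}]}'

data = json.loads(body)
for product in data["products"]:
    print(product["name"], product["price"])
```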
3. Console Tab (Test Selectors)
Test CSS selectors before writing Python code:

```javascript
// Find all product cards
document.querySelectorAll(".product-card")

// Get text of first title
document.querySelector(".product-card .title").textContent

// Count results
document.querySelectorAll(".product-card").length
```
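Each of those console queries has a direct Python counterpart. A sketch with BeautifulSoup; the HTML and selectors here are illustrative:

```python
from bs4 import BeautifulSoup

# Illustrative HTML standing in for the live page
html = (
    '<div class="product-card"><span class="title">A</span></div>'
    '<div class="product-card"><span class="title">B</span></div>'
)
soup = BeautifulSoup(html, "html.parser")

# document.querySelectorAll(".product-card")
cards = soup.select(".product-card")

# document.querySelector(".product-card .title").textContent
first_title = soup.select_one(".product-card .title").get_text()

# document.querySelectorAll(".product-card").length
count = len(cards)

print(count, first_title)  # → 2 A
```

A selector that returns the right nodes in the Console will usually behave the same way in `select()`, which is why testing there first saves debugging time in Python.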
4. Application Tab (Cookies & Storage)
View and edit cookies, local storage, and session storage. Useful for understanding authentication.

Pro Workflows
Copy a Selector
Right-click an element in the Elements tab → Copy → Copy selector. (Warning: auto-generated selectors are often fragile; simplify them before use.)

Copy as cURL
In the Network tab, right-click a request → Copy → Copy as cURL. Then use https://curlconverter.com to convert the cURL command into Python requests code.
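The converted output is typically a `requests` call with the browser's headers reproduced. A sketch of that shape (the URL and headers are placeholders, not from a real capture); preparing the request without sending it lets you inspect exactly what would go out:

```python
import requests

# Placeholder URL and headers, standing in for curlconverter.com output
url = "https://example.com/api/items"
headers = {
    "User-Agent": "Mozilla/5.0",
    "Accept": "application/json",
    "Referer": "https://example.com/shop",
}

# Prepare (but do not send) the request to inspect it first;
# in real code you would call requests.get(url, headers=headers)
prepared = requests.Request("GET", url, headers=headers).prepare()
print(prepared.method, prepared.url)  # → GET https://example.com/api/items
```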