# AI-Powered Reverse Engineering: Uncovering Hidden APIs in Minutes with Browser DevTools

Public data portals—from government dashboards to interactive maps—often bury valuable datasets behind slick JavaScript interfaces. Without download buttons or API docs, developers resort to manual inspection in Chrome DevTools, sifting through network requests for hidden endpoints. This reverse engineering can consume entire afternoons: spotting XHR/Fetch calls amid asset loads, testing parameters, and documenting pagination logic.

Enter AI coding agents paired with browser automation. A tutorial by data journalist Rui Barros demonstrates how Claude, Anthropic's AI model, leverages Chrome DevTools MCP (Model Context Protocol) to automate the process. In minutes, it navigates portals, interacts with filters, monitors traffic, and generates full API documentation plus client code—eliminating the grunt work for developers, analysts, and journalists.

## The Hidden API Challenge in Modern Web Apps

Most data portals mimic REST APIs under the hood, serving data incrementally to avoid browser overload. Rui Barros uses the PokéAPI as an analogy:

```
https://pokeapi.co/api/v2/pokemon/1  # Returns Bulbasaur JSON
https://pokeapi.co/api/v2/pokemon/25 # Returns Pikachu JSON
```

Change the ID, and you get structured data.
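
To make the pattern concrete, here is a minimal R sketch against the public PokéAPI, using the jsonlite package (an illustration, not code from the article):

```r
library(jsonlite)

# Each record is just a URL with a different ID at the end
pikachu <- fromJSON("https://pokeapi.co/api/v2/pokemon/25")
pikachu$name    # "pikachu"
pikachu$weight  # structured fields, no scraping required
```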

Real-world portals like the European Air Quality Index (EEA) do the same but hide endpoints. Manual discovery involves:

1. Opening DevTools > Network tab.
2. Interacting with pagination, searches, or filters.
3. Filtering for XHR/Fetch requests.
4. Reverse-engineering undocumented parameters.

AI flips this script by simulating human interaction at scale.

## Chrome DevTools MCP: AI Meets Browser Automation

MCP bridges large language models like Claude with external tools, here enabling programmatic Chrome control. The workflow requires:

- Claude Pro subscription ($20/month).
- Node.js v16+.
- Chrome browser.

Installation is straightforward:

```bash
npm install -g @anthropic-ai/claude-code
claude-code auth
claude mcp add chrome-devtools npx chrome-devtools-mcp@latest
```

Verification opens Chrome to a test URL, confirming AI-driven navigation.

## Building a Reusable `/discover-api` Command

The magic lies in Claude Code's custom slash commands. Create `.claude/commands/discover-api.md`:

```markdown
---
description: Reverse engineer and document hidden APIs in any web portal
---

I need you to reverse engineer the API for {{url}}.

1. Open the URL in Chrome using Chrome DevTools MCP
2. Take a snapshot of the page structure
3. Interact with search filters, date pickers, pagination controls
4. Monitor all network requests while you interact
5. Identify the API endpoints being called
6. For each endpoint, document: Full URL pattern, HTTP method, parameters, response structure, pagination
7. Create example requests using curl and {{language:R}}

Focus on endpoints that return data (JSON). Ignore assets.

Output: `API_DOCUMENTATION.md` and code examples in the specified language.
```

Invoke with:

```
claude
/discover-api url=https://airindex.eea.europa.eu/AQI/index.html language=Python
```

## Real-World Results: EEA Air Quality Portal

Applied to Europe's air quality map, Claude uncovered:

- **Station Metadata**: All monitoring stations with coordinates.
- **Hourly AQI Data**: Historical and forecast values.
- **Geocoding**: Location search.
- **Map Tiles**: ArcGIS rasters.
- **Boundaries**: GeoJSON polygons.

It produced `API_DOCUMENTATION.md` with schemas, examples, and edge cases, plus 10 R scripts for fetching, joining, visualizing, and exporting data—complete with Haversine distance calculations and ggplot2 plots.

```r
# Example: Fetch the current AQI for a monitoring station,
# completed with basic error handling and JSON parsing via httr/jsonlite
get_aqi_data <- function(station_id) {
  url <- sprintf("https://airindex.eea.europa.eu/api/stations/%s/aqi", station_id)
  resp <- httr::GET(url)
  httr::stop_for_status(resp)
  jsonlite::fromJSON(httr::content(resp, as = "text", encoding = "UTF-8"))
}
```
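
The Haversine step lends itself to a short illustration. The sketch below is not the generated scripts themselves; it assumes a `stations` data frame with `latitude` and `longitude` columns, as would come from the station-metadata endpoint:

```r
# Haversine great-circle distance in kilometres between two lat/lon points
haversine_km <- function(lat1, lon1, lat2, lon2) {
  to_rad <- pi / 180
  dlat <- (lat2 - lat1) * to_rad
  dlon <- (lon2 - lon1) * to_rad
  a <- sin(dlat / 2)^2 +
    cos(lat1 * to_rad) * cos(lat2 * to_rad) * sin(dlon / 2)^2
  6371 * 2 * asin(sqrt(pmin(1, a)))
}

# Hypothetical usage: find the monitoring station closest to a point of interest,
# given a `stations` data frame with coordinate columns
nearest_station <- function(stations, lat, lon) {
  d <- haversine_km(lat, lon, stations$latitude, stations$longitude)
  stations[which.min(d), ]
}
# nearest_station(stations, 38.7169, -9.1399)  # e.g. central Lisbon
```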

## Implications for Developers and Data Teams

This technique scales beyond journalism. Developers building integrations can bypass scraping in favor of clean API calls. Security researchers gain rapid endpoint mapping for vulnerability hunting. Data engineers automate ETL from undocumented sources.
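
That last point can be made concrete with a hedged sketch that reuses the `get_aqi_data()` helper from above; the station IDs are placeholders, not real EEA identifiers, and the code assumes each response flattens to a one-row record:

```r
# Minimal ETL step: fetch AQI for a list of stations and export a CSV snapshot
station_ids <- c("STA-0001", "STA-0002")  # placeholder IDs for illustration

aqi_table <- do.call(rbind, lapply(station_ids, function(id) {
  rec <- get_aqi_data(id)                 # assumed to return a flat JSON record
  data.frame(station_id = id, as.data.frame(rec))
}))

write.csv(aqi_table, "aqi_snapshot.csv", row.names = FALSE)
```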

Limitations exist: authentication walls, CAPTCHAs, or heavy server-side rendering can block the approach, pushing you back to Playwright or Selenium scripting. Barros emphasizes ethical use: respect rate limits, review the terms of service, and prioritize public data.

As AI agents integrate deeper into dev workflows, tools like Chrome DevTools MCP signal a shift. Reverse engineering, once a black art, becomes a one-command routine, freeing engineers to focus on value over drudgery.

Source: Finding Hidden APIs Using AI by Rui Barros, November 8, 2025.