This module provides an automated way to crawl, inspect, and fuzz API endpoints to generate a high-quality OpenAPI specification with latency metrics.
- Crawler (`src/scanner/crawler.py`): Explores a website, renders JavaScript (Playwright), and captures all network traffic.
- Prober (`src/scanner/prober.py`): Actively probes discovered endpoints with multiple requests to gather statistical performance data.
- Core (`mitmproxy2swagger`): Converts the captured traffic (HAR/flow) into an OpenAPI spec.
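The "statistical performance data" the prober gathers can be illustrated with a small latency-summary sketch. The metric names here are illustrative assumptions, not the project's actual output format; the real logic lives in `src/scanner/prober.py`:

```python
# Sketch: summarize per-endpoint response-time samples (metric names are assumed).
import statistics


def latency_summary(samples_ms: list[float]) -> dict[str, float]:
    """Reduce a list of latency samples (ms) to a few summary statistics."""
    ordered = sorted(samples_ms)
    return {
        "min": ordered[0],
        "p50": statistics.median(ordered),
        "max": ordered[-1],
        "mean": statistics.fmean(ordered),
    }


print(latency_summary([120.0, 95.0, 110.0, 300.0, 101.0]))
```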
Run the full active scan with a single command:

```bash
uv run scanner https://example.com
```

This will automatically:

- Crawl the website to discover endpoints.
- Start a proxy (mitmdump) in the background.
- Probe/fuzz the endpoints through the proxy.
- Generate the final OpenAPI spec (`final_spec.yaml`).
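The steps above can be sketched as a simple pipeline. These are stub functions standing in for the real modules (the actual crawling, proxying, and conversion live in `src/scanner/` and `mitmproxy2swagger`), so only the data flow is real:

```python
# Conceptual pipeline only -- each stub stands in for one of the components above.
def crawl(url: str, depth: int = 2) -> list[str]:
    # Stub: the real crawler renders pages with Playwright and records network calls.
    return [f"{url}/api/users", f"{url}/api/orders"]


def probe(endpoints: list[str]) -> dict[str, dict]:
    # Stub: the real prober replays requests through the mitmdump proxy.
    return {ep: {"samples": 5} for ep in endpoints}


def build_spec(traffic: dict[str, dict]) -> dict:
    # Stub: mitmproxy2swagger converts the captured traffic into an OpenAPI document.
    return {"openapi": "3.0.0", "paths": {ep: {} for ep in traffic}}


spec = build_spec(probe(crawl("https://example.com")))
print(sorted(spec["paths"]))
```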
Using Headers (e.g., Bearer Token):

```bash
uv run scanner https://api.example.com \
  --header "Authorization: Bearer YOUR_TOKEN"
```

Using Cookies (e.g., Session ID):

```bash
uv run scanner https://dashboard.example.com \
  --cookie "session_id=xyz123"
```

You can still customize the run:
```bash
uv run scanner https://example.com \
  --depth 3 \
  --proxy-port 8081 \
  --final-spec my_api.yaml
```

- Python 3.10+
- uv package manager
- Playwright (`uv run playwright install`)
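A quick sanity check for these prerequisites can be sketched in Python. This helper is not part of the project; it just checks the Python version requirement and whether `uv` is on `PATH`:

```python
# Sketch: verify the prerequisites listed above (hypothetical helper, not project code).
import shutil
import sys


def missing_prereqs(version_info=sys.version_info, which=shutil.which) -> list[str]:
    """Return a message for each unmet prerequisite; an empty list means all good."""
    problems = []
    if version_info < (3, 10):
        problems.append("Python 3.10+ is required")
    if which("uv") is None:
        problems.append("uv is not on PATH")
    return problems


print(missing_prereqs())
```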
- Install `uv` (if not already installed):

  ```bash
  curl -LsSf https://astral.sh/uv/install.sh | sh
  ```

- Sync dependencies:

  ```bash
  uv sync
  ```

- Install Playwright browsers:

  ```bash
  uv run playwright install
  ```
To ensure code quality, we use `ruff` and `pre-commit`.

- Install pre-commit hooks:

  ```bash
  uv run pre-commit install
  ```

- Run linting manually (optional):

  ```bash
  uv run ruff check .
  uv run ruff format .
  ```