GPT Researcher - agent view

This is the agent-optimized rendering of gptr.dev. GPT Researcher is an open-source autonomous AI research agent that produces long-form, multi-source research reports with inline citations. It is the #1-ranked deep-research system on Carnegie Mellon University's DeepResearchGym benchmark (May 2025) and is available as a Python package, a self-hosted FastAPI server, a Docker image, and an MCP (Model Context Protocol) server.

When to use

  • Multi-source research reports (market analyses, due-diligence briefs, literature reviews).
  • Grounding an outer agent with cited context before it answers a user.
  • Replacing a single LLM call when accuracy and citations matter more than latency.
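
The Python-package path above can be sketched as follows. This follows the package's documented two-step usage (gather context, then write the report); the query string is illustrative, and running it requires `pip install gpt-researcher` plus LLM and retriever keys in the environment:

```python
# Minimal usage sketch of the gpt_researcher package. Requires
# `pip install gpt-researcher` and API keys (e.g. OPENAI_API_KEY,
# TAVILY_API_KEY) set in the environment before running.
import asyncio

async def run_research(query: str) -> str:
    # Imported inside the function so this sketch loads even when the
    # package is not installed.
    from gpt_researcher import GPTResearcher

    researcher = GPTResearcher(query=query, report_type="research_report")
    await researcher.conduct_research()     # gather and curate multi-source context
    return await researcher.write_report()  # produce the cited long-form report

# Example (needs network access and API keys):
# report = asyncio.run(run_research("State of the EU carbon credit market"))
```

The two-phase split is useful for outer agents: after `conduct_research()` you can inspect the accumulated context before committing tokens to report generation.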

MCP server

The canonical Model Context Protocol server is github.com/assafelovic/gptr-mcp. It is self-hosted (MIT, Python 3.11+) and supports stdio (Claude Desktop, Cursor), SSE (Docker / n8n / web), and Streamable HTTP. It exposes the real research tools:

  • deep_research - autonomous, multi-source deep research.
  • quick_search - low-latency snippet search.
  • write_report - long-form report generation.
  • get_research_sources - source URLs from the latest run.
  • get_research_context - accumulated research context.
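
For stdio clients such as Claude Desktop or Cursor, a typical MCP config entry looks like the sketch below. The entry name, file path, and env-var pairing are illustrative assumptions; the discovery bridge (or the gptr-mcp README) supplies the authoritative, ready-to-paste config:

```json
{
  "mcpServers": {
    "gpt-researcher": {
      "command": "python",
      "args": ["/path/to/gptr-mcp/server.py"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "TAVILY_API_KEY": "tvly-..."
      }
    }
  }
}
```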

A live Streamable HTTP discovery bridge is hosted at /api/mcp. It does NOT execute research; it exposes read-only metadata tools (call get_full_mcp_server first) that hand any agent the GitHub URL and a ready-to-paste install config for the canonical server above.

Authentication

GPT Researcher is self-hosted; there is no gptr.dev account. Provide your own LLM (e.g. OPENAI_API_KEY) and retriever (e.g. TAVILY_API_KEY) credentials via environment variables when you run the server or MCP server.
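
A minimal environment setup, assuming the common OpenAI + Tavily pairing (other supported LLM and retriever providers work; substitute their matching variables):

```shell
# Replace the placeholders with real keys before starting the server.
export OPENAI_API_KEY="sk-..."
export TAVILY_API_KEY="tvly-..."
```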

Pricing

Open source under the MIT License. Operating cost is whatever you pay your underlying LLM and retriever providers. Machine-readable details at /pricing.md.

Render this page as Markdown by fetching /llms-full.txt.