Developer hub
GPT Researcher ships across runtimes and clients: a Python SDK, a Docker image, a Model Context Protocol server, and an OpenAPI 3.1 REST spec. Pick the integration path that matches your stack.
Python
pip install gpt-researcher
import asyncio

from gpt_researcher import GPTResearcher

async def main():
    r = GPTResearcher(query="your topic", report_type="research_report")
    await r.conduct_research()
    print(await r.write_report())

asyncio.run(main())
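The snippet above prints the report to stdout; a small sketch of persisting it to disk instead (the `save_report` helper is ours for illustration, not part of the SDK):

```python
import asyncio
from pathlib import Path

def save_report(markdown: str, path: str) -> Path:
    """Write a generated report to disk and return its path."""
    out = Path(path)
    out.write_text(markdown, encoding="utf-8")
    return out

async def main() -> None:
    # Imported lazily so save_report stays usable without the SDK installed.
    from gpt_researcher import GPTResearcher

    r = GPTResearcher(query="your topic", report_type="research_report")
    await r.conduct_research()
    save_report(await r.write_report(), "report.md")

# Requires the SDK plus provider API keys:
# asyncio.run(main())
```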
MCP (Streamable HTTP)
claude_desktop_config.json or mcp-remote
{
"mcpServers": {
"gptr-public": {
"url": "https://gptr.dev/api/mcp"
}
}
}
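The same server entry can be generated programmatically, e.g. when provisioning many clients; a minimal sketch (the `mcp_server_entry` helper is illustrative, not part of any SDK):

```python
import json

def mcp_server_entry(name: str, url: str) -> str:
    """Render a claude_desktop_config.json fragment for a remote MCP server."""
    config = {"mcpServers": {name: {"url": url}}}
    return json.dumps(config, indent=2)

print(mcp_server_entry("gptr-public", "https://gptr.dev/api/mcp"))
```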
Self-hosted MCP (full)
git clone https://github.com/assafelovic/gptr-mcp
cd gptr-mcp
pip install -r requirements.txt
python server.py # stdio
# or:
docker compose up -d # SSE on :8000/sse
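The `:8000/sse` endpoint speaks the standard `text/event-stream` wire format; a minimal parser sketch for those frames (this is generic SSE handling, nothing GPT-Researcher-specific):

```python
def parse_sse(stream: str) -> list[dict]:
    """Split a text/event-stream payload into {'event': ..., 'data': ...} dicts."""
    events = []
    for chunk in stream.strip().split("\n\n"):  # events are blank-line separated
        event, data = "message", []
        for line in chunk.splitlines():
            if line.startswith("event:"):
                event = line[len("event:"):].strip()
            elif line.startswith("data:"):        # multi-line data is joined with \n
                data.append(line[len("data:"):].strip())
        events.append({"event": event, "data": "\n".join(data)})
    return events
```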
REST API
OpenAPI 3.1 - any language
curl -X POST https://your-self-hosted-server/research \
-H "Content-Type: application/json" \
-d '{"query":"What are the strategic risks for NVIDIA?","report_type":"research_report"}'
Verified integrations
GPT Researcher integrates as a tool / MCP server with the major AI agent platforms. Each link below points to the canonical install instructions.
Agent-discovery endpoints
These files let any AI agent discover, understand, and connect to GPT Researcher without a human in the loop. They are kept in sync with the live MCP server at /api/mcp.