What is the Promptwatch MCP?
The Promptwatch MCP (Model Context Protocol) server lets AI assistants like Claude, Cursor, and other MCP-compatible clients query your Promptwatch data directly. Ask questions about your brand visibility, citations, AI traffic, and more without leaving your AI workflow.

With the MCP server, you can:
- Query brand visibility across ChatGPT, Claude, Gemini, Perplexity, and AI Overviews
- Analyze citation sources to understand where LLMs get their information
- Track AI-referred traffic to your website
- Compare competitors in AI search results
- Identify content gaps and get recommendations
Getting Started
Step 1: Get Your API Key
- Go to Settings → API Keys in your project dashboard
- Click Create API Key
- Copy and securely store your API key
Step 2: Connect Your MCP Client
The Promptwatch MCP server is remotely hosted; no installation is required. Just add it to your client configuration.

Cursor
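As a rough sketch, a remote MCP server is typically added to Cursor via an `mcpServers` entry in `.cursor/mcp.json`. The server URL below is a placeholder, and the exact auth header shape is an assumption — use the values from your Promptwatch dashboard:

```json
{
  "mcpServers": {
    "promptwatch": {
      "url": "https://mcp.promptwatch.example/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}
```

Replace YOUR_API_KEY with the API key from Step 1, then restart Cursor so it picks up the new server.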
Claude Code
Run the following command in your terminal, replacing YOUR_API_KEY with the API key from Step 1. Then verify the server was added.
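A sketch of what those commands might look like with Claude Code's `claude mcp add` CLI — the server URL here is a placeholder, so substitute the one from the Promptwatch docs:

```shell
# Add the Promptwatch server over HTTP (placeholder URL; auth header
# shape is an assumption -- check the Promptwatch docs for the real values)
claude mcp add --transport http promptwatch https://mcp.promptwatch.example/mcp \
  --header "Authorization: Bearer YOUR_API_KEY"

# Verify the server was added
claude mcp list
```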
Claude Desktop
Claude Desktop does not support remote MCP servers directly. Use the mcp-remote proxy as a workaround.

Add the server to your claude_desktop_config.json, replacing YOUR_API_KEY with the API key from Step 1.

Where to find the config file
You can open it directly from the Claude Desktop app via Settings > Developer > Edit Config. To navigate manually, it lives at:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
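A hedged sketch of the claude_desktop_config.json entry, using mcp-remote to bridge Claude Desktop's stdio transport to the remote server — the URL is a placeholder and the header format is an assumption:

```json
{
  "mcpServers": {
    "promptwatch": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "https://mcp.promptwatch.example/mcp",
        "--header",
        "Authorization: Bearer YOUR_API_KEY"
      ]
    }
  }
}
```

Restart Claude Desktop after saving; the Promptwatch tools should appear in the tools menu.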
Other MCP Clients
Any MCP-compatible client can connect using the remote server URL. If your client does not support remote MCP servers, use the mcp-remote proxy, replacing YOUR_API_KEY with the API key from Step 1.

Step 3: Start Using It
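For stdio-only clients, the proxy can be launched as a local command that forwards to the remote server. A minimal sketch (placeholder URL; header shape is an assumption):

```shell
# Run mcp-remote as a local stdio proxy to the hosted server
npx -y mcp-remote https://mcp.promptwatch.example/mcp \
  --header "Authorization: Bearer YOUR_API_KEY"
```

Point your client's MCP configuration at this command the same way it would reference any local stdio server.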
Once configured, ask your AI assistant questions like:
- “How visible is my brand in AI search results?”
- “Which sources are LLMs citing when they mention us?”
- “Compare our AI visibility against competitors”
- “What content gaps should I address?”
- “How much traffic are we getting from AI tools?”