● Official · tavily-ai 🔑 Requires your API key

Tavily MCP

By tavily-ai · tavily-ai/tavily-mcp

Tavily MCP gives your agent web search, page extraction, site mapping, and crawling, pre-formatted for LLMs so you don't burn tokens on scraped markup.

Tavily is a search API built for AI agents: answers come back as clean text with sources, not 50 KB of HTML. The MCP server exposes four tools (search, extract, map, crawl) that you can chain into real research workflows. Requires a free API key from tavily.com. Works out of the box in Claude Desktop, Cursor, Windsurf, and Claude Code; install via npx.


Installation

Pick your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "tavily-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "tavily-mcp@latest"
      ],
      "env": {
        "TAVILY_API_KEY": "tvly-..."
      }
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart the app after saving.

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "tavily-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "tavily-mcp@latest"
      ],
      "env": {
        "TAVILY_API_KEY": "tvly-..."
      }
    }
  }
}

Cursor uses the same mcpServers format as Claude Desktop. Project-level config takes precedence over global.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "tavily-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "tavily-mcp@latest"
      ],
      "env": {
        "TAVILY_API_KEY": "tvly-..."
      }
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then choose "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "tavily-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "tavily-mcp@latest"
      ],
      "env": {
        "TAVILY_API_KEY": "tvly-..."
      }
    }
  }
}

Same format as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "tavily-mcp",
      "command": "npx",
      "args": [
        "-y",
        "tavily-mcp@latest"
      ]
    }
  ]
}

Continue uses an array of server objects, not a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "tavily-mcp": {
      "command": {
        "path": "npx",
        "args": [
          "-y",
          "tavily-mcp@latest"
        ]
      }
    }
  }
}

Add it under context_servers. Zed hot-reloads on save.

claude mcp add tavily-mcp -- npx -y tavily-mcp@latest

One command and you're done. Verify with claude mcp list; remove with claude mcp remove tavily-mcp.

Use cases

Tavily MCP in practice

Answer questions about events after the model's training cutoff

👤 Anyone using Claude to ask questions that need fresh information ⏱ ~5 min · beginner

When to use: You're asking about a 2026 release, a recent CVE, a new pricing page, or today's market. The model doesn't know; it has to go look.

Prerequisites
  • Tavily API key — sign up at tavily.com (free tier = 1,000 calls/month)
  • Tavily MCP installed — paste the config block above into your client's MCP settings
Steps
  1. Ask directly
    What did Anthropic ship in Claude Sonnet 4.7 this month? Use Tavily to find the announcement and summarize with sources.
    → Agent calls tavily_search, returns a summary with linked sources
  2. Drill in on one source
    The second source looks most authoritative — use tavily_extract to pull its full text and quote the exact line about context window.
    → Direct quote with URL + paragraph number

Result: A current, cited answer in one turn, with no manual Googling.

Gotchas
  • Search returned SEO junk first — Add site hints: '... from anthropic.com or anthropic's official blog'
  • Summaries drift from sources — Require direct quotes — 'paraphrase but preserve numbers, dates, names exactly'
Pairs well with: filesystem · memory

Do a competitive product scan in one session

👤 PMs, founders, marketers ⏱ ~30 min intermediate

When to use: You need a one-page brief on every competitor for a given feature category by end of day.

Steps
  1. Discover competitors
    Using Tavily, find the top 8 products competing with us in 'AI-native CRM for SMB'. For each, return name, URL, one-liner, founding year.
    → Structured 8-row table with source links
  2. Map each site
    For each competitor, tavily_map their site to find their pricing and features pages. Return the URLs.
    → 2 URLs per competitor
  3. Extract pricing
    tavily_extract each pricing page and build a comparison grid: plan name, monthly price, top 3 differentiators.
    → Clean grid; cells cite the pricing page URL

Result: A sharable brief with sources — ready for a PMM slide in 30 minutes.

Gotchas
  • Pricing JS-rendered and extract misses it — Fall back to tavily_crawl with render=on, or hit the /pricing sitemap directly
Pairs well with: filesystem

Write a tutorial with live-verified links

👤 Technical writers, DevRel ⏱ ~25 min intermediate

When to use: You're publishing a how-to and every external link has to resolve to the right content today.

Steps
  1. Collect candidate refs
    Using tavily_search, find the top 5 canonical docs pages for 'OAuth 2.1 PKCE flow'. Prefer RFCs and vendor docs over blogs.
    → 5 URLs with a short rationale each
  2. Verify each one
    tavily_extract each URL. For each, confirm the page still covers PKCE and flag any that look redirected or stale.
    → Per-URL live verdict
  3. Embed in the draft
    Rewrite my draft tutorial to cite only the verified URLs, with anchor text that matches the page's actual heading.
    → Updated draft; every link text matches the real page heading

Result: Published tutorial with zero dead links and accurate anchor text.

Pairs well with: filesystem

Combos

Pair with other MCPs for 10x leverage

tavily-mcp + filesystem

Search, extract into disk, then analyze locally without re-fetching

Search Tavily for recent OWASP top-10 sources, extract them, save to /research/owasp/, then compare the content offline.
tavily-mcp + memory

Build a research journal that persists between sessions

For each Tavily search, save a one-line note and the URLs to memory under 'project:acme'. Next session, reuse.
tavily-mcp + context7

Tavily for web context + Context7 for library docs — don't confuse them

Use Context7 for docs questions; Tavily for news, blog posts, and anything not in library indexes.

Tools

Capabilities this MCP exposes

tavily_search
  Inputs: query: str, max_results?: int, search_depth?: 'basic'|'advanced', include_domains?: str[]
  When: Primary tool — one query, LLM-ready snippets with URLs
  Cost: 1 API call

tavily_extract
  Inputs: urls: str[], extract_depth?: 'basic'|'advanced'
  When: You already have a URL and want clean text — no HTML, no ads
  Cost: 1 API call per URL

tavily_map
  Inputs: url: str, max_depth?: int, categories?: str[]
  When: Discover a site's structure — useful before extract/crawl
  Cost: 1 API call

tavily_crawl
  Inputs: url: str, max_depth?: int, limit?: int, instructions?: str
  When: Broad ingest of a small site or doc section — expensive, prefer extract when you already know the URLs
  Cost: Multiple API calls (one per page)
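The "when" column above boils down to a simple decision rule. A minimal sketch, as a hypothetical helper that merely encodes the table (it is not part of the MCP server):

```python
def pick_tool(have_urls: bool, need_whole_site: bool) -> str:
    """Illustrative heuristic for choosing among the four Tavily tools."""
    if not have_urls:
        return "tavily_search"   # no URLs yet: one query, one call
    if need_whole_site:
        return "tavily_crawl"    # known site, need breadth: N calls
    return "tavily_extract"      # known URLs, need clean text: 1 call/URL

# tavily_map fits between search and crawl when you need structure first.
choice = pick_tool(have_urls=True, need_whole_site=False)
```

The design point: extraction and crawling assume you already know where to look, so URL knowledge, not query phrasing, is the first branch.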

Cost & limits

What it costs to run

API quota
Free tier = 1,000 API calls/month; scaling plans from $30/mo
Tokens per call
Returns ~500–5000 tokens of clean content — much less than raw HTML would
Cost
Free tier covers individual daily use; heavy workflows need paid
Tip
Prefer tavily_search over tavily_crawl — search is one call, crawl is N. Only crawl when you truly need breadth.
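The one-versus-N point can be made concrete with back-of-envelope call accounting (illustrative arithmetic only, not billed prices):

```python
def estimated_calls(tool: str, n_urls: int = 0, n_pages: int = 0) -> int:
    """Rough API-call count per tool invocation, per the tools table."""
    counts = {
        "tavily_search": 1,
        "tavily_map": 1,
        "tavily_extract": n_urls,   # one call per URL
        "tavily_crawl": n_pages,    # one call per crawled page
    }
    return counts[tool]

# One search vs. crawling a 40-page docs site: 1 call vs. 40 calls,
# i.e. 4% of a 1,000-call free tier spent on a single crawl.
search_cost = estimated_calls("tavily_search")
crawl_cost = estimated_calls("tavily_crawl", n_pages=40)
```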

Security

Permissions, secrets, blast radius

Credential storage: TAVILY_API_KEY in an env var (set in the MCP config's env block)
Data egress: Queries and URLs you pass are sent to api.tavily.com. Don't paste proprietary info into the query string.

Troubleshooting

Common errors and fixes

401 Unauthorized

Double-check TAVILY_API_KEY in your MCP config. The env block lives inside the server config, not at top level.

Verify: Call any Tavily tool; if the error persists, rotate the key in the tavily.com dashboard
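For comparison, the correct placement looks like this: env nested inside the server entry, mirroring the install snippets above. If env sits at the top level of the file instead, the server typically starts without the key and every call fails with 401.

```json
{
  "mcpServers": {
    "tavily-mcp": {
      "command": "npx",
      "args": ["-y", "tavily-mcp@latest"],
      "env": { "TAVILY_API_KEY": "tvly-..." }
    }
  }
}
```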
Empty results despite a real query

Switch search_depth from 'basic' to 'advanced' for niche topics; add include_domains to bias toward authoritative sources

Verify: Repeat with search_depth: 'advanced'
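If you escalate the retry by hand (e.g., against the raw API rather than letting the agent re-prompt), the arguments might look like this. search_depth and include_domains are real tavily_search parameters per the tools table; the helper itself is hypothetical:

```python
def refine_search_args(query: str, trusted_domains: list) -> dict:
    """Escalate a failed 'basic' search: deeper pass plus a domain bias.
    Hypothetical helper; argument names follow the tools table."""
    return {
        "query": query,
        "search_depth": "advanced",          # deeper pass for niche topics
        "include_domains": trusted_domains,  # bias toward authoritative sources
        "max_results": 10,
    }

args = refine_search_args("OAuth 2.1 PKCE flow", ["datatracker.ietf.org"])
```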
tavily_extract returns paywalled gibberish

Tavily follows robots.txt and respects paywalls. For paywalled content, note it's unreachable — don't try to bypass.

429 Rate limit

Free tier = 60 RPM. Space out calls, or upgrade at tavily.com. The MCP auto-backs-off once, then surfaces the error.

Verify: Check usage in the tavily.com dashboard
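The described behavior (back off once, then surface the error) can be sketched as a retry wrapper. RateLimitError is a stand-in class for illustration, not the MCP's actual exception type:

```python
import time

class RateLimitError(Exception):
    """Stand-in for an HTTP 429 from api.tavily.com."""

def call_with_single_backoff(fn, wait_seconds: float = 1.0):
    """Retry exactly once after a rate limit, then let the error surface."""
    try:
        return fn()
    except RateLimitError:
        time.sleep(wait_seconds)
        return fn()  # a second 429 propagates to the caller

# Demo: a call that is rate-limited once, then succeeds.
attempts = []
def flaky_call():
    attempts.append(1)
    if len(attempts) == 1:
        raise RateLimitError("429")
    return "ok"

result = call_with_single_backoff(flaky_call, wait_seconds=0.0)
```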

Alternatives

Tavily MCP vs. the alternatives

exa-mcp-server
  When to use instead: You want Exa's neural/embedding-style search and similarity
  Trade-off: Different result quality on different query types; both are LLM-tuned

fetch
  When to use instead: You have the URL and just want HTML → markdown, no search
  Trade-off: No search; no structured extract; you handle the URL discovery

perplexity-ask
  When to use instead: You want a research-level answer synthesized by Perplexity, not raw results
  Trade-off: Higher latency, higher cost, less composable — it's one big tool

firecrawl-mcp-server
  When to use instead: You need heavy crawling or JS rendering as a core workflow
  Trade-off: Pricier; overkill for one-off lookups

More

Resources

📖 Read the official README on GitHub

🐙 Browse the open issues

🔍 Browse all 400+ MCP servers and Skills