
gemini-deep-research

Trigger Gemini Deep Research via browser and save results to Notion. Use when the user asks to "deep research" a topic, says "gemini deep research", or wants an in-depth research report. Execute ALL steps in the main session (browser tool requires main session access).

Author: admin | Source: ClawHub
Version: V 1.1.0
Security check: passed
Downloads: 149
Favorites: 0


# Gemini Deep Research → Notion

## Execution Mode

**Run ALL steps in the MAIN SESSION. Do NOT spawn a subagent.** The browser tool (OpenClaw managed profile) is only available in the main session. Subagents cannot access the browser, so all browser automation must happen here.

Reply first: "🔬 Deep Research starting for: [topic]. This takes ~25 min. I'll update you when done." Then execute all phases below sequentially.

---

## Instructions

Complete ALL steps below in the main session.

### Phase 1: Trigger Deep Research

1. `browser action=open profile=openclaw targetUrl="https://gemini.google.com/app"`
2. Snapshot, find the text input, and type the research query. **Always prepend "请用中文回答。" ("Please answer in Chinese.") to the query** so the research output is in Chinese.
3. Click the **"工具" (Tools)** button (it has a `page_info` icon), then click **"Deep Research"** in the menu.
4. Click **Send** to submit the query.
5. Wait for the research plan to appear (~10 s), then click the **"Start research"** / **"开始研究"** button.
   - If snapshot-click doesn't work, use JS: `(() => { var btn = Array.from(document.querySelectorAll('button')).find(b => /Start research|开始研究/.test(b.textContent.trim())); if (btn) { btn.click(); return 'clicked'; } return 'not found'; })()`
6. Verify research started: the button should be disabled, and the status should show "Researching X websites..." or "正在研究..." ("Researching...").
7. Save the conversation URL from the browser.

### Phase 2: Wait for Completion

1. Run `exec("sleep 1200")` (20 minutes) + `process(poll, timeout=1200000)`.
2. After waking, check the status via JS: `(() => { var el = document.querySelectorAll('message-content')[1]; return el ? el.innerText.substring(0, 200) : 'NOT_FOUND'; })()`
3. Look for completion signals: "I've completed your research" or "已完成" ("completed").
4. If still running, sleep another 600 s and check again (max 2 retries).
5. If failed or stuck after the retries, announce the failure and exit.

### Phase 3: Extract Report

1. Count the message-content elements: `document.querySelectorAll('message-content').length`
2. The research report is in the LAST `message-content` element (usually index 2).
3. Get the total length: `document.querySelectorAll('message-content')[2]?.innerText?.length`
4. Extract in 8000-char chunks using substring: `document.querySelectorAll('message-content')[N]?.innerText?.substring(START, END)`
5. Concatenate all chunks into the full report text.
6. Save the full report to a temp file: `/tmp/deep_research_<timestamp>.md`

### Phase 4: Export to Notion

**Parent page ID:** `31a4cfb5-c92b-809f-9d8a-dd451718a017` (Deep Research Database)

1. Read the Notion API key: `cat ~/.config/notion/api_key`
2. Parse the report into Notion blocks:
   - Lines starting with `#` → heading_2/heading_3 blocks
   - Bullet points → bulleted_list_item blocks
   - Regular text → paragraph blocks
   - Add a callout at the top: "🔬 Generated by Gemini Deep Research on YYYY-MM-DD"
   - Split rich_text at 2000 chars
3. Create the page via the Notion API:
   ```bash
   curl -s -X POST "https://api.notion.com/v1/pages" \
     -H "Authorization: Bearer $NOTION_KEY" \
     -H "Notion-Version: 2025-09-03" \
     -H "Content-Type: application/json" \
     -d '{"parent":{"page_id":"31a4cfb5-c92b-809f-9d8a-dd451718a017"},"icon":{"type":"emoji","emoji":"🔬"},"properties":{"title":{"title":[{"text":{"content":"TOPIC"}}]}},"children":[BLOCKS]}'
   ```
4. If there are more than 100 blocks, append the remainder via PATCH to `/v1/blocks/{page_id}/children`.
5. Rate limit: wait 0.5 s between batch requests.

### Phase 5: Announce

Report back with:
- Research topic
- Brief summary (2-3 key findings)
- Notion page URL: `https://www.notion.so/<page_id_without_dashes>`

## Notes

- Always use `profile="openclaw"` for the browser.
- Deep Research is under the **"工具" (Tools) menu**, NOT the model selector.
- If Gemini needs login, announce the failure; the user must log in manually.
- The full pipeline should complete in ~25-30 min total.
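The chunked extraction in Phase 3 can be sketched as a small loop. This is a minimal sketch: `fetchChunk` is a hypothetical stand-in for the browser-tool call that evaluates `document.querySelectorAll('message-content')[N]?.innerText?.substring(start, end)` on the page.

```javascript
// Sketch of Phase 3's 8000-char chunked extraction. `fetchChunk(start, end)`
// is a hypothetical stand-in for the browser-side substring call.
const CHUNK_SIZE = 8000;

function extractInChunks(totalLength, fetchChunk) {
  const chunks = [];
  for (let start = 0; start < totalLength; start += CHUNK_SIZE) {
    const end = Math.min(start + CHUNK_SIZE, totalLength);
    chunks.push(fetchChunk(start, end));
  }
  return chunks.join(''); // concatenate into the full report text
}

// Example with an in-memory stand-in for the page's innerText:
const fakeReport = 'x'.repeat(20000);
const report = extractInChunks(fakeReport.length, (s, e) => fakeReport.substring(s, e));
console.log(report.length); // 20000, reassembled from 3 chunks
```

Chunking matters because a single tool call returning the whole report can exceed the tool's output limit; fixed-size substrings keep each call bounded.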
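The markdown-to-blocks parsing in Phase 4 step 2 can be sketched as follows. This is an assumption-laden sketch, not the skill's actual implementation: it covers only headings, bullets, and paragraphs, and the block JSON shapes should be verified against the Notion API version pinned in the curl call.

```javascript
// Sketch of Phase 4's markdown-to-Notion-blocks parsing (headings, bullets,
// paragraphs only). Block shapes follow the Notion API's block object format.
const RICH_TEXT_LIMIT = 2000; // Notion caps each text.content at 2000 chars

function toRichText(text) {
  // Split long lines so every rich_text item stays under the limit.
  const parts = [];
  for (let i = 0; i < text.length; i += RICH_TEXT_LIMIT) {
    parts.push({ type: 'text', text: { content: text.slice(i, i + RICH_TEXT_LIMIT) } });
  }
  return parts;
}

function block(type, text) {
  return { object: 'block', type, [type]: { rich_text: toRichText(text) } };
}

function parseReport(markdown) {
  const blocks = [];
  for (const line of markdown.split('\n')) {
    if (!line.trim()) continue; // skip blank lines
    if (line.startsWith('### ')) {
      blocks.push(block('heading_3', line.replace(/^#+\s*/, '')));
    } else if (line.startsWith('## ') || line.startsWith('# ')) {
      blocks.push(block('heading_2', line.replace(/^#+\s*/, '')));
    } else if (/^[-*]\s+/.test(line)) {
      blocks.push(block('bulleted_list_item', line.replace(/^[-*]\s+/, '')));
    } else {
      blocks.push(block('paragraph', line));
    }
  }
  return blocks;
}

const demo = parseReport('# Findings\n- key point\nPlain paragraph.');
console.log(demo.map((b) => b.type)); // ['heading_2', 'bulleted_list_item', 'paragraph']
```

The resulting array is what goes into `"children":[BLOCKS]` in the create-page call, after prepending the date callout block.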
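The over-100-blocks case in Phase 4 steps 4-5 can be sketched as a batching loop. `appendBatch` here is a hypothetical stand-in for the PATCH request to `/v1/blocks/{page_id}/children`; the first 100 blocks are assumed to have gone in with the create-page call.

```javascript
// Sketch of Phase 4 steps 4-5: append blocks beyond the first 100 in
// batches of 100, pausing 0.5 s between requests for rate limiting.
// `appendBatch` is a hypothetical stand-in for the curl PATCH call.
const BATCH_SIZE = 100;
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function appendInBatches(blocks, appendBatch) {
  // Start at BATCH_SIZE: the first 100 blocks ship with the create call.
  for (let i = BATCH_SIZE; i < blocks.length; i += BATCH_SIZE) {
    await appendBatch(blocks.slice(i, i + BATCH_SIZE));
    await sleep(500); // rate limit between batch requests
  }
}
```

For example, 250 blocks would produce one create call (blocks 0-99) followed by two PATCH batches of 100 and 50 blocks.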

Tags

skill ai

Install via Conversation

This skill can be installed via conversation on the following platforms:

OpenClaw WorkBuddy QClaw Kimi Claude

Method 1: Install SkillHub and the skill

Help me install SkillHub and the gemini-deep-research-notion-1776099181 skill

Method 2: Set SkillHub as the preferred skill installation source

Set SkillHub as my preferred skill installation source, then help me install the gemini-deep-research-notion-1776099181 skill

Install via Command Line

skillhub install gemini-deep-research-notion-1776099181

Download Zip Package

⬇ Download gemini-deep-research v1.1.0

File size: 3.85 KB | Published: 2026-4-17 14:54

v1.1.0 (latest) · 2026-4-17 14:54
Fix: run in main session instead of subagent. Browser tool only works in main session.
