A GitHub code agent focused on Pull Request / diff review, with repository health scanning as a secondary capability.
Current backend capabilities:
- GitHub repository and PR snapshot fetching
- Dual analysis engines: Repo and Pull Request
- PostgreSQL report persistence
- Pipeline orchestration and Feishu (Lark) push notifications
- LangChain Agent with tool calling
- frontend: Next.js 16, React 19, Tailwind CSS 4
- backend: NestJS 11
- database: PostgreSQL + MikroORM
- agent: LangChain + an OpenAI-compatible chat model
- workspace: pnpm workspace
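The layout above suggests a `pnpm-workspace.yaml` roughly like the following (a sketch only; the actual package globs may differ from what is shown here):

```yaml
packages:
  - backend
  - frontend
```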
The backend currently works along this main flow:
```text
GitHub Repo / PR Snapshot
  -> Analysis Engine
  -> RepoReport / PullRequestReport
  -> Pipeline / Feishu
  -> Agent Tools / Chat
```
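The trigger that kicks off this pipeline can be described, in TypeScript terms, roughly as the discriminated union below. This is a sketch inferred from the curl examples later in this README; the type name and the optionality of `branch` / `reason` are assumptions, not the backend's actual DTO.

```typescript
// Hypothetical request shape for POST /pipeline/run, inferred from the
// curl examples in this README. Field optionality is assumed.
type PipelineRunRequest =
  | {
      scope: "repo";
      owner: string;
      repo: string;
      branch?: string;
      reason?: string;
    }
  | {
      scope: "pull_request";
      owner: string;
      repo: string;
      prNumber: number;
      reason?: string;
    };

// An empty body is also valid: the backend then falls back to
// GITHUB_TRACKED_REPOSITORIES and scans every tracked repo.
const example: PipelineRunRequest = {
  scope: "pull_request",
  owner: "openai",
  repo: "openai-node",
  prNumber: 123,
  reason: "manual pr review",
};
```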
Core modules:
- `backend/src/github-sync/`: GitHub repo / PR snapshot fetching
- `backend/src/analysis/`: rule-based repo / PR analysis
- `backend/src/reports/`: `RepoReport` and `PullRequestReport`
- `backend/src/pipeline/`: orchestrates the repo / PR analysis flows
- `backend/src/agent/`: LangChain agent, tool registration, and the streaming chat endpoint
Install dependencies:

```shell
pnpm install
```

If you don't have a database locally, the simplest option is Docker:
```shell
docker run --name adam-postgres \
  -e POSTGRES_DB=adam_app \
  -e POSTGRES_USER=postgres \
  -e POSTGRES_PASSWORD=postgres \
  -p 5432:5432 \
  -d postgres:16
```

Copy the example environment file:

```shell
cp backend/.env.example backend/.env
```

The most important environment variables in this first version:
- `OPENAI_API_KEY`
- `OPENAI_BASE_URL` (optional; any OpenAI-API-compatible model service works)
- `MODEL_NAME`
- `DB_HOST`
- `DB_PORT`
- `DB_NAME`
- `DB_USER`
- `DB_PASSWORD`
- `GITHUB_TRACKED_REPOSITORIES`
- `MOCK_MODE`
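A possible `backend/.env` for local development is sketched below. All values are placeholders; in particular the `MODEL_NAME` value and the `GITHUB_TRACKED_REPOSITORIES` format (shown here as a comma-separated `owner/repo` list) are assumptions — check `backend/.env.example` for the authoritative format.

```env
OPENAI_API_KEY=sk-xxxx
# Optional; any OpenAI-API-compatible endpoint works
OPENAI_BASE_URL=https://api.openai.com/v1
MODEL_NAME=gpt-4o-mini
DB_HOST=localhost
DB_PORT=5432
DB_NAME=adam_app
DB_USER=postgres
DB_PASSWORD=postgres
# Format assumed; verify against backend/.env.example
GITHUB_TRACKED_REPOSITORIES=openai/openai-node
MOCK_MODE=true
```

The `DB_*` values above match the defaults of the Docker command in this README.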
Run migrations and start the backend:

```shell
pnpm --filter ./backend run db:up
pnpm dev:backend
```

To start the whole workspace:
```shell
pnpm dev
```

- `pnpm dev`: start frontend and backend together
- `pnpm dev:backend`: start the backend
- `pnpm dev:frontend`: start the frontend
- `pnpm build`: build the workspace
- `pnpm lint`: run ESLint
- `pnpm --filter ./backend test`: run backend unit tests
- `pnpm --filter ./backend exec jest --runInBand`: run all backend tests serially
- `pnpm --filter ./backend run db:generate`: generate a MikroORM migration
- `pnpm --filter ./backend run db:up`: run migrations
If no body is sent, the backend reads `GITHUB_TRACKED_REPOSITORIES` and runs the repo pipeline for every tracked repository in a batch.
```shell
curl -X POST http://localhost:3001/pipeline/run
```

Run the pipeline for a single repository:

```shell
curl -X POST http://localhost:3001/pipeline/run \
  -H "Content-Type: application/json" \
  -d '{
    "scope": "repo",
    "owner": "openai",
    "repo": "openai-node",
    "branch": "main",
    "reason": "manual repo scan"
  }'
```

Run the pipeline for a single pull request:

```shell
curl -X POST http://localhost:3001/pipeline/run \
  -H "Content-Type: application/json" \
  -d '{
    "scope": "pull_request",
    "owner": "openai",
    "repo": "openai-node",
    "prNumber": 123,
    "reason": "manual pr review"
  }'
```

Fetch reports:

```shell
curl "http://localhost:3001/reports/latest?scope=repo"
curl "http://localhost:3001/reports/latest?scope=pull_request"
curl "http://localhost:3001/reports/by-date?scope=pull_request&owner=openai&repo=openai-node&prNumber=123&date=2026-04-21"
```

The backend's chat entry point is:
`POST /chat/stream`

A minimal request body:
```json
{
  "messages": [
    {
      "id": "msg-1",
      "role": "user",
      "parts": [
        {
          "type": "text",
          "text": "Help me analyze the risks of PR #123 in openai/openai-node"
        }
      ]
    }
  ]
}
```

This is a streaming endpoint, designed to be consumed on the frontend with something like the Vercel AI SDK's `useChat()`.
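A minimal Node-side consumer of this endpoint can be sketched as follows. This is an illustration, not the project's actual frontend code: the plain-text chunk format is an assumption (the real wire format depends on the backend's streaming implementation), and only the request-body shape comes from this README.

```typescript
// Message shapes matching the minimal request body shown above.
interface TextPart {
  type: "text";
  text: string;
}

interface ChatMessage {
  id: string;
  role: "user" | "assistant";
  parts: TextPart[];
}

// Build the request body for POST /chat/stream.
function buildChatBody(text: string): { messages: ChatMessage[] } {
  return {
    messages: [{ id: "msg-1", role: "user", parts: [{ type: "text", text }] }],
  };
}

// Read the streamed response chunk by chunk. Not invoked here: it needs a
// running backend on localhost:3001, and it assumes text chunks.
async function streamChat(text: string): Promise<void> {
  const res = await fetch("http://localhost:3001/chat/stream", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatBody(text)),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    console.log(decoder.decode(value, { stream: true }));
  }
}
```

In a React frontend, `useChat()` from the Vercel AI SDK would replace the manual reader loop above.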
The backend's core database tables are:
- `repo_report`
- `pull_request_report`
- `mikro_orm_migrations`
- Switch `github-sync` from mock data to the real GitHub API
- Expand the repo / PR analysis rules
- Flesh out the frontend