What you get
- Docker container + PostgreSQL — runs on any cloud, any VM, or bare metal
- No AWS required — unless you choose Amazon Bedrock as your LLM provider
- You control the LLM — pick the provider that fits your needs
- Always free — no MergeWatch billing, you pay your LLM provider directly
LLM providers
You choose which LLM powers your reviews by setting the LLM_PROVIDER environment variable:
| Provider | LLM_PROVIDER | Description |
|---|---|---|
| Anthropic (default) | anthropic | Direct Anthropic API. Requires ANTHROPIC_API_KEY. |
| LiteLLM | litellm | Proxy to 100+ LLM providers (OpenAI, Azure, Google, etc.). |
| Amazon Bedrock | bedrock | AWS Bedrock. Requires AWS credentials. |
| Ollama | ollama | Run models locally. Fully air-gapped. |
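As a rough illustration of how a server might branch on LLM_PROVIDER, here is a hypothetical sketch. Only LLM_PROVIDER and ANTHROPIC_API_KEY appear in the docs above; the other variable names (LITELLM_BASE_URL, AWS_REGION, OLLAMA_BASE_URL) and default URLs are assumptions for the example, not MergeWatch's actual internals.

```typescript
// Hypothetical provider resolution keyed off LLM_PROVIDER.
// Variable names other than LLM_PROVIDER / ANTHROPIC_API_KEY are illustrative.
type ProviderConfig =
  | { kind: "anthropic"; apiKey: string }
  | { kind: "litellm"; baseUrl: string }
  | { kind: "bedrock"; region: string }
  | { kind: "ollama"; baseUrl: string };

export function resolveProvider(
  env: Record<string, string | undefined>
): ProviderConfig {
  switch (env.LLM_PROVIDER ?? "anthropic") {
    case "anthropic":
      // Direct Anthropic API requires an API key.
      if (!env.ANTHROPIC_API_KEY) throw new Error("ANTHROPIC_API_KEY is required");
      return { kind: "anthropic", apiKey: env.ANTHROPIC_API_KEY };
    case "litellm":
      // LiteLLM proxies to 100+ providers behind one endpoint.
      return { kind: "litellm", baseUrl: env.LITELLM_BASE_URL ?? "http://localhost:4000" };
    case "bedrock":
      // Bedrock picks up AWS credentials from the environment.
      return { kind: "bedrock", region: env.AWS_REGION ?? "us-east-1" };
    case "ollama":
      // Local models; no network egress needed.
      return { kind: "ollama", baseUrl: env.OLLAMA_BASE_URL ?? "http://localhost:11434" };
    default:
      throw new Error(`Unknown LLM_PROVIDER: ${env.LLM_PROVIDER}`);
  }
}
```

A tagged union like this keeps each provider's required settings explicit and fails fast on a missing key rather than at the first review.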
Architecture
The Docker image (ghcr.io/santthosh/mergewatch:latest) runs an Express server that:
- Receives GitHub webhooks on /webhook (port 3000)
- Validates the HMAC-SHA256 signature
- Fetches the PR diff from GitHub
- Runs five review agents in parallel against your chosen LLM
- Posts review comments and a Check Run back to the PR
- Stores the review record in PostgreSQL
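The signature-validation step above follows GitHub's standard webhook scheme: GitHub sends an X-Hub-Signature-256 header containing "sha256=" plus the hex HMAC-SHA256 of the raw request body, keyed by your webhook secret. A minimal sketch in Node.js (not MergeWatch's actual code):

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Returns true if signatureHeader matches the HMAC-SHA256 of rawBody
// under the shared webhook secret. The comparison is constant-time.
export function verifySignature(
  secret: string,
  rawBody: string,
  signatureHeader: string
): boolean {
  const expected =
    "sha256=" + createHmac("sha256", secret).update(rawBody).digest("hex");
  const a = Buffer.from(expected);
  const b = Buffer.from(signatureHeader);
  // timingSafeEqual throws on length mismatch, so guard first.
  return a.length === b.length && timingSafeEqual(a, b);
}
```

Note that the HMAC must be computed over the raw request bytes, so the webhook route needs the unparsed body (e.g. Express's raw body parser), not the re-serialized JSON.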
Self-hosted vs Managed SaaS
| | Self-Hosted | Managed SaaS |
|---|---|---|
| Who runs it | You | MergeWatch |
| Infrastructure | Docker + Postgres | Lambda + DynamoDB + Bedrock |
| Data isolation | Complete — your servers only | Diff transits MergeWatch infra |
| LLM choice | Anthropic, LiteLLM, Bedrock, Ollama | Bedrock (Claude Sonnet) |
| Cost | Free + your LLM bill | First 20 PRs free, then usage-based |
| Setup | ~10 min | ~2 min |
