
Inbox Zero AI MCP

Created by elie222, 9 months ago
An MCP that helps you manage your email. For example, ask it which emails need replies or follow-ups. It offers functionality beyond the basic Gmail feature set.

[![](apps/web/app/opengraph-image.png)](https://www.getinboxzero.com)

Inbox Zero - Your AI Email Assistant
====================================

Open source email app to reach inbox zero fast.

[Website](https://www.getinboxzero.com/)
·
[Discord](https://www.getinboxzero.com/discord)
·
[Issues](https://github.com/elie222/inbox-zero/issues)

About

There are two parts to Inbox Zero:

  1. An AI email assistant that helps you spend less time on email.
  2. Open source AI email client.

If you're looking to contribute to the project, the email client is the best place to do so.

[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Felie222%2Finbox-zero&env=NEXTAUTH_SECRET,GOOGLE_CLIENT_ID,GOOGLE_CLIENT_SECRET,GOOGLE_ENCRYPT_SECRET,GOOGLE_ENCRYPT_SALT,UPSTASH_REDIS_URL,UPSTASH_REDIS_TOKEN,GOOGLE_PUBSUB_TOPIC_NAME,DATABASE_URL)

Thanks to Vercel for sponsoring Inbox Zero in support of open-source software.

Features

  • **AI Personal Assistant:** Manages your email for you based on a plain text prompt file. It can take any action a human assistant can take on your behalf (Draft reply, Label, Archive, Reply, Forward, Mark Spam, and even call a webhook).
  • **Reply Zero:** Track emails that need your reply and those awaiting responses.
  • **Smart Categories:** Categorize everyone that's ever emailed you.
  • **Bulk Unsubscriber:** Quickly unsubscribe from emails you never read with one click.
  • **Cold Email Blocker:** Automatically block cold emails.
  • **Email Analytics:** Track your email activity with daily, weekly, and monthly stats.

Learn more in our [docs](https://docs.getinboxzero.com).

Feature Screenshots

![AI Assistant](.github/screenshots/email-assistant.png)![Reply Zero](.github/screenshots/reply-zero.png)
_AI Assistant_ · _Reply Zero_
![Gmail Client](.github/screenshots/email-client.png)![Bulk Unsubscriber](.github/screenshots/bulk-unsubscriber.png)
_Gmail client_ · _Bulk Unsubscriber_

Demo Video

[![Inbox Zero demo](/video-thumbnail.png)](http://www.youtube.com/watch?v=hfvKvTHBjG0)

Built with

Feature Requests

To request a feature open a [GitHub issue](https://github.com/elie222/inbox-zero/issues). If you don't have a GitHub account you can request features [here](https://www.getinboxzero.com/feature-requests). Or join our [Discord](https://www.getinboxzero.com/discord).

Getting Started for Developers

We offer a hosted version of Inbox Zero at [https://getinboxzero.com](https://getinboxzero.com). To self-host, follow the steps below.

Contributing to the project

You can view open tasks in our [GitHub Issues](https://github.com/elie222/inbox-zero/issues). Join our [Discord](https://www.getinboxzero.com/discord) to discuss tasks and check what's being worked on.

[ARCHITECTURE.md](./ARCHITECTURE.md) explains the architecture of the project (LLM generated).

Requirements

Setup

[Here's a video](https://youtu.be/hVQENQ4WT2Y) on how to set up the project. It covers the same steps mentioned in this document, but goes into greater detail on setting up the external services.

The external services that are required are:

You also need to set an LLM, but you can use a local one too:

To enable Bulk Unsubscriber, Analytics and Smart Categories you will also need to set:

We use Postgres for the database. For Redis, you can use [Upstash Redis](https://upstash.com/) or set up your own Redis instance.

You can run Postgres & Redis locally using `docker-compose`:

```bash
docker-compose up -d # -d will run the services in the background
```
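The repo ships its own compose file; as a rough sketch (service names, ports, and credentials below are illustrative placeholders, not the repo's actual values), a minimal Postgres + Redis setup looks like:

```yaml
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: password   # placeholder credentials only
      POSTGRES_DB: inboxzero
    ports:
      - "5432:5432"
  redis:
    image: redis:7
    ports:
      - "6379:6379"
```

Check the repo's actual `docker-compose.yaml` for the real service definitions before relying on these values.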

Create your own `.env` file:

```bash
cp apps/web/.env.example apps/web/.env
cd apps/web
pnpm install
```

Set the environment variables in the newly created `.env`. You can see a list of required variables in: `apps/web/env.ts`.

The required environment variables:

  • `NEXTAUTH_SECRET` -- can be any random string (try using `openssl rand -hex 32` for a quick secure random string)
  • `GOOGLE_CLIENT_ID` -- Google OAuth client ID. More info [here](https://next-auth.js.org/providers/google)
  • `GOOGLE_CLIENT_SECRET` -- Google OAuth client secret. More info [here](https://next-auth.js.org/providers/google)
  • `GOOGLE_ENCRYPT_SECRET` -- Secret key for encrypting OAuth tokens (try using `openssl rand -hex 32` for a secure key)
  • `GOOGLE_ENCRYPT_SALT` -- Salt for encrypting OAuth tokens (try using `openssl rand -hex 16` for a secure salt)
  • `UPSTASH_REDIS_URL` -- Redis URL from Upstash. (can be empty if you are using Docker Compose)
  • `UPSTASH_REDIS_TOKEN` -- Redis token from Upstash. (or specify your own random string if you are using Docker Compose)
  • `TINYBIRD_TOKEN` -- (optional) Admin token for your Tinybird workspace (be sure to create the instance in the GCP `us-east4` region; this can be changed via your `.env` if you prefer a different region). You can also disable Tinybird entirely, in which case the analytics and bulk unsubscribe features will be disabled; set `NEXT_PUBLIC_DISABLE_TINYBIRD=true` to do so.
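Putting the required variables together, a filled-in `.env` might look like this (every value below is a placeholder, not a working credential):

```plaintext
NEXTAUTH_SECRET=replace-with-output-of-openssl-rand-hex-32
GOOGLE_CLIENT_ID=1234567890-example.apps.googleusercontent.com
GOOGLE_CLIENT_SECRET=replace-with-oauth-client-secret
GOOGLE_ENCRYPT_SECRET=replace-with-output-of-openssl-rand-hex-32
GOOGLE_ENCRYPT_SALT=replace-with-output-of-openssl-rand-hex-16
UPSTASH_REDIS_URL=https://example-redis.upstash.io
UPSTASH_REDIS_TOKEN=replace-with-upstash-token
```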

When using Vercel with Fluid Compute turned off, you should set `MAX_DURATION=300` or lower. See Vercel limits for different plans [here](https://vercel.com/docs/functions/configuring-functions/duration#duration-limits).

To run the migrations:

```bash
pnpm prisma migrate dev
```

To run the app locally:

```bash
pnpm run dev
```

Or from the project root:

```bash
turbo dev
```

Open [http://localhost:3000](http://localhost:3000) to view it in your browser. To upgrade yourself to admin visit: [http://localhost:3000/admin](http://localhost:3000/admin).

Supported LLMs

For the LLM, you can use Anthropic, OpenAI, or Anthropic on AWS Bedrock. You can also use Ollama by setting the following environment variables:

```sh
OLLAMA_BASE_URL=http://localhost:11434/api
NEXT_PUBLIC_OLLAMA_MODEL=phi3
```

Note: If Ollama is hosted locally but the application is running in Docker, use `http://host.docker.internal:11434/api` as the base URL. You might also need to set `OLLAMA_HOST` to `0.0.0.0` in the Ollama configuration file.

You can select the model you wish to use on the app's `/settings` page.

Setting up Google OAuth and Gmail API

You need to enable these scopes in the Google Cloud Console:

```plaintext
https://www.googleapis.com/auth/userinfo.profile
https://www.googleapis.com/auth/userinfo.email
https://www.googleapis.com/auth/gmail.modify
https://www.googleapis.com/auth/gmail.settings.basic
https://www.googleapis.com/auth/contacts
```

Setting up Tinybird

Follow the instructions [here](./packages/tinybird/README.md) to set up the `pipes` and `datasources`.

Optional: If you want to store AI usage stats in Tinybird too, then do the same in `/packages/tinybird-ai-analytics`.

Set up push notifications via Google PubSub to handle emails in real time

Follow instructions [here](https://developers.google.com/gmail/api/guides/push).

  1. [Create a topic](https://developers.google.com/gmail/api/guides/push#create_a_topic)
  2. [Create a subscription](https://developers.google.com/gmail/api/guides/push#create_a_subscription)
  3. [Grant publish rights on your topic](https://developers.google.com/gmail/api/guides/push#grant_publish_rights_on_your_topic)

Set the env var `GOOGLE_PUBSUB_TOPIC_NAME`. When creating the subscription, select Push; the URL should look something like `https://www.getinboxzero.com/api/google/webhook?token=TOKEN` or `https://abc.ngrok-free.app/api/google/webhook?token=TOKEN`, where the domain is your domain. Set `GOOGLE_PUBSUB_VERIFICATION_TOKEN` in your `.env` file to the value of `TOKEN`.
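The verification token can be any random string; a minimal sketch of generating one and wiring it up (the domain and the `.env` location from the setup steps above are assumptions):

```shell
# Generate a random hex token for webhook verification (48 hex characters)
TOKEN=$(openssl rand -hex 24)

# Add this line to apps/web/.env:
echo "GOOGLE_PUBSUB_VERIFICATION_TOKEN=$TOKEN"

# Use the same value in the subscription's push endpoint URL:
echo "https://YOUR_DOMAIN/api/google/webhook?token=$TOKEN"
```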

When running in development, ngrok can be helpful:

```sh
ngrok http 3000
```

Or, with an ngrok domain to keep your endpoint stable (set `XYZ`):

```sh
ngrok http --domain=XYZ.ngrok-free.app 3000
```

And then update the webhook endpoint in the [Google PubSub subscriptions dashboard](https://console.cloud.google.com/cloudpubsub/subscription/list).

To start watching emails visit: `/api/google/watch/all`

Watching for email updates

Set a cron job to run the following endpoints. The Google watch one is necessary; the Resend one is optional.

```json
"crons": [
  { "path": "/api/google/watch/all", "schedule": "0 1 * * *" },
  { "path": "/api/resend/summary/all", "schedule": "0 16 * * 1" }
]
```

[Here](https://vercel.com/guides/how-to-setup-cron-jobs-on-vercel#alternative-cron-providers) are some easy ways to run cron jobs. Upstash is a free, easy option. I could never get the Vercel `vercel.json` crons working. Open to PRs if you find a fix for that.
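If you run the schedules outside Vercel, the equivalent standard crontab entries would look something like this (the domain is a placeholder, and a plain GET to each endpoint is assumed to be sufficient):

```plaintext
# Re-register Gmail watches daily at 01:00 UTC
0 1 * * *  curl -s https://YOUR_DOMAIN/api/google/watch/all
# Send Resend summary emails on Mondays at 16:00 UTC (optional)
0 16 * * 1 curl -s https://YOUR_DOMAIN/api/resend/summary/all
```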
