This fork of the original Graphiti MCP Server is tailored for containerized deployments via Coolify, Docker Compose, or similar platforms.
Graphiti is a framework for building and querying temporally-aware knowledge graphs, specifically tailored for AI agents operating in dynamic environments. Unlike traditional retrieval-augmented generation (RAG) methods, Graphiti continuously integrates user interactions, structured and unstructured enterprise data, and external information into a coherent, queryable graph.
This fork streamlines Graphiti's MCP server into a Docker-first setup optimized for deployment orchestration, including Neo4j integration and SSE-based MCP transport.
- Dockerfile tailored for `uv`-based Python builds
- Neo4j + Graphiti containerized setup via Docker Compose
- Preconfigured environment variable support with `.env`
- Out-of-the-box compatibility with Coolify
- Healthcheck-based service orchestration
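The healthcheck-based orchestration can be sketched as a compose fragment along these lines; the service names, image tag, and healthcheck command are illustrative assumptions, not the exact contents of this repo's `docker-compose.yml`:

```yaml
# Illustrative sketch only -- service names and probe command are assumptions.
services:
  neo4j:
    image: neo4j:5.26
    healthcheck:
      # Probe Neo4j's HTTP port; compose marks the service healthy once it responds
      test: ["CMD", "wget", "-qO-", "http://localhost:7474"]
      interval: 10s
      timeout: 5s
      retries: 10
  graphiti-mcp:
    build: .
    depends_on:
      neo4j:
        condition: service_healthy   # start the MCP server only after Neo4j is up
```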
Prerequisites:

- Docker + Docker Compose
- OpenAI API key
```bash
git clone https://github.com/tn-py/graphiti-mcpserver-coolify.git
cd graphiti-mcpserver-coolify
cp .env.example .env
```

Set your OpenAI key and model preferences inside `.env`:

```bash
OPENAI_API_KEY=your_openai_key
MODEL_NAME=gpt-4.1-mini
```

Then build and start the stack:

```bash
docker compose up --build
```

This will:
- Launch Neo4j (v5.26) with dev memory presets
- Launch Graphiti MCP server
- Bind Graphiti to container port `8000` (mapped to host port `3010`)
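The 8000-to-3010 mapping above corresponds to a compose port entry along these lines (the service name is an assumption for illustration):

```yaml
services:
  graphiti-mcp:          # assumed service name
    ports:
      - "3010:8000"      # host port 3010 -> container port 8000
```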
Access the MCP SSE endpoint at `http://localhost:3010/sse`.
Project layout:

```
.
├── Dockerfile               # Graphiti MCP build
├── docker-compose.yml       # Multi-service stack (Neo4j + Graphiti)
├── .env.example             # Env config template
├── graphiti_mcp_server.py   # Entry point
└── pyproject.toml           # Python deps
```
To deploy on Coolify:
- Connect your GitHub repo containing this fork
- Ensure `docker-compose.yml` is in the repository root
- Set your environment variables via the Coolify UI or `.env`
- Deploy the app
Coolify will:
- Build the Graphiti service using the `Dockerfile`
- Launch Neo4j in a linked container
- Expose Graphiti on a public port (e.g., `3010`)
Configure your MCP-compatible client like Cursor or Claude to point to:
```json
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "sse",
      "url": "http://localhost:3010/sse"
    }
  }
}
```

All upstream functionality remains intact:

- `add_episode`
- `search_nodes`
- `search_facts`
- `get_episodes`
- `delete_episode`
- `clear_graph`
- `get_status`
Requirements:

- Docker Engine
- Python 3.10+ (for local development)
- Neo4j 5.26+
- OpenAI API key
This fork retains the original license from the Graphiti project.
Forked from the excellent work by the team at Zep.