-
Intro (who/what)
- Intro: This video shows how to add a knowledge graph to the RAG template in N8N using Graphiti MCP, Neo4j, and OpenAI, enabling relational querying and graph-based memory.
-
Key idea and value proposition
- Core idea: Combine a vector DB with a knowledge graph so the agent can traverse relationships between entities, not just retrieve individual chunks.
- Value: Improves relational reasoning and navigation (e.g., company → executives) while keeping fast, chunk-based retrieval for simple queries.
-
Core components and terms
- RAG pipeline with chunking into bite-sized pieces stored in a vector DB (e.g., Postgres PGVector, Qdrant, Pinecone)
- Knowledge graph to store entities/relationships
- Graphiti MCP (MCP server) to manage graph operations
- Neo4j as the graph database
- N8N (self-hosted) as the workflow orchestrator
- OpenAI as the LLM provider for entity/relationship extraction
- MCP server and MCP nodes in N8N
-
Two MCP nodes and their roles
- Add memory: inserts a document's extracted entities/relationships into the knowledge graph
- Search memory: queries the knowledge graph (used by the agent for relational questions)
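As a rough sketch of what these two tools exchange, an "add memory" call hands over raw episode text for extraction, while a "search memory" call passes a natural-language relational query. The field names below are assumptions based on the video, not the Graphiti MCP server's exact schema — check the server's tool definitions for the real parameters:

```shell
# Hypothetical payloads for the two MCP tools (field names are assumptions).

# "Add memory": pass raw document text; Graphiti extracts entities and
# relationships via the LLM and writes them into Neo4j.
cat > add_memory.json <<'EOF'
{
  "name": "Acme quarterly report",
  "episode_body": "Dr. Tanaka was appointed CEO of Acme in 2023 ..."
}
EOF

# "Search memory": ask a relational question; the server traverses the graph.
cat > search_memory.json <<'EOF'
{
  "query": "How are Dr. Tanaka and Dr. Chen connected?"
}
EOF
```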
-
How it fits the workflow (data flow; when to query graph vs. vector DB)
- Data flow: source data (e.g., Google Drive) → chunked → stored in vector DB; simultaneously, Graphiti MCP extracts entities/relationships to populate the knowledge graph
- When to query:
- Use vector DB for single-entity or general overview queries
- Use knowledge graph for relational queries (e.g., how two companies work together, who is an executive, etc.)
- Agent prompts can specify when to prefer graph vs. vector searches
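For instance, the routing rule in the agent's system prompt might read something like this (hypothetical wording, not the template's exact prompt):

```
You have two retrieval tools.
- Use the vector store for overviews or questions about a single entity.
- Use the knowledge graph (search memory) for questions about relationships
  between entities, e.g. "How do company A and company B work together?"
  or "Who reports to the CEO?".
```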
-
Quick setup overview (prereqs and high-level steps)
- Prereqs:
- Self-hosted N8N
- Docker setup for Graffiti MCP server + Neo4j
- OpenAI API key (and configurable model)
- Access to host.docker.internal networking (so the N8N container can reach the MCP server on the host)
- Consider firewall rules (secure, not open to everyone)
- High-level steps:
- Clone the Graphiti MCP repo and set up environment variables (OpenAI API key, Neo4j password, host URLs)
- Run docker-compose up -d for Graphiti MCP + Neo4j
- Verify containers with docker ps and docker logs; adjust port mappings if needed (e.g., 8000/8030)
- Modify the N8N docker-compose file to expose host.docker.internal; add the firewall rules needed to permit N8N → Graphiti MCP
- Install the N8N community MCP node and configure an MCP client to talk to Graphiti MCP (host.docker.internal + port)
- Add two MCP tools in N8N: add memory and search memory
- Test by inserting a document into the knowledge graph and performing a relational query
- Note: the setup relies on Graphiti for NLP-driven entity/relationship extraction and on MCP to expose the graph tools to N8N
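The high-level steps above roughly translate to the following shell session. The repo URL, directory layout, variable names, and model are illustrative assumptions — check the Graphiti MCP README for the exact values:

```shell
# Clone the Graphiti MCP server and configure it (URL and paths are assumptions).
git clone https://github.com/getzep/graphiti.git
cd graphiti/mcp_server

# Environment: OpenAI key, model choice, Neo4j credentials.
cat > .env <<'EOF'
OPENAI_API_KEY=sk-...
MODEL_NAME=gpt-4o-mini
NEO4J_USER=neo4j
NEO4J_PASSWORD=change-me
EOF

# Start Graphiti MCP + Neo4j in the background.
docker compose up -d

# Verify both containers are running; adjust the port mapping if 8000 is taken.
docker ps
docker logs <graphiti-mcp-container>
```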
-
Demo highlights or examples mentioned
- Vector DB demo: getting an overview of a mock company from the vector store
- Knowledge graph demo: querying Dr. Tanaka and Dr. Chen via Graphiti MCP (relational search)
- End-to-end test: uploading a Google Drive file, adding it to the knowledge graph, and running the LLM-based extraction to populate entities/relationships
- Note that graph operations can be slower and costlier than pure vector queries because of the relational processing and LLM calls involved
-
Pros, cons, and guidance on when to use knowledge graphs vs. a pure vector DB
- Pros:
- Enables relational queries and navigation between entities
- More powerful for complex, relationship-based reasoning
- Can search paths and edges between nodes (e.g., org charts, partnerships)
- Cons:
- Slower and more expensive due to LLM-driven extraction and graph workloads
- More setup complexity and maintenance (MCP, Neo4j, security)
- Guidance:
- Use knowledge graphs when data is highly relational and you need traversals or graph-based reasoning
- Use a pure vector DB for large-scale, fast similarity search without strong relational needs
- For mixed workloads, run a hybrid: graph for relational queries, vector DB for fast chunk retrieval
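For the hybrid case, the routing in front of the two stores can be as simple as a keyword gate — a toy sketch (the keyword list is an assumption; in the N8N setup the agent's LLM makes this decision from its prompt):

```shell
# Toy router: send relationship-flavored questions to the knowledge graph,
# everything else to the vector store.
route() {
  case "$1" in
    *between*|*"work together"*|*"who is"*|*relationship*|*"reports to"*)
      echo "graph" ;;
    *)
      echo "vector" ;;
  esac
}

route "How do Acme and Globex work together?"   # graph
route "Give me an overview of Acme"             # vector
```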
-
Takeaways and potential next steps
- Takeaway: Knowledge graphs add relational power to RAG in N8N with Graphiti MCP, but require careful setup and awareness of latency/cost
- Next steps you might explore:
- Deeper comparisons between graph-based vs. vector-based retrieval for your data
- More advanced graph tools (e.g., edge navigation, complex queries) in MCP
- Extensions to other data sources and LLM providers
- Further content on integrating AI agents, RAG, and knowledge graphs