ConversationKGMemory

1. What is ConversationKGMemory?

ConversationKGMemory builds a knowledge graph (KG) from a conversation.

Instead of storing plain facts, it stores relationships in the form:

(subject) — (relation) — (object)

Example:

(John) — lives_in — (Toronto)
(John) — works_at — (Google)

2. Why does it exist?

Entity memory stores flat summaries:

“John lives in Toronto and works at Google.”

That becomes hard to reason over.

KG memory solves:

  • Multi-hop reasoning

  • Relationship queries

  • Structured long-term memory
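Multi-hop reasoning is the headline benefit: answering questions that require chaining two or more triples together. A minimal plain-Python sketch of the idea (illustrative only, not LangChain's actual query path):

```python
# Two triples that individually don't answer the question...
triples = {
    ("John", "works_at", "Google"),
    ("Google", "headquartered_in", "California"),
}

def one_hop(subject, relation):
    """Return the object of the first matching triple, if any."""
    for s, r, o in triples:
        if s == subject and r == relation:
            return o
    return None

# "Where is John's employer headquartered?" takes two hops:
employer = one_hop("John", "works_at")          # -> "Google"
answer = one_hop(employer, "headquartered_in")  # -> "California"
print(answer)
```

A flat summary sentence offers no such path to follow; the graph makes each hop an explicit edge lookup.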

In short:

Remember facts as relationships, not sentences.


3. Real-world analogy

Think of:

  • Entity memory → notebook paragraphs

  • KG memory → a graph database (like Neo4j)

KG memory allows asking:

  • “Where does John live?”

  • “Who works at Google?”

  • “Which city does John live in?”
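Those questions map to simple graph lookups: forward along an edge, or backward against it. A sketch over plain tuples (illustrative, not LangChain's API; "Mary" is a hypothetical extra person added to make the reverse query interesting):

```python
triples = [
    ("John", "lives_in", "Toronto"),
    ("John", "works_at", "Google"),
    ("Mary", "works_at", "Google"),
]

# "Where does John live?" -- follow a lives_in edge forward.
home = [o for s, r, o in triples if s == "John" and r == "lives_in"]

# "Who works at Google?" -- follow works_at edges backward.
googlers = [s for s, r, o in triples if r == "works_at" and o == "Google"]

print(home)      # ['Toronto']
print(googlers)  # ['John', 'Mary']
```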


4. Minimal working example (Gemini)

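The snippet below is a minimal sketch, assuming the `langchain-google-genai` package is installed and a `GOOGLE_API_KEY` environment variable is set; the model name is illustrative:

```python
from langchain.chains import ConversationChain
from langchain.memory import ConversationKGMemory
from langchain_google_genai import ChatGoogleGenerativeAI

# Gemini both extracts the triples and answers; requires GOOGLE_API_KEY.
llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash", temperature=0)

memory = ConversationKGMemory(llm=llm)
chain = ConversationChain(llm=llm, memory=memory)

chain.predict(input="John lives in Toronto and works at Google.")
print(chain.predict(input="Where does John live?"))
```

Note that the same model is used twice per turn, once to extract triples and once to reply, so a fast, cheap model is a reasonable choice here.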

5. What does it store internally?

You can inspect the graph through the memory's `kg` attribute, which holds the underlying `NetworkxEntityGraph`; its `get_triples()` method returns every stored triple.

This is a true graph structure, not free text.


6. How does it work internally?

Each conversation turn:

  1. LLM extracts entities

  2. LLM infers relationships

  3. Triples are added to the graph

  4. Relevant triples are injected into the next prompt
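The four steps above can be sketched end to end. Here the LLM extraction step (steps 1–2) is stubbed out with a hard-coded function; the real `ConversationKGMemory` prompts the model for this:

```python
def extract_triples(utterance):
    """Stub for the LLM extraction step (steps 1-2)."""
    # A real implementation asks the model; this is hard-coded.
    if "John lives in Toronto" in utterance:
        return [("John", "lives_in", "Toronto")]
    return []

graph = set()

def save_turn(utterance):
    """Step 3: add extracted triples to the graph."""
    graph.update(extract_triples(utterance))

def build_context(question):
    """Step 4: inject triples about entities named in the question."""
    relevant = [t for t in graph if t[0] in question or t[2] in question]
    facts = "; ".join(f"{s} {r} {o}" for s, r, o in relevant)
    return f"Known facts: {facts}\nQuestion: {question}"

save_turn("John lives in Toronto.")
print(build_context("Where does John live?"))
```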

So the LLM:

  • Builds the graph

  • Queries the graph

  • Reasons over relationships


7. Key characteristics

| Feature             | KG Memory                |
| ------------------- | ------------------------ |
| Stores              | Subject-predicate-object |
| Structure           | Graph                    |
| Multi-hop reasoning | Supported                |
| Token usage         | Low                      |
| Conversation flow   | Not preserved            |


8. Entity Memory vs KG Memory

| Feature           | Entity     | KG      |
| ----------------- | ---------- | ------- |
| Data type         | Text facts | Triples |
| Reasoning         | Weak       | Strong  |
| Structure         | Flat       | Graph   |
| Query flexibility | Medium     | High    |



9. Common mistakes

❌ Expecting it to remember chat wording

❌ Using it for short conversations

❌ Using it without structured questions

KG memory shines when relationships matter.


10. When should you use it?

Use ConversationKGMemory when:

  • Relationships are important

  • You need long-term structured memory

  • You want reasoning across facts

Avoid when:

  • Chat nuance matters

  • You need verbatim recall

  • Conversations are short


11. One-line mental model

ConversationKGMemory = conversation → knowledge graph
