LLM With Knowledge Graphs
Interpreted by:
John Tan Chong Min
Knowledge Graphs and LLMs
Retrieval Augmented Generation (RAG)
• <Context 1>
• <Context 2>
• <Context 3>
• <Query>
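A minimal sketch of how such a RAG prompt can be assembled (the build_rag_prompt helper and the example contexts are mine for illustration; in practice the contexts come from a retriever such as a vector-store search):

```python
# Minimal RAG prompt assembly: retrieved chunks become <Context i> blocks,
# followed by the <Query>. Contexts here are hard-coded for illustration.
def build_rag_prompt(query: str, contexts: list[str]) -> str:
    """Format retrieved chunks as <Context i> blocks followed by the <Query>."""
    blocks = "\n\n".join(
        f"<Context {i}>\n{chunk}" for i, chunk in enumerate(contexts, start=1)
    )
    return f"{blocks}\n\n<Query>\n{query}"

contexts = [
    "Nevada is a state in the US.",
    "Nevada is the number 1 producer of gold in the US.",
    "The state capital of Nevada is Carson City.",
]
print(build_rag_prompt("Which US state produces the most gold?", contexts))
```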
Problems with LLMs
• Base LLM may not be able to recall all knowledge it is trained on
https://www.youtube.com/watch?v=8NRvhGm9EOg
Knowledge Graphs and Causal Relations
• The relation between Source and Destination can be a causal link
Example: Tired --(Drink Coffee)--> Awake
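One way to store such a causal link is as a labelled directed edge; a small sketch using networkx (my choice of library, not specified in the slides):

```python
import networkx as nx

# Causal link from the example: drinking coffee moves you from Tired to Awake.
G = nx.DiGraph()
G.add_edge("Tired", "Awake", action="Drink Coffee")

# The edge attribute records the action/cause behind the state transition.
print(G.get_edge_data("Tired", "Awake"))  # {'action': 'Drink Coffee'}
```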
Can Knowledge Graph be viewed as a tool/memory?
• Perhaps a more efficient way to get data than querying large corpuses of text
• Extracting relevant parts of a Knowledge Graph can serve as a way to retrieve context
[Diagram: LLM Agent --(Get Information)--> Knowledge Graph Database; the Knowledge Graph Database in turn informs the LLM Agent]
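A rough sketch of the KG-as-tool idea: the agent calls a lookup function over the graph database, and the returned facts go into its prompt as context (a networkx graph stands in for the database; get_information is a hypothetical tool name):

```python
import networkx as nx

# A tiny knowledge graph standing in for the graph database (external memory).
kg = nx.DiGraph()
kg.add_edge("Nevada", "US", relation="is in")
kg.add_edge("Nevada", "gold", relation="is the number 1 producer of")

def get_information(entity: str) -> list[str]:
    """Tool the LLM agent can call: return all stored facts about an entity."""
    return [
        f"{entity} {data['relation']} {obj}"
        for _, obj, data in kg.out_edges(entity, data=True)
    ]

# The returned facts would be inserted into the agent's prompt as context,
# usually cheaper than searching a large corpus of raw text.
print(get_information("Nevada"))
```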
Knowledge Graphs and LLMs: 3 Approaches
Approach 1
KG-Augmented LLMs
Approach 1: KG-augmented LLMs
• KG as Text: pass the Knowledge Graph into the prompt as text, as in Generative Agents: Interactive Simulacra of Human Behavior (Park et al., 2023)
The subject is the entity being described, the predicate is the property of the subject being described, and the object is the value of the property.
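A small sketch of the "KG as text" idea: serialise (subject, predicate, object) triples into plain text and place them in the prompt. The exact serialisation used by Park et al. (2023) may differ; this is illustrative only.

```python
# Serialise (subject, predicate, object) triples into plain text so the
# knowledge graph can be passed to the LLM inside the prompt.
triples = [
    ("Nevada", "is a", "state"),
    ("Nevada", "is in", "US"),
    ("Nevada", "is the number 1 producer of", "gold"),
]

kg_as_text = "\n".join(f"({s}, {p}, {o})" for s, p, o in triples)
prompt = f"Known facts:\n{kg_as_text}\n\nQuestion: Which US state produces the most gold?"
print(prompt)
```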
LLM zero-shot/few-shot prompting for KG generation
EXAMPLE
It's a state in the US. It's also the number 1 producer of gold in the US.
Output: (Nevada, is a, state), (Nevada, is in, US), (Nevada, is the number 1 producer of, gold)
END OF EXAMPLE
EXAMPLE
I'm going to the store.
Output: NONE
END OF EXAMPLE
EXAMPLE
Oh huh. I know Descartes likes to drive antique scooters and play the mandolin.
Output: (Descartes, likes to drive, antique scooters),(Descartes, plays, mandolin)
END OF EXAMPLE
EXAMPLE
{text}
Output:
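A sketch of how the few-shot template above can be used in practice: substitute new text for {text}, call the LLM, and parse the reply back into triples. The call_llm stand-in and the regex parser are my additions, not part of the original prompt.

```python
import re

# Fill the few-shot template with new text and parse the LLM's reply back
# into (subject, predicate, object) tuples. call_llm is a hypothetical
# stand-in for any chat/completion API call.
FEW_SHOT_TEMPLATE = """EXAMPLE
It's a state in the US. It's also the number 1 producer of gold in the US.
Output: (Nevada, is a, state), (Nevada, is in, US), (Nevada, is the number 1 producer of, gold)
END OF EXAMPLE

EXAMPLE
{text}
Output:"""

def parse_triples(output: str) -> list[tuple[str, ...]]:
    """Turn '(a, b, c), (d, e, f)' into [('a', 'b', 'c'), ('d', 'e', 'f')]."""
    if output.strip() == "NONE":
        return []
    matches = re.findall(r"\(([^)]+)\)", output)
    return [tuple(part.strip() for part in m.split(",", 2)) for m in matches]

prompt = FEW_SHOT_TEMPLATE.format(text="Descartes likes to drive antique scooters.")
# output = call_llm(prompt)                     # hypothetical LLM call
output = "(Descartes, likes to drive, antique scooters)"
print(parse_triples(output))
```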
Approach 2: LLMs as text encoders for Knowledge Graph Embeddings
• KG built based on embedding space
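A sketch of placing KG triples in an embedding space and retrieving them by similarity. sentence-transformers is my choice of encoder here purely for illustration; the same idea applies to LLM-derived embeddings.

```python
# Embed KG triples with a text encoder so the graph can be queried in
# embedding space rather than by exact string match.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

triples = [
    ("Nevada", "is a", "state"),
    ("Nevada", "is the number 1 producer of", "gold"),
    ("Descartes", "plays", "mandolin"),
]
triple_texts = [f"{s} {p} {o}" for s, p, o in triples]
triple_vecs = model.encode(triple_texts, convert_to_tensor=True)

# Retrieve the triple closest to a natural-language query.
query_vec = model.encode("Which state mines the most gold?", convert_to_tensor=True)
scores = util.cos_sim(query_vec, triple_vecs)[0]
best = int(scores.argmax())
print(triple_texts[best], float(scores[best]))
```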
Approach 3
LLMs and KG two-way interaction
Approach 3: Synergy between Knowledge Graphs and LLMs
Knowledge Graphs for Fact-Checking!
• We can perhaps use knowledge graphs to perform fact-checking
• In the work shown here this is done during pre-training, but we can (and should) also use KGs at inference time; a sketch follows below
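A minimal sketch of inference-time fact-checking: triples extracted from the LLM's answer (e.g. with the few-shot prompt above) are checked against a trusted knowledge graph before the answer is accepted. Exact-match lookup is used here for simplicity; a real system would need entity resolution and fuzzier matching.

```python
# Check generated triples against a trusted knowledge graph.
trusted_kg = {
    ("Nevada", "is a", "state"),
    ("Nevada", "is the number 1 producer of", "gold"),
}

def fact_check(generated_triples):
    """Split generated triples into KG-supported and unsupported lists."""
    supported = [t for t in generated_triples if t in trusted_kg]
    unsupported = [t for t in generated_triples if t not in trusted_kg]
    return supported, unsupported

supported, unsupported = fact_check([
    ("Nevada", "is a", "state"),
    ("Nevada", "is the number 1 producer of", "silver"),  # not in the KG
])
print("supported:", supported)
print("needs verification:", unsupported)
```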
https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/indexes/graph.py
https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/chains/graph_qa/base.py
Step 1: Generate Triplets from Context
Generated Knowledge Graph
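A sketch of Step 1 using the LangChain code linked above (GraphIndexCreator from indexes/graph.py). Import paths have shifted across LangChain versions, so treat this as indicative rather than exact:

```python
# Step 1: have the LLM extract knowledge triples from the context text.
from langchain.indexes import GraphIndexCreator
from langchain.llms import OpenAI  # newer versions: langchain_openai

llm = OpenAI(temperature=0)
index_creator = GraphIndexCreator(llm=llm)

text = "Nevada is a state in the US. It is the number 1 producer of gold in the US."
graph = index_creator.from_text(text)  # LLM extracts knowledge triples from the context
print(graph.get_triples())             # list of extracted triples
```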
Steps 2-4: Extract and use relevant KG for query
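Steps 2-4 correspond to GraphQAChain (chains/graph_qa/base.py linked above): extract entities from the query, pull the relevant triples out of the graph, and answer using only that extracted context. A sketch, reusing llm and graph from the previous block:

```python
# Steps 2-4: entity extraction -> relevant-triple lookup -> answer from context.
from langchain.chains import GraphQAChain

chain = GraphQAChain.from_llm(llm=llm, graph=graph, verbose=True)
print(chain.run("What is Nevada the number 1 producer of?"))
```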
Compare to just feeding in context directly
How I would do it
Strict JSON Framework
Step 1: Generate Knowledge Graph from context
Generated Graph
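A minimal stand-in for the Strict JSON idea (not the author's strictjson library itself): constrain the LLM to a fixed JSON schema, then parse and validate the reply so the generated graph is machine-readable by construction.

```python
import json

# Force the LLM to answer in a fixed JSON schema, then parse and validate it.
SYSTEM_PROMPT = """Extract a knowledge graph from the user's text.
Reply with JSON only, in exactly this format:
{"triplets": [{"subject": "...", "predicate": "...", "object": "..."}]}"""

def parse_kg(llm_reply: str) -> list[tuple[str, str, str]]:
    data = json.loads(llm_reply)  # raises if the model strayed from the schema
    return [(t["subject"], t["predicate"], t["object"]) for t in data["triplets"]]

# llm_reply = call_llm(SYSTEM_PROMPT, text)  # hypothetical LLM call
llm_reply = '{"triplets": [{"subject": "Nevada", "predicate": "is a", "object": "state"}]}'
print(parse_kg(llm_reply))
```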
Compare with LangChain
• Should we utilise the embedding space for the Knowledge Graph too, or just for the LLM?