Read: Emerging trends: a gentle introduction to RAG
- Project: My Research Project
- Task List: General Research
- Start Date:
- Deadline:
- Created By: Stuart Mathews
- Assigned Users: Stuart Mathews
This paper discusses how RAG is an antidote to the static, offline nature of LLMs: retrieval supplies dynamic, up-to-date context at query time, which also helps combat hallucination (over-reliance on training data alone).
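The retrieve-then-generate loop the paper introduces can be sketched roughly as below. The corpus, query, and token-overlap scoring function are illustrative assumptions for this note, not the paper's implementation (real systems use dense embedding retrievers):

```python
# Minimal retrieve-then-generate sketch of a RAG pipeline.
# Scoring, corpus, and prompt template are illustrative assumptions.

def score(query: str, doc: str) -> float:
    """Token-overlap (Jaccard) similarity: a stand-in for a dense retriever."""
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / len(q | d)

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k passages most similar to the query."""
    return sorted(corpus, key=lambda doc: score(query, doc), reverse=True)[:k]

def build_prompt(query: str, passages: list[str]) -> str:
    """Ground the model in retrieved text rather than its static training data."""
    context = "\n".join(f"- {p}" for p in passages)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "RAG augments a language model with retrieved documents.",
    "LLMs are trained offline on a fixed snapshot of data.",
    "Retrieval supplies fresh, dynamic context at query time.",
]
query = "Why does RAG help with stale training data?"
prompt = build_prompt(query, retrieve(query, corpus))
print(prompt)
```

The prompt would then be sent to the LLM, which answers from the retrieved context instead of relying solely on what it memorised during training.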
It is co-written by the same researcher (Walid S. Saba) who wrote:
- LLMs' Understanding of Natural Language Revealed
- Stochastic LLMs do not Understand Language: Towards Symbolic, Explainable and Ontologically Based LLMs
Reference:
Church, K.W. et al. (2024) ‘Emerging trends: a gentle introduction to RAG’, Natural Language Engineering, 30(4), pp. 870–881. Available at: https://doi.org/10.1017/S1351324924000044.