RecallM: An Architecture for Temporal Context Understanding and Question Answering

07/06/2023
by Brandon Kynoch et al.

The ideal long-term memory mechanism for Large Language Model (LLM) based chatbots would lay the foundation for continual learning and complex reasoning, and would allow sequential and temporal dependencies to be learnt. Creating this type of memory mechanism is an extremely challenging problem. In this paper we explore several methods of achieving the effect of long-term memory. We propose a new architecture focused on creating adaptable and updatable long-term memory for AGI systems. Through various experiments we demonstrate the benefits of the RecallM architecture, particularly the improved temporal understanding of knowledge that it provides.

