
I have a large document, and I may need to feed a large part of it to my LLM for insight generation. I know the text can be chunked into parts, and with the right prompt and the LangChain memory feature I can get what I want. But how long can my model remember past conversations?

Mohamed Amine

1 Answer


Current LLMs don't have memory. You have to provide all the information at the prompt. See this answer.

The LangChain memory feature allows the model to "remember" by storing interactions in a database. At every new interaction, the stored conversation is retrieved from the database and written into the new prompt. However, the model's context window limit still applies: the maximum amount of past-interaction information your model can handle is constrained by that context size, which is specific to each model.
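To make the mechanism concrete, here is a minimal sketch of the pattern described above, written in plain Python rather than with LangChain's own classes (whose APIs change between versions). It stores past turns, replays them into each new prompt, and drops the oldest turns once a context-window budget is exceeded. The class name, parameters, and the whitespace-based token approximation are illustrative assumptions, not LangChain's actual implementation.

```python
# Sketch of the "memory" pattern: store turns, replay them into each
# prompt, and trim oldest turns to stay within a context budget.
# NOTE: token counts are approximated by whitespace word counts here;
# a real system would use the model's tokenizer.

class ConversationMemory:
    def __init__(self, max_context_tokens=100):
        self.max_context_tokens = max_context_tokens
        self.turns = []  # list of (role, text) pairs, oldest first

    def add(self, role, text):
        self.turns.append((role, text))

    def build_prompt(self, new_user_message):
        # Always include the newest message; prepend as much history
        # as fits in the remaining budget, preferring recent turns.
        budget = self.max_context_tokens - len(new_user_message.split())
        kept = []
        for role, text in reversed(self.turns):
            cost = len(text.split())
            if cost > budget:
                break  # oldest turns beyond the budget are dropped
            kept.append(f"{role}: {text}")
            budget -= cost
        history = "\n".join(reversed(kept))
        return f"{history}\nuser: {new_user_message}".lstrip()

mem = ConversationMemory(max_context_tokens=10)
mem.add("user", "one two three four five")
mem.add("assistant", "six seven eight nine ten")
prompt = mem.build_prompt("hello world")
# The assistant turn fits in the budget; the older user turn is dropped.
print(prompt)
```

This is exactly why the answer's point about the context window matters: no matter how the history is stored, only what fits back into the prompt is "remembered" by the model.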

noe