
LLMs are generally understood to produce non-deterministic output. My question is whether there are LLMs out there that are capable of producing deterministic output for any given input, given fixed parameters (e.g., temperature).

I have heard that llama.cpp, if run on a CPU instead of a GPU, appears to generate deterministic outputs.

user599464
    When LLMs "generate output" they are sampling from a probability distribution. In principle, it should always be possible to get the same samples from a probability distribution. Read more here: https://community.openai.com/t/is-there-a-way-to-set-a-a-random-seed-for-responses-with-temperature-0/4164/3 – Sam Dec 06 '23 at 22:15

1 Answer


Any LLM that exists could easily be modified to be deterministic. At present, they sample the next word from a probability distribution. It is a trivial change to have them always pick the word deemed most likely (greedy decoding) instead of sampling from that distribution, as the sketch below illustrates.
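A minimal sketch of the difference, using made-up logits over a toy five-word vocabulary (the values and vocabulary size are purely illustrative):

```python
import numpy as np

# Hypothetical next-token logits over a tiny 5-word vocabulary.
logits = np.array([2.0, 1.0, 0.5, 0.1, -1.0])

def softmax(x):
    e = np.exp(x - x.max())  # subtract max for numerical stability
    return e / e.sum()

probs = softmax(logits)

# Greedy decoding: always pick the most likely token -- fully deterministic.
greedy_token = int(np.argmax(probs))

# Stochastic decoding: sample from the distribution -- varies run to run
# unless the random number generator is seeded.
rng = np.random.default_rng()
sampled_token = rng.choice(len(probs), p=probs)

print(greedy_token, sampled_token)
```

Run repeatedly, `greedy_token` never changes, while `sampled_token` does unless `default_rng` is given a fixed seed.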

This has nothing to do with running on a GPU vs CPU: the non-determinism is not a function of hardware but software.

In many LLMs, this tradeoff is governed by a temperature parameter (called temperature for historical reasons, by analogy with statistical mechanics) that controls how much randomness goes into the sampling: the logits are divided by the temperature before the softmax, so a low temperature concentrates probability mass on the likeliest words and a high temperature flattens the distribution.
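A sketch of how temperature typically enters the pipeline (again with made-up logits); the `seed` argument illustrates the point raised in the comments below, that seeding the sampler makes even high-temperature decoding reproducible:

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, seed=None):
    """Sample a token id from temperature-scaled logits.

    temperature -> 0 approaches greedy (argmax) decoding;
    temperature > 1 flattens the distribution (more randomness).
    A fixed seed makes sampling reproducible at any temperature.
    """
    scaled = np.asarray(logits) / max(temperature, 1e-8)
    e = np.exp(scaled - scaled.max())  # numerically stable softmax
    probs = e / e.sum()
    rng = np.random.default_rng(seed)
    return int(rng.choice(len(probs), p=probs))

logits = [2.0, 1.0, 0.5, 0.1, -1.0]  # illustrative values only
# Same seed + same inputs => same token every run, even at high temperature.
print(sample_next_token(logits, temperature=1.5, seed=42))
```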

chessprogrammer
    I think OP wants deterministic sampling behavior for any temperature setting (including high temperature). Setting a random seed will do the trick. – Sam Dec 07 '23 at 17:16
  • yes - any idea why (to my knowledge) no LLM offers a way to set a random seed to make it deterministic? IMO this would have plenty of useful applications. – user599464 Dec 08 '23 at 14:16