There is a common belief that if AI ever develops self-determination, it will only happen at the AGI level. But in reality you don't need to be an AGI to have a will. Kids are not AGI, and they have a will from the very beginning. I'm puzzled.
1 Answer
I think the misleading step is assuming that AGI must have its own will, since the very definition of AGI differs from one person to another. For instance, Sam Altman's definition of AGI is "the equivalent of a median human".
Quote:
"In a recent profile in the New Yorker, OpenAI CEO Sam Altman compared his vision of AGI — artificial general intelligence — to a "median human." He said: "For me, AGI…is the equivalent of a median human that you could hire as a co-worker." Sep 27, 2023
Moreover:
It's not the first time Altman has referred to a median human. In a 2022 podcast, Altman said this AI could "do anything that you'd be happy with a remote coworker doing just behind a computer, which includes learning how to go be a doctor, learning how to go be a very competent coder."
Along the same lines:
An August report from McKinsey adopted similar benchmarks, including a graph of technical capabilities where generative AI is expected to match the "median level of human performance" by the close of the decade.
Source: Business Insider
Although AGI is a disputed term, these definitions clearly refer to an adult's performance and proficiency: an average adult employee or professional, not a kid. The AGI label therefore rests on a system's potential value for replacing human labor, not on whether it develops desires of its own. Your definition may work better for ASI ("Artificial Super Intelligence"), but that is a whole different scale and story.

On "will", can we think of artificial will just as we say artificial empathy or care? Gradient descent "wants" to get somewhere.
– Jaume Oliver Lafont Dec 31 '23 at 06:49
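
To make the comment's metaphor concrete, here is a minimal sketch (not from the original post; the toy function f(x) = (x - 3)^2, the starting point, and the learning rate are all illustrative assumptions) of the sense in which gradient descent "wants" to get somewhere: every step moves it toward the minimum, yet no will is involved.

```python
# Illustrative sketch of the comment's metaphor: plain gradient descent
# on f(x) = (x - 3)^2 "wants" to reach x = 3, in the sense that each
# update step moves it closer to the minimum.

def f(x):
    return (x - 3) ** 2

def grad_f(x):
    # Derivative of f, used to decide the direction of each step.
    return 2 * (x - 3)

x = 0.0   # arbitrary starting point (assumed for the example)
lr = 0.1  # learning rate, chosen only for illustration

for step in range(50):
    x -= lr * grad_f(x)  # step downhill along the gradient

print(f"x = {x:.4f}, f(x) = {f(x):.6f}")  # x converges toward 3
```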