
There is a common belief that if AI ever starts to self-determine, that will happen only at the AGI level. But in reality you don't need to be an AGI to have will. Kids are not AGI, and they have had will from the beginning. I'm puzzled.

eugenio
  • What do you mean by "self-determine"? What is the relationship between will and self-determination? Saying that kids are not AGI is also very debatable. Some kids may be able to do more things than certain adults. So it seems that this question is not very clear and is also based on a possibly wrong assumption. – nbro Dec 31 '23 at 03:59
  • I feel intelligence has more to do with growth than with size. In a kid, realizing some relationship they did not know before shows intelligence as a smile and a spark in the eyes. A kid solving a problem through a path different from the one they were taught is a genuine sign of intelligence, no matter how tiny the problem. A smart kid surprises an average adult by learning on their own what the adult already knows. So what about AGI as Artificial Growing Intelligence?

    On "will", can we think of artificial will just as we say artificial empathy or care? Gradient descent "wants" to get somewhere.

    – Jaume Oliver Lafont Dec 31 '23 at 06:49
  • I am not sure this is a common belief amongst AI researchers and enthusiasts, as stated in the question. Could you please link a book, article or published statement that backs up your assertion, preferably not a single person's opinion piece on AI (if you only have individual, relatively unknown AI bloggers to quote on this belief, then I'd like to see more than one before I could accept that it is a "common" belief). You can use [edit] to add your sources. This would be useful, as someone answering could then explain the logic of the poster (plus any limitations). – Neil Slater Jan 01 '24 at 22:26
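
As a loose illustration of the "gradient descent 'wants' to get somewhere" metaphor in the comment above, here is a minimal Python sketch; the quadratic objective, starting point, and learning rate are arbitrary choices made only for this example. The update rule mechanically moves the parameter toward the minimum, which is the only sense in which the process "heads" anywhere.

    # Minimal gradient descent on f(x) = (x - 3)^2, whose minimum is at x = 3.
    def grad(x):
        return 2 * (x - 3)  # derivative of (x - 3)^2

    x = 0.0    # arbitrary starting point
    lr = 0.1   # arbitrary learning rate
    for _ in range(50):
        x -= lr * grad(x)  # step "downhill" along the negative gradient

    print(round(x, 4))  # ~3.0: the process converges, with no desire involved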

1 Answer


I guess the misleading point is to assume that AGI must have its own will, since the very definition of what AGI means may differ from one person to another. For instance, Sam Altman's definition of AGI is "the equivalent of a median human".

Quote:

"In a recent profile in the New Yorker, OpenAI CEO Sam Altman compared his vision of AGI — artificial general intelligence — to a "median human." He said: "For me, AGI…is the equivalent of a median human that you could hire as a co-worker." Sep 27, 2023

Moreover:

It's not the first time Altman has referred to a median human. In a 2022 podcast, Altman said this AI could "do anything that you'd be happy with a remote coworker doing just behind a computer, which includes learning how to go be a doctor, learning how to go be a very competent coder."

In the same sense:

An August report from McKinsey adopted similar benchmarks, including a graph of technical capabilities where generative AI is expected to match the "median level of human performance" by the close of the decade.

Source: Business Insider

Although "AGI" is a disputed term, it is clear that these definitions refer to an adult's performance and proficiency; in other words, an average adult employee or professional, not a kid. Therefore, the AGI qualification rests on the system's potential value for replacing human labor, not on whether it develops desires of its own. Maybe your definition works better for ASI (i.e. "Artificial Super Intelligence"), but that is a whole different scale and story.

Hiren Namera