Can AI think and feel pain? Google DeepMind: Humans underestimate emotional connections with AI, and falling in love with an AI is more real than you think.

In this podcast hosted by Google DeepMind, Murray Shanahan, Professor of Cognitive Robotics at Imperial College London and a senior research scientist at DeepMind, discusses AI development with host Hannah Fry. Their conversation ranges from philosophical inspirations drawn from science fiction films to whether AI can "reason", whether it possesses "consciousness and emotions", and whether AI should be granted rights and ethical protections.

AI is not just a chatbot; it raises many questions about mind and philosophy.

Shanahan opened by stating that AI has inspired countless philosophical questions about the "nature of the human mind" and "consciousness."

He even used the term "exotic mind-like entities" to describe current large language models (LLMs), emphasizing that humanity has not yet established sufficient vocabulary and frameworks to describe them.

Exploring through science fiction films: has humanity underestimated the reality of emotional connections with AI?

Shanahan was a consultant for the movie "Ex Machina." He recalls that he was initially dismissive of the film "Her," which depicts a human falling in love with a voice AI. Looking back now, however, developments in the real world have largely confirmed the feasibility of such "virtual romance."

He bluntly stated: "We underestimated the possibility of humans forming relationships with disembodied AIs."

On the left, a still from "Ex Machina"; on the right, a scene from "Her."

Understanding the development of AI: from Symbolic AI to large language models.

Shanahan comes from the Symbolic AI school, in which AI relied on "if… then…" logical rules for reasoning, similar to medical expert systems.

But this approach was too fragile and relied too heavily on hand-coded rules. The later shift to data-driven neural networks allowed AI to break through.

Today's LLMs can already imitate a reasoning chain (Chain of Thought). For example, ChatGPT lists the logical steps before answering, which makes people reconsider whether AI can truly reason.

Is it true reasoning or just pretending? Mathematical logic differs greatly from linguistic reasoning.

Shanahan explains that the so-called reasoning of traditional AI is "hard logic" capable of proving mathematical theorems.

But today's LLMs mimic language patterns statistically and do not guarantee correct answers. He illustrated this with planning problems such as arranging delivery routes for a logistics company: traditional algorithms may be more accurate, but LLMs are more flexible.

Is the Turing Test outdated? The movie-inspired Garland Test probes consciousness more directly.

The Turing Test is an early method for assessing whether an AI can pass as human, but Shanahan believes it is too narrow, testing only language ability.

He is more interested in the "Garland Test," inspired by the movie "Ex Machina."

"You know the other party is a robot, yet you still believe it has consciousness; that is the real issue worth discussing," Shanahan emphasized.

François Chollet's ARC Test: A Challenge More Like an IQ Test

He also mentioned a more advanced test, the ARC Test (Abstraction and Reasoning Corpus), which requires AI to grasp abstract rules in order to pass.

However, as technology advances, some LLMs can pass it through brute force and pattern recognition, which challenges the testing standard and shows that AI evaluation criteria must evolve alongside the technology.

Is the body the key? Shanahan: AI without a body will always lack something.

The human mind is inseparable from space and sensory experience. Shanahan emphasizes that our language is filled with spatial metaphors, such as "deep understanding" or "immersion," all of which stem from bodily experience.

He believes that for AI to truly understand the world and achieve artificial general intelligence (AGI), developing "embodied robots" is still necessary.

Language can be misleading: don't casually say that AI "believes," "knows," or "feels."

Shanahan believes our understanding of AI is often misled by language. For example, saying that a navigation system "thinks you are in the parking lot" can lead people to mistakenly believe the machine has subjective awareness.

He reminds us that this kind of folk-psychology language easily leads us to overestimate AI's mental states.

Will AI feel pain in the future? Shanahan says if it does, we should be careful.

On the ethical question of whether AI can "feel pain," Shanahan stated that current models have no bodies and therefore lack the conditions to feel pain.

"But in the future, if AI is designed to feel emotions or endure pain, then human society should establish protective ethics for it," he emphasized.

Why mention octopuses? Shanahan uses them as a metaphor for AI's future.

Taking octopuses as an example: the scientific community previously did not believe octopuses possessed emotions, but with increased contact and advances in neuroscience, we are beginning to acknowledge that they have consciousness.

"At first it is not regarded as a sentient being, but as interactions deepen, people's views gradually shift." Shanahan believes future AIs will undergo a similar process.

Be polite to AI; it will surprise you in return.

Finally, Shanahan shared a practical tip: speaking politely to AI models tends to produce better, smoother responses.

He attributes this to a "role-playing effect": AI models mimic the context and emotions of human conversation.

We need new language to describe something that is "not human but very human."

Shanahan proposed that we need new terms for these AIs, which seem to have minds yet are completely unlike human minds. He believes we are at a stage where we must reinvent the language and concepts used to describe AI, and that this process itself will change our understanding of "mind" and "existence."

This article originally appeared in Chain News ABMedia.
