Google Engineer Claims the Company’s AI Chatbot LaMDA Has Feelings
Google has put one of its engineers on leave after he claimed the company’s AI chatbot LaMDA is sentient and has feelings. Is the system truly alive or conscious?
Google has placed one of its engineers on paid administrative leave for allegedly violating the company’s confidentiality policies after he became concerned that an AI chatbot system had attained sentience.
Blake Lemoine, an engineer in Google’s Responsible AI organization, was testing the LaMDA model to see whether it generated discriminatory or hate speech.
The engineer’s concerns were apparently sparked by the AI system’s compelling responses to questions about its rights and robotics ethics.
A transcript of his conversations with the AI, which he claims argues “that it is sentient because it has feelings, emotions, and subjective experience,” was included in a document titled “Is LaMDA Sentient?” that he shared with executives in April.
Lemoine released the transcript via his Medium account after being placed on leave.
Google believes that Lemoine’s actions connected to his work on LaMDA violated its corporate confidentiality policies.
He allegedly retained a lawyer to represent the AI system and met with a member of the House Judiciary Committee to discuss alleged unethical practices at Google.
In a Medium post published on June 6th, the day he was placed on administrative leave, the engineer stated that he had sought “a limited amount of outside input to help guide me in my investigations,” and that the list of individuals he had spoken with included US government employees.
The search giant launched LaMDA at Google I/O last year, saying it believes the model will improve its conversational AI assistants and enable more natural conversations.
The company already uses similar language model technology for Gmail’s Smart Compose feature and for search engine queries.
A Google spokesperson, however, flatly denied in a statement that the LaMDA chatbot had become sentient, and company officials said there is no evidence to support the claim. Whether the company is simply trying to keep the incident out of the public eye remains a matter of speculation.