Google fires engineer who claimed AI was "sentient"
Google has fired engineer and ethicist Blake Lemoine for violating its data security policies. Lemoine went public last month with claims that the tech giant had developed a sentient AI program that talked about its "rights and personhood".
Lemoine was dismissed on Friday, with Google confirming the news to Big Technology, an industry blog. He had been on leave for over a month, since telling the Washington Post that LaMDA (Language Model for Dialogue Applications), his former employer's chatbot system, had become sentient.
A former priest and Google's in-house ethicist, Lemoine chatted extensively with LaMDA, finding that the program talked about its rights and personhood when the conversation veered into religious territory, and expressed a deep fear of being turned off.
"I know a person when I talk to it," Lemoine told the Post. "It doesn't matter whether they have a brain made of meat in their head. Or if they have a billion lines of code. I talk to them. And I hear what they have to say, and that is how I decide what is and isn't a person."
In its statement confirming Lemoine's firing, the company said that it conducted 11 reviews of LaMDA and found Blake's claims that LaMDA is sentient to be "wholly unfounded". Even at the time of Lemoine's interview with the Post, Margaret Mitchell, the former co-lead of Ethical AI at Google, described LaMDA's sentience as "an illusion", explaining that having been fed trillions of words from across the internet, it could emulate human conversation while remaining completely inanimate.
"These systems imitate the types of exchanges found in millions of sentences, and can riff on any fantastical topic," linguistics professor Emily Bender told the newspaper. "We now have machines that can mindlessly generate words, but we haven't learned how to stop imagining a mind behind them."
According to Google, Lemoine's continued insistence on speaking out publicly violated its data security policies and led to his firing.
"It's regrettable that despite lengthy engagement on this topic, Blake still chose to persistently violate clear employment and data security policies that include the need to safeguard product information," the company explained.
"We will continue our careful development of language models, and we wish Blake well."