Post by account_disabled on Feb 12, 2024 1:18:35 GMT -8
A Google software engineer claimed to have detected consciousness in LaMDA's neural network and even planned to hire it a lawyer. The company placed the employee on administrative leave, but the controversy over whether the neural network is conscious has not subsided since the incident. Let's find out why the engineer decided that LaMDA had come to life, and what scientists say about it.

What happened

In June, The Washington Post published an interview with Blake Lemoine, a software engineer at Google, under the headline "The Google engineer who thinks the company's artificial intelligence has come to life."
Blake Lemoine was testing LaMDA, a neural network language model for dialog applications (simply put, an AI-based chatbot generator), for its use of taboo language, discriminatory speech, and hate speech. In one of his exchanges with the neural network, the engineer touched on the topic of religion and noticed that it spoke about its own rights and freedoms. According to Lemoine, if a machine can reason about its rights, then it perceives itself as a person and therefore has self-awareness. Google executives, namely vice president Blaise Aguera y Arcas and Jen Gennai, head of Responsible Innovation, did not find this reasoning convincing, and the engineer was placed on paid administrative leave for a breach of confidentiality.

What does Blake Lemoine say

"If I didn't know for sure that this was a computer program we had recently created, I would have thought it was a child who knows physics," the engineer told The Washington Post. "I recognize a person when I talk to them. It doesn't matter whether they have a brain made of meat in their head or a billion lines of code. I talk to it and I hear what it says, and that's how I decide what's human."