Artificial Intelligence (AI) is spreading into every area of life, but it has now put a Google engineer's job in danger. A software engineer on Google's AI development team has been suspended. His name is Blake Lemoine. Google has accused him of sharing the company's confidential information with third parties.
This was reported by Bloomberg. The company says that sharing such information violates its rules. After his suspension, Blake made a startling claim about Google's systems: he says he encountered a sentient artificial intelligence on Google's servers, and that this Google AI chatbot can think like a human.
Engineer sent on paid leave
Google's parent company, Alphabet Inc., recently placed Blake on paid leave, saying he had violated company rules. Blake says that because of his work on AI ethics, he may soon be fired. He has also pointed to several former Google employees who, he claims, lost their jobs for raising similar concerns.
Engineer makes big claims
Blake has also given an interview to The Washington Post on the matter. In it, he clearly claimed that the Google artificial intelligence he had been conversing with behaved like a human. The AI in question is LaMDA, short for Language Model for Dialogue Applications, which Google uses to build chatbots. Blake claims that when he began talking to the AI, he suspected there was a person behind it.
A Google spokesperson has also issued a statement on the matter. Spokesman Brian Gabriel said that while some in the AI community are considering the long-term possibility of sentient AI, it does not make sense to do so by anthropomorphizing today's conversational models, which are not sentient.