The way AI learns is loosely similar to how humans learn: feed it information, and it will use that information to make decisions. That's useful because computers can process data faster than humans and can be more objective about it, but it's also a bit unsettling when you think about what else the technology could be used for.

In fact, Microsoft seems to be thinking along exactly those lines. According to a patent spotted by Protocol, the company is considering a chatbot that could be trained to talk like a specific person you know, such as a deceased loved one.

According to the patent’s description, “In aspects, social data (e.g., images, voice data, social media posts, electronic messages, written letters, etc.) about the specific person may be accessed. The social data may be used to create or modify a special index in the theme of the specific person’s personality.”
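The patent doesn't spell out an implementation, but the basic idea it describes, collecting a person's social data and distilling it into an index a chatbot can draw on, can be sketched in a few lines. The toy example below is purely illustrative and is not taken from the patent; the names (PersonalityIndex, add_message, most_similar) and the sample messages are our own assumptions, and a real system would use far richer models than simple bag-of-words retrieval.

```python
# Purely illustrative sketch: distilling a person's messages into a tiny
# retrieval "index" a chatbot could draw on for wording and style.
# All names and data here are hypothetical, not from Microsoft's patent.
import math
import re
from collections import Counter


def _tokenize(text: str) -> Counter:
    """Lowercase bag-of-words token counts for one message."""
    return Counter(re.findall(r"[a-z']+", text.lower()))


def _cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


class PersonalityIndex:
    """Toy index over a person's social data (posts, messages, letters)."""

    def __init__(self) -> None:
        self.messages: list[str] = []
        self.vectors: list[Counter] = []

    def add_message(self, text: str) -> None:
        self.messages.append(text)
        self.vectors.append(_tokenize(text))

    def most_similar(self, prompt: str) -> str:
        """Return the stored message closest in wording to the prompt."""
        query = _tokenize(prompt)
        scores = [_cosine(query, vec) for vec in self.vectors]
        return self.messages[scores.index(max(scores))]


if __name__ == "__main__":
    index = PersonalityIndex()
    # Hypothetical social data for one person.
    index.add_message("Coffee first, then we can talk about Mondays.")
    index.add_message("Miss you all, see you at the lake house this summer!")
    print(index.most_similar("Are we meeting at the lake this summer?"))
```

Even this crude version shows why the data matters: the more posts, messages, and voice transcripts you feed in, the more convincingly the output can echo how that person actually talked.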

We’ve seen plenty of cases in the past where hacked social media accounts were used to send spam or malicious links, but it’s usually fairly obvious that the person behind the messages isn’t who they claim to be. An AI trained on someone’s personal data, however, could take impersonation to next-generation levels of identity theft.

We admit that there might be some comfort in being able to “talk” to a deceased loved one through a chatbot that mimics them realistically, but it still feels kind of weird and a little bit wrong.

Source: inputmag
