The software giant has patented technology that would let individuals talk to a deceased person, but is it wise to go down that route?
AI has come a long way in a couple of decades, and the power of machine learning now makes it possible for chatbots to seem near-human. Microsoft wants to use this technology to give a voice to people who have fallen silent, creating chatbots that can mimic the dead.
Microsoft’s Plans for a Second Life as a Chatbot
UberGizmo discovered the technology patent, which goes into detail about how this digital ouija board might work. Because it is only a patent, it does not guarantee that Microsoft will ever release the feature. It does, however, show that Microsoft is at the very least entertaining the idea.
To build the chatbot, Microsoft would require “social data (e.g., images, voice data, social media posts, electronic messages, written letters, etc.)” about the specific person. This data would then be fed through an AI-powered machine learning system to learn the ingrained habits and quirks of the subject.
Once the AI knows how the person talked, it could then respond to user queries the way the subject would. The result is a chatbot that learns how a deceased person spoke and impersonates them as if they were still alive.
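In broad strokes, the process described here is a standard fine-tuning pipeline: gather the person's messages, turn them into prompt/response pairs, and train a language model on them. The sketch below is purely illustrative and is not from the patent; the `SocialPost` type and `build_training_pairs` helper are hypothetical names invented for this example.

```python
# Hypothetical sketch: turning collected "social data" into
# (prompt, reply) training pairs that a language model could be
# fine-tuned on to imitate the author's voice.
from dataclasses import dataclass

@dataclass
class SocialPost:
    prompt: str  # the message the person was replying to
    reply: str   # what the person actually wrote back

def build_training_pairs(posts):
    """Keep only posts with a non-empty reply and pair each
    prompt with the subject's actual response."""
    return [(p.prompt, p.reply) for p in posts if p.reply.strip()]

posts = [
    SocialPost("How was the trip?", "Marvelous, as always!"),
    SocialPost("Lunch tomorrow?", ""),  # empty reply is dropped
]
print(build_training_pairs(posts))  # → [('How was the trip?', 'Marvelous, as always!')]
```

A real system would feed pairs like these into a model-training step; the interesting (and contentious) part is where the posts come from, which is exactly the privacy question raised below.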
The patent goes on to describe how this chatbot has plenty of uses beyond mimicking the deceased. For example, you could feed the chatbot data about fictional or historical figures. This would let people “speak” to characters who would otherwise be unreachable.
The patent also suggests that living people could train a chatbot to sound like them. Then, once they pass away, their nearest and dearest would still have the chatbot to talk to.
The Problems With Giving a Voice to the Dead
Since this technology is still in the patent phase, Microsoft has not gotten into the finer details of how the chatbot would collect data. But if the chatbot does make it to market, it could pose a massive privacy threat.
To train the chatbot, it will need access to the deceased person's social media profiles. This could theoretically be done by scanning through all the public data the deceased published, but it might go one step further and require read access to the account itself.
If that happens, the bot might wind up digging up unsavory details that the subject did not want anybody to find out about. That could make the chatbot less of a touching memorial and more of a goldmine of scandalous information.
Is It Best to Let Sleeping Memories Lie?
Microsoft has been toying with the idea of making a chatbot that can impersonate people, using it to create interactive experiences that replicate those who have passed away. If it makes it into production, we will have to see whether the public accepts it on privacy and ethical grounds.
If you get the shivers thinking about how people might use your data after you are gone, it is a good idea to check your Facebook privacy settings. The site can control what happens to your data after you pass away, and you can grant permissions to a trusted third party if you want to keep things confidential.