On the surface, the announcement of a new chatbot patent would not seem to be earth-shattering news. After all, haven’t companies been using chatbots for years now? But when the company holding the patent is Microsoft, and the person behind Microsoft is Bill Gates, the case becomes a little more interesting. Microsoft’s patent for ChatbotYou, first filed in 2017 but only granted in 2021, could further blur the line between the real and virtual worlds, thanks in large part to its reliance on machine learning to analyze every possible scrap of data people leave behind on social media. In fact, social media, once viewed as a way of making us all more social and more connected, could actually be making human-to-human interaction obsolete. Interacting with bots would just be a whole lot more interesting.
How ChatbotYou works
According to the Microsoft patent, ChatbotYou would essentially “train” on social media data. In other words, data scientists at Microsoft would feed a machine learning algorithm as much data as they can about you, based on your online social media interactions. This would include public posts on platforms such as Facebook or Twitter, videos you’ve created on YouTube, voice recordings you’ve left behind, forum posts on Reddit, and private messages you might have exchanged on unencrypted messaging platforms.
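The patent does not spell out an implementation, but the basic ingredient – a per-person corpus harvested from social data – is easy to picture. Below is a minimal, hypothetical sketch in Python that assembles such a corpus and answers new messages by retrieving the person’s own past replies with TF-IDF similarity. This is only a stand-in for the far more sophisticated conversational models Microsoft would presumably use; the data structure, field names and toy examples are assumptions, not anything taken from the patent.

```python
# Hypothetical sketch: build a per-person corpus from social data and
# answer new messages by retrieving the person's own past replies.
# TF-IDF retrieval here is a stand-in for a real conversational model.
from dataclasses import dataclass
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


@dataclass
class Exchange:
    prompt: str   # what someone said to the user
    reply: str    # how the user actually responded


# Toy stand-in for data harvested from posts, DMs, forum threads, etc.
corpus = [
    Exchange("What did you think of the game last night?",
             "Honestly, the defense fell apart in the second half."),
    Exchange("Any plans for the weekend?",
             "Probably hiking if the weather holds up."),
    Exchange("Have you tried the new coffee place downtown?",
             "Yes! Their cold brew is dangerously good."),
]

vectorizer = TfidfVectorizer().fit([ex.prompt for ex in corpus])
prompt_matrix = vectorizer.transform([ex.prompt for ex in corpus])


def respond(message: str) -> str:
    """Return the stored reply whose original prompt best matches the message."""
    scores = cosine_similarity(vectorizer.transform([message]), prompt_matrix)
    return corpus[scores.argmax()].reply


if __name__ == "__main__":
    print(respond("Doing anything this weekend?"))
    # -> "Probably hiking if the weather holds up."
```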
The algorithm would then combine demographic data from similar types of social media users to come up with possible responses and answers to questions it might receive. Over time, thanks to machine learning, the chatbot would become smarter and smarter, eventually anticipating questions and answering them the same way you would. Using something called “crowd-based perceptions,” it might even be able to guess how you would think about a certain issue or topic, or how you might respond to a certain political candidate’s views.
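One way to read the “crowd-based perceptions” idea is as a fallback: when your own history says nothing about a topic, borrow an opinion from people whose profiles look like yours. A hypothetical sketch of that similarity weighting – with entirely made-up profile features and stance values, since the patent gives no formula – might look like this:

```python
# Hypothetical sketch of "crowd-based perceptions": when the target user has
# no recorded opinion on a topic, estimate one from demographically similar
# users, weighting each by cosine similarity of simple profile vectors.
import numpy as np

# Profile features (all made up): [age / 100, urban, uses_reddit, uses_youtube]
profiles = {
    "target": np.array([0.34, 1.0, 1.0, 0.0]),
    "user_a": np.array([0.31, 1.0, 1.0, 1.0]),
    "user_b": np.array([0.62, 0.0, 0.0, 1.0]),
    "user_c": np.array([0.29, 1.0, 1.0, 0.0]),
}

# Each cohort member's stance on some topic, from -1 (against) to +1 (for).
stances = {"user_a": 0.8, "user_b": -0.4, "user_c": 0.6}


def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))


def predicted_stance(target: str) -> float:
    """Similarity-weighted average of the cohort's stances."""
    weights = {u: cosine(profiles[target], profiles[u]) for u in stances}
    total = sum(weights.values())
    return sum(weights[u] * stances[u] for u in stances) / total


if __name__ == "__main__":
    print(f"Guessed stance for target user: {predicted_stance('target'):+.2f}")
```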
Possible futuristic scenarios
That’s just the baseline scenario, of course. The Microsoft patent goes further than this – it suggests that these chatbots might take some type of 2D or 3D form (chatbot robots?), could be used to create a variety of fictional, historical and celebrity characters, and might even be able to create “younger” or “older” versions of yourself. Imagine a recent college graduate being able to ask their 40-year-old self: What are some of the things I should be looking out for over the next 20 years? Or imagine your bored current-day self interacting with a cool, celebrity version of yourself (sort of like your best Instagram self on steroids).
And things could get stranger still. Some have even warned of a “Frankenstein monster of machine learning,” in which various entities create completely unapproved chatbots based on whatever data about a person is publicly available. Imagine different versions of chatbots, all purporting to be famous people or celebrities. Or imagine nefarious uses of chatbots, in which criminals and scammers use them to access your personal accounts. For example, once a chatbot knows everything about you, it could start posting on social media platforms without your knowledge or approval, start conversations with random contacts on your mobile phone, or even guess username/password combos for your bank accounts. There are, in fact, a whole host of privacy, defamation, trademark, copyright and possibly even national security implications (if these chatbots start to take on political personas).
The real blurring of reality
We already live in a time of augmented reality filters, virtual reality gaming, holograms and AI-powered text-generation tools. We live in a time when machine learning is making the creation of deepfakes easier and easier. And AI-powered robots like Sophia are now going into mass production. So it wouldn’t surprise us at all if new individual-based chatbots, powered by machine learning and social media, rocketed into the mainstream faster than ever imagined. We’re already conditioned to think of chatbots as relatively harmless, so it may be too late before any of us realize what just happened…