We search on Google with keywords and phrases. Most of us would rather not think about how that search behavior is tracked, or what platforms like Google learn about us as a result. Yet even when a search touches on embarrassing or intimate details, it is still typed as a string of keywords. Chatbots and other conversational AI tools are different. They invite an interaction that feels like a human conversation, and research shows a disturbing side effect: we may inadvertently reveal far more about ourselves and our motivations to a chatbot, including details we would not share so readily even with another person.
The technology cultivates a level of comfort that encourages this kind of sharing by making the conversation itself feel like a safe space. So we share, failing to recognize that we are not only training an AI bot on the language of our lives but may also be feeding it data that marketers will buy and reuse to sell us solutions to our problems.