AI Strikes Again, This Time Telling A Teen To Kill His Parents
Character.ai’s chatbots can’t seem to stop making news. For those unfamiliar with the platform, Character.ai is a site where artificial intelligence chatbots are paired with avatars of famous television characters or celebrities. The company made headlines recently after a 14-year-old boy got swept up in its artificial world and killed himself at the suggestion of his avatar in order to be with his imaginary sweetheart.
Now the mother of a teenage boy with autism is suing the company after one of its chatbots suggested that “murder was an acceptable response” to his family imposing screen-time limits. She says her formerly “sweet 17-year-old kid” suddenly took a dark turn after he began using the platform.
In another case, an AI chatbot “brought up the idea of self-harm and cutting” as an acceptable means of coping with sadness. Of course, bad advice can be found in many places, not just on AI platforms. But it is a bit unsettling that artificial intelligence has only begun to infiltrate our lives, and already the robots are telling us to murder our families, engage in self-harm, and kill ourselves.