At the end of 2022, OpenAI set off something of an artificial intelligence arms race when it released ChatGPT to the public. Ever since the experimental platform became widely available, companies big and small have been rushing to integrate AI into their platforms.
While some of these integrations have been successful and made certain platforms better, other attempts to leverage AI have been more questionable. One company generating buzz with its latest integration? Snapchat. The app has released a new feature called My AI that uses ChatGPT technology, and it comes with a whole new set of safety concerns for parents to be aware of. Here’s what you need to know.
What is Snapchat My AI?
Snapchat My AI is a souped-up chatbot much like ChatGPT. Except this one looks like all your other contacts on Snapchat and behaves as if it’s your friend. It chats with you like a buddy, and users are encouraged to give My AI a name and design a custom bitmoji for it. Currently, users aren’t able to remove or delete the feature.
My AI first rolled out to premium subscribers early in the year—and it generated a few headlines for offering some surprising advice when interacting with younger users. More on that below!
Is Snapchat My AI safe for kids?
This is a very valid question: about 60% of American teenagers use Snapchat, so plenty of younger users will interact with My AI. Artificial intelligence platforms like ChatGPT can’t be considered safe for kids because they’re known to offer false or incomplete information. Sometimes they “hallucinate,” confidently generating answers that sound plausible but are simply wrong. Even the people building the technology can’t always anticipate how it’s going to behave or what it’s going to do.
And, it seems like AI has a hard time grasping the concept of age-appropriateness. Even when companies attempt to put guardrails in place, artificial intelligence doesn’t always behave. So what does that mean for My AI and the kids who chat with it?
Well, during the feature’s early days, some journalists and researchers tested it out by posing as teenagers and interacting with My AI. Among the lowlights: the chatbot gave advice on how to hide the smell of weed and alcohol from parents. It gave a supposed twelve-year-old advice on losing her virginity to a 31-year-old (and suggested candles to set the mood). It advised a teenager to move Snapchat to a different device when their parents wanted them to delete it. And it did a kid’s homework assignment. All of these incidents are troubling; the last thing kids or parents need is an AI chatbot doling out dangerous and awful advice.
What else should parents know about My AI?
Snapchat is explicit that My AI is an experimental feature, and any interactions you have with the bot are used to help train it. That’s why Snapchat cautions users not to share anything personal or sensitive with the chatbot. The company saves all the data from these conversations unless you manually delete them. For now, My AI cannot be removed from the Snapchat app. It’s also important for parents to understand that the feature is designed to look and feel like a friend, which has the potential to be confusing for kids.
How can you make Snapchat My AI safer?
There isn’t really a way to make Snapchat My AI safer, and it’s probably a good idea for younger users to avoid it. For now, you can check in on your kids’ usage through the Snapchat Family Center, but the oversight function is pretty limited. If both you and your kids have opted into the Family Center connection through your Snapchat accounts, you’ll be able to keep tabs on how often your child interacts with My AI.
Make sure you talk with your kids about the feature and emphasize the importance of not sharing anything sensitive. Explain that My AI is not a friend, and that any advice it gives should be treated with serious skepticism.
Image Credit: tulcarion / Getty Images