How to Protect Your Kids Online—and What You Need to Know About the App Landscape

The parent panic surrounding apps isn’t unwarranted. Being a kid in today’s online world can be incredibly dangerous: The rise of anonymous messaging apps and apps designed for meeting strangers, combined with the ubiquity of camera-equipped phones, has created a landscape riddled with perils, from online bullying to the sexual predators who use apps to meet unsuspecting children. And that’s to say nothing of the apps that are simply age-inappropriate for kids because their content is too violent or otherwise mature. In every case, it’s all too easy for a child to gain access.

Of course, children are going to be online. Most are going to have a phone at some point. They are going to download and use apps. The question is: How do we keep them safe? SaferKid, an educational alert system, has proposed a solution. Here’s how it works: You connect your kids’ devices to SaferKid’s system, and the company scans those devices regularly to see if your kids have downloaded any apps of concern. It sends you alerts explaining what’s concerning about a particular app (e.g., a high risk of sexting) and offers advice on how to talk about the app with your children. It doesn’t see, or show you, what your kids are doing on those apps, so it isn’t full-on Big Brother, and because you involve your kids in the process, it doesn’t feel so much like one-sided spying. Below, SaferKid CEO Cheyenne Ehrlich shares his advice for parenting around apps.

A Q&A with Cheyenne Ehrlich


What’s the scope of the issue here—how many problematic apps are on the market, and which should parents be most concerned about?


To keep it simple, roughly forty percent of American teens have an app that would allow predators to reach them without anyone knowing, and there were 4.4 million online child sex crimes reported in 2015.

We regularly update a list of some of the leading apps of concern on our website. This list changes from time to time. These are by no means the only apps parents should be concerned about.

As far as the total number of problematic apps: If you want to limit the issue to apps that have mechanics which have led to child sex crimes, we’re in the tens of thousands. But overall SaferKid has identified more than 200,000 apps that parents would probably want to be aware of, depending on their child’s age and individual situation.

But parents shouldn’t be racing to memorize a list of 200,000 apps that they should look out for. The real focus should be: What are the apps your child is exploring, and are they dangerous or developmentally inappropriate? That’s a much smaller and more manageable list.

“Most app companies are really small, and do not have child safety teams of any kind.”

There are two issues to keep in mind, even with this list: First, apps get new features all the time, so you need to see if an app updates with new features that make it more dangerous. Second, children and teens try new apps all the time. Very frequently, small app companies launch a product, spend $5,000 to test-market it, often directly to children, and then forget about it entirely to work on something else. Most app companies are really small and do not have child safety teams of any kind. As an example of how small app companies can be: When Instagram was acquired by Facebook for one billion dollars, it had thirty-five million users and only thirteen employees.


What are your criteria for determining whether an app is age-appropriate?


At the root of our process is a lengthy, ongoing analysis of the mechanics that give rise to meeting strangers, bullying, and sexting, along with a number of other concerns. (We talk about mechanics because changing how an app works changes what people do on it, and it changes the outcomes associated with using that app.) We have nearly twenty categories to which we apply risk level ratings, again with an emphasis on the risk (none, low, or high) of meeting strangers, bullying, sexting, and encountering adult content.

We eventually arrive at two age ratings: One is the suggested minimum age to use an app. The other is the minimum age at which a child or teen can use an app without a parent knowing. Parents should be aware of all social networks and messaging apps their children use, even if they are age-appropriate.

This is because there have been many reports of predators who meet a child on one app and then get them to install another—often more anonymous—app. So, knowing a child has gotten a new messaging product or joined a new “social network” gives parents a chance to do their job and ask what attracted the child to it. It lets you tease out what’s driving their behavior change, and head things off at the pass if necessary, and it’s why we added the second age rating.


What advice do you have for parents who are trying to decide if a particular app is age appropriate for their child?


Obviously if it’s an app a child absolutely shouldn’t use, such as one with a history of child sex crimes, the decision is pretty clear. But for apps like social networks and messaging products, it’s helpful to reframe your thinking.

The most common mistake parents make is to approach the question in the same way as asking if a TV show or movie is appropriate. The question is not, “Is it okay for my child to watch this?” The question is, “Is it okay for my child to go to this new place and do this activity?” Apps are places where people do things. And once you think about them the way you would your child going over to someone else’s house, you are in a better position to ask about who owns that place and what people do there. Since most app companies do not have child safety teams, it’s important to be comfortable with your child engaging in an unsupervised activity at this particular place.

“If you look at the app on the phone, and even if you open it, it looks like a calculator until you enter your special numeric password.”

It’s critical to know that some apps are designed to be deceptive. For example, at a high school in Colorado, a large number of students were playing a game where they were earning “points” based on how many naked photos of their classmates they could collect. To keep the game secret, they were all using a sexting app designed to look like a calculator. If you look at the app on the phone, and even if you open it, it looks like a calculator until you enter your special numeric password. The largest maker of these calculator sexting products has millions of users—and there are over fifty competitors making similar products.


Can you take us through some scenarios where predators have used apps to target children? What can happen?


Generally, children and teens become victims of predators on apps as part of a five-step process:

  1. The child gets a risky app.

  2. The child develops an attachment to the app.

  3. The child meets a stranger on the app.

  4. The stranger grooms the child.

  5. Something bad happens.

Here are three of the many, many tragic stories we have reviewed in our research:

  • Last summer, a fourteen-year-old boy went on an app and met a man in his twenties. While the boy’s single mother was at work, the man came over and sexually assaulted the boy. Afterwards, the man told the boy he was HIV positive.

  • A thirty-one-year-old man was arrested in Florida after he befriended a total of 350 different young girls online while pretending to be a fifteen-year-old boy. In each case, he got them to expose their breasts on camera and secretly recorded them. After this, he got them to make more sexually explicit photos and videos for him by threatening to expose the recordings on social media if they did not do it.

  • A few years ago, an app received reports of several different children being raped by predators who met them on the app. The app maker responded by making changes, including not allowing anyone to talk to a minor within fifty miles of the minor. By late 2015, the fifty-mile radius restriction had been removed, and people were using the app to assault children again. The app was recently acquired for tens of millions of dollars.


How does SaferKid technology work to notify parents about kids on problematic apps? What are the important next steps in helping your kids be safe online?


SaferKid is designed to let parents step in and intervene at step one of the above five-step process. The idea is that you become aware of the situation before there is a risk, when you can still nip it in the bud.

If you connect your child or teen’s device to SaferKid, we regularly (once an hour or so) scan the apps on that device. If we find a new app of concern, we send you an alert with what you need to know. We give parents just the right amount of information to talk to their child, whether the concern is an app exposing them to too much violence, a sign that the child is exploring drug culture, and so on.

“Once you get a notification, it’s your turn to step in and parent. We don’t do that part.”

Once you get a notification, it’s your turn to step in and parent. We don’t do that part. But we will arm you with information to help coach you through what the concern is with the app, and give you some ideas on how to talk to your child about what’s going on before your child develops an attachment to the app. Of course, if you’re talking to a six-year-old who is using an app that has too much violence on it, the conversation is going to go differently than if you’re talking to a thirteen-year-old who has downloaded an app designed to meet strangers. But here are a couple of general guiding principles:

  • The first thing you should do is take a deep breath and not freak out. Children are going to explore new things—that’s completely normal, and talking to them about it can be, too.

  • Focus on the fact that the problem isn’t your child, it’s the apps (and potentially the strangers using those apps). So the ideal message is along the lines of: “I trust you. I just don’t trust all of the apps and all of the other people using them.”

  • I’ve also found that role playing can be helpful. For example: Ask your child to pretend to send you a text message, posing as a forty-year-old. Together, look at how you can sign up for an app as someone else. Even really smart kids might not see on the surface how an app could be risky, but this kind of role playing can help them better understand the situation and feel savvier about it.

Parenting around mobile phones doesn’t have to be scary or hard. Whether you use SaferKid or not, keep an eye on the apps on your child’s phone. And stay in communication with your children. That’s really the key here.


Can SaferKid show parents their children’s activity on a given app? Or block an app of concern from a kid’s device?


We believe that engendering trust is the key to great communication with your child. So we don’t erode trust by showing your child’s activity on these apps. It’s not about spying on your children. It’s about helping keep them safe. Period.

This limit also makes it easy for you to communicate with your children about SaferKid, because they can look at the website and see that we don’t track what they do. We empower children to be free to explore—by letting you know, at the right time, if they are touching something that requires you to step in and parent. At least that’s the goal, and we’re regularly scouring apps to find new ones parents should know about.


Big picture, what kind of change could make the world of apps safer for kids?


The biggest underlying problem here is age verification on apps designed for meeting strangers. We need strong federal laws that restrict how minors can meet strangers online, and we need to hold app makers responsible when this is not strictly enforced.

A fourteen-year-old can’t walk into a bar, “check a box” saying they are twenty-one, and buy a drink. If they did, the bar owner would lose their license and maybe go to jail. The same should apply to apps. If a thirteen-year-old joins a high-risk hookup app and gets sexually assaulted, the app maker should be liable. Instead, under U.S. law, the app maker is not liable. And while age verification might make it slightly harder for all of us to sign up for these apps, there are many technological solutions that could be implemented.

“If a thirteen-year-old joins a high-risk hookup app and gets sexually assaulted, the app maker should be liable.”

We also need to make sure that companies outside the U.S. are required to quickly respond to U.S. subpoenas when crimes against children are being investigated. This is a larger topic of discussion, but it has been a big issue with international app makers when a predator has chatted with a child on their app, and then the child has gone missing.

We are reaching out to people in Congress to educate them about the changes we need to see in the law, and we’d love to have others reach out to their representatives as well.

For more information, visit SaferKid.com.
