Lori Moylan, Public Policy Manager for Meta, talks social media safety

Lori Moylan, Public Policy Manager for Meta

Juneau, Alaska (KINY) – On Friday, Lori Moylan joined Dano on Capital Chat to talk about Internet safety for youth.

Moylan explained how to start the conversation.

“I would encourage all the parents out there to know that it has to be an ongoing dialogue with your teenagers. They don’t always understand that the internet is forever, right, and that, you know, written words on a screen can actually really hurt someone’s feelings. So making sure that you’re having that ongoing dialogue with them and checking in in a way that’s approachable and not judgmental, one that really does respect the role that, you know, technology is going to play in their lives in the future. That’s the best place to start.”

She said the teenage years are when important habits are formed.

“Your teenage years are incredibly formative for the habits that you’re going to have into adulthood. Right? And we certainly know that as adults, we’re on our computers all the time, right? We’re on our phones all the time, whether it’s for work or to connect with friends and family that we’re not seeing all the time. And so now is really the time, you know, to talk to your kids and make sure that they’re learning those healthy habits that they’ll need when they’re adults. I have four kids of my own, two of whom are teenagers and who have phones and tablets and laptops. So yeah, there’s definitely a lot to think through to make sure your kids are learning the healthy habits that will follow them online into adulthood.”

Moylan commented on new tools parents can use to control their children’s screen exposure.

“We’ve recently introduced some new tools to help you jumpstart these conversations and give parents more control and insight into the things that their kids are doing on our apps. So things like helping them set screen time limits on Instagram, right? So they’re only on the app at certain times. I know when my kids go to bed, I like to just take the phone away at the end of the night. But some parents also just use the screen time limits to turn it off and make sure that they’re not accessing Instagram, you know, messaging with their friends all night, when they should be back to school. This is a thing adults can struggle with too. I think it’s important to make sure that we’re teaching our children by modeling good habits as examples, and also that we’re helping them use tools like screen time limits so that they can be sure that they’re having sort of the proper balance between the time they’re spending online and the time they’re spending on other activities.”

Moylan said it’s not just about the time kids spend online, but what they are doing while online.

“It’s not just about the quantity of time that your kids spend online, right? It’s also about the quality. So one of the features that we also introduced in our latest version of the sort of Parental Control Center is an option that if your teen reports someone for something like bullying or harassment on Instagram, they can share that with you so that you can see that they reported someone. That’s a really great way to start a conversation with your kid, right? They might not necessarily feel comfortable coming up to you and starting a conversation themselves and saying, my friends at school said something mean to me online. They can notify their parents so that their parents can see that they made this report against another account. That gives the parent the opportunity to start that conversation.”

Moylan explained how Meta handles posts that violate community standards.

“For certain types of content, like certain types of things that violate our community standards, we’re actually able to use machine learning to identify it and remove it, usually before people even see it. So a great example of that is photos that contain nudity. That’s the kind of thing that’s easier to train an algorithm to look at, understand that it’s nudity, and take it down before people see it. For things that are more difficult, where we can’t be confident that the machine learning is going to work, because obviously we don’t want to take down content that does deserve to be on the platform, yes, we have people. We have hundreds and hundreds of frontline reviewers that first see a post, and if they have difficulty, they can escalate it to supervisors above them. It’s really expansive and can be an extensive process to make sure that we get these decisions right. And obviously, if we get a decision wrong or you feel we’ve gotten a decision wrong, there’s an appeal option, right? If your content gets removed and you think that we made the wrong decision, you can always appeal it.”

Moylan pointed parents toward resources on parental controls.

“If you go to familycenter.instagram.com, we have lots of resources there to tell you more about the parental controls that we offer, a lot of which we just discussed, as well as an education hub that has lots of reading material and things designed for teens and parents and educators, where you can explore topics like safety and privacy. Right? What types of privacy features should I be sure that I have on my account? How can I make sure that people aren’t going to be hacking into my accounts, or that people who I wouldn’t want to follow me might start to follow me? Or things like how to push back on bullying, right? There’s lots of really great resources in there for parents and teens to go through so that they can make sure that they’re, again, learning those healthy online habits early on.”
