Raising Digital Citizens: Advice from a Tech Ethicist

We sat down with the founder of All Tech Is Human to learn more about the role of social media in our lives. David Ryan Polgar helps us understand how parents can raise their kids to become responsible digital citizens.

David Ryan Polgar is one of the many voices calling for ethical, humane technology. This attorney and college professor has been focusing on tech ethics and digital citizenship since 2012, when he began noticing the profound ways that social media was affecting how we saw the world, found employment and digested the news.

Recognizing the need to challenge dominant ways of thinking in technology, he cofounded the Digital Citizenship Summit in 2015. Since then, he founded All Tech Is Human, which acts as a think-tank and accelerator for tech consideration and a hub for the responsible tech movement.

We sat down with David to learn more about digital citizenship, the role of social media in our lives—and what parents can do to set their kids up for success in the future.

How do you define digital citizenship and why is it important?

There are a lot of different definitions of digital citizenship, but the way I see it is that it’s not only about an ability to be safe online, but also to be savvy online. That is important to point out because you don’t just want to think about all the ways that you can have less harm done to you—you also want to leverage digital tools towards benefitting others.

Early on, we saw a lot of stories with Facebook where decisions around what photos were showing up in somebody’s feed had a profound impact on the recipient. There was one case where a father was quite upset because images of his daughter who had just passed away kept on showing up on his feed. That was a pivotal moment in social media because it caused companies to start thinking. These decisions we’re making, they’re not simply about maximizing an algorithm. They’re not simply about saying, “This person posted a photo a year ago, therefore it’s the anniversary, therefore I should post it back to them.” At the end of the day, behind every avatar is a living, breathing person who has emotions.

What is social media? How do you understand it?

That’s a complicated answer because it’s a complicated question. Social media is a hybrid: a conduit for information, a media company and a quasi-governmental public square. The issue that is happening—that we have not focused enough attention on—is that the public perception of social media is dramatically different from the legal perception of social media. This is a major, major problem, because social media companies are caught between a rock and a hard place.

Right now, there’s a growing concern and push for social media companies to moderate more of their content, especially around COVID-19 misinformation. Social media companies obviously are reacting to that and then taking a more proactive and aggressive stance to try and lessen misinformation on their platforms. This is tricky because we still, as a society, have to determine what role we want social media companies to play.

Section 230 of the Communications Decency Act of 1996 is the highly debated clause right now. It shields social media companies from liability for the user-generated content on their platforms. So, the issue is that social media has evolved, but our laws haven’t. The laws that are most consequential for social media were written before social media even existed.

What kind of implication does that have?

It’s highly problematic right now because we’re trying to make something fit that doesn’t exactly fit. We, as a society, need to determine what role we want social media companies to play. It seems like the public wants them to play a more proactive role with content moderation. However, this also needs to be done in conjunction with how we offer transparency around these decisions.

The reason is that you don’t ever want a social media company—if it is a public square—to be the judge, jury and executioner. The public is relating to social media companies as if they were acting in a governmental capacity. But the companies have not adjusted to that.

We really need to have an honest discussion about what social media is and what it should be. We realize that social media impacts society at large. It impacts the course of democracy, how we get our news, how happy we are. That is a big, big deal. I say that because I think it needs to be emphasized.

What level of social responsibility do you think platforms should have?

Social media companies have a growing responsibility. They are realizing this, but it has been a rude awakening. Section 230 of the Communications Decency Act of 1996 set up this original framework where we saw social media platforms as very neutral. It was basically saying that since this is user-generated content, the platform is the equivalent of the post office. We don’t blame the post office when somebody sends a negative letter to somebody else, right? Now, we have realized that it’s much more than a neutral platform—it’s about amplification.

That’s been one of the reasons why there’s a rising need for social media platforms to have more social responsibility. Because they’re not just relaying a message, they’re amplifying a message.

The public square analogy doesn’t work perfectly because it’s not like everybody on a platform is in a square speaking to one another. That’s not the way it works, because there are too many people. Yes, everybody is in this square, but everybody has a microphone with a dramatically different volume, and the platform has the ability to raise, lower or turn off that microphone. That is a tremendous level of power—one that social media platforms are starting to realize they need to take extremely seriously. They also owe the general public a greater level of transparency about how they make their decisions.

How can we improve social media and technology?

We want it to be simple—we want it to be one area that can be fixed and improved, rather than looking at it as a system and a collective. But when we think about what’s going to build a better web, it’s a combination of: more socially responsible companies; better-educated and engaged parents and digital citizens; politicians who are informed and proactive; and media that offers the necessary oversight to scrutinize corporate behavior and educate the general public. So, it’s four major intersecting factors.

And I point this out because I think this is often lost. Every issue depends on the collective action of these intersecting parts. For example, when we think about safe driving, it’s not just about car companies being socially responsible. That’s only one part of it. The same thing is happening with social media. We need better digital systems, more socially responsible social media companies, more engaged politicians and better oversight from the media.

What advice do you have for parents about introducing kids to the internet and starting them on the track of digital citizenship?

The more symbiotic the learning relationship, the better. Parents have a lot to teach their kids and kids have a lot to teach their parents.

Also, parents need to be given more mulligans. There’s a lot of social pressure on parents right now to have a perfect solution and suddenly solve all the problems for their kids. But this is something that a lot of us are going through and always learning about. This is my entire life, yet even I can’t keep up with the information that is coming out about it. I always feel like there are more articles I need to read, more conferences to attend. There needs to be a lot more empathy for parents, because they have a lot on their plate.

Similar to how we train kids around safety, the more we can empower and educate them in those roles, the more overall safety increases. The same goes for digital citizenship and a younger audience going online: parents relay information and educate their younger children, but those children also need to feel empowered to make better decisions when they’re online—and to see the larger network around them.

But, again, I do want to go back to the fact that I think this is definitely a point in time where we need to increase the empathy shown towards parents. And the more we can allow freedom to make mistakes, the better. I will say too that we should also recognize that parents don’t agree right now on the best way forward.

There is massive disagreement about screen time. There’s massive disagreement about how tech should be incorporated into the school system. And this is where we need to think about the larger role that tech is playing, and then have a conversation about how parents’ opinions shape the overall environment.

To learn more, visit All Tech Is Human and follow along on Twitter, YouTube and LinkedIn!

You might also like...

How to encourage media literacy ahead of the new school year

It can feel like a massive challenge to keep up with the changing media landscape—especially for parents. We sat down with Michelle Lipkin, Executive Director of the National Association for Media Literacy to learn how we can help our kids develop critical media literacy skills to thrive in life.

How to connect with online safety: advice from an Internet Safety Expert

Fareedah Shaheed is the CEO and founder of Sekuva and she shares her best digital parenting advice.

In Conversation with The White Hatter: Teaching Kids to Thrive Online

The White Hatter has been providing online safety and digital literacy training since 1993. We sat down to chat with founder Darren Laur and learned how parents can empower their kids to thrive online.