In the best interests of the child

(Last of four parts)

Over the past weeks, we have looked at the mental, emotional and physical harm caused by indiscriminate social media use, especially among the youth, which led US Surgeon General Vivek Murthy to call for warning labels on these platforms, similar to those found on alcohol and tobacco products.

Of course, social media at its best can be used for good, but the data is clear, says US psychologist Jonathan Haidt, who in his book “The Anxious Generation” calls for government and technology companies to act in the best interests of our children.


In 2017, Sean Parker, Facebook’s founding president, candidly sounded the alarm: “The thought process that went into building these applications, Facebook being the first of them … was all about: ‘How do we consume as much of your time and conscious attention as possible?’ And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that’s going to get you to contribute more content, and that’s going to get you … more likes and comments.

“It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology. The inventors, creators—it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people—understood this consciously. And we did it anyway.

“God only knows what it’s doing to our children’s brains,” he said.

Today we witness havoc: bullying, trolling and other uncivil behavior; soaring rates of anxiety, depression and self-harm; a fragmented, distressed, polarized world.

“Why would anyone treat their customers that way?” asks Haidt. “Because the users are not really the customers for most social media companies. When platforms offer access to information or services for free, it’s usually because the users are the product. Their attention is a precious substance that companies extract and sell to their paying customers—the advertisers.

“The companies are competing against each other for users’ attention, and like gambling casinos, they’ll do anything to hold on to their users even if they harm them in the process. We need to change the incentives so that companies behave differently, as has happened in many other industries. Think of food safety regulation in the Progressive Era, or automotive safety regulations in the 1960s, both of which contributed to the long-running decline in children’s mortality rates.”


In 2013, British filmmaker Beeban Kidron made the documentary “InRealLife” about how technology companies treat teens online. Incensed by what she learned, she drew up design standards for these businesses to adopt. The resulting Age-Appropriate Design Code took effect in the UK in 2020, mandating policies in children’s best interests.

“For example, it is usually the case that the best interest of the child is served by setting all defaults about privacy to the highest standard, while the best interest of the company is served by making the child’s post visible to the widest audience possible. The law therefore requires that the default settings for minors be private; the child must make an active choice to change a setting if she wants her posts to be viewable by strangers.”

To critics wary of any government regulation, Haidt says, “most of the harms platforms are responsible for are not about what other users are posting (which is hard for platforms to monitor and control) but about design decisions that are 100 percent within the control of the platforms that incentivize or amplify harmful experiences.”

Haidt calls on governments to raise the age of internet adulthood to 16 (or, if possible, 18), and on businesses to build better age-verification features, among other measures. The future of our youth depends on urgent action today.
