The Senate has passed two major online safety bills amid years of debate over social media’s impact on teen mental health. The Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act, also known as COPPA 2.0, passed the Senate in a vote of 91 to 3.
The bills will next head to the House, though it’s unclear if the measures will have enough support to pass. If passed into law, the bills would be the most significant pieces of legislation regulating tech companies in years.
KOSA requires social media companies like Meta to offer controls to disable algorithmic feeds and other “addictive” features for children under the age of 16. It also requires companies to provide parental supervision features and to safeguard minors from content that promotes eating disorders, self-harm, sexual exploitation and other harms.
One of the most controversial provisions in the bill creates what’s known as a “duty of care.” This means platforms are required to prevent or mitigate certain harmful effects of their products, like “addictive” features or algorithms that promote dangerous content. The Federal Trade Commission would be in charge of enforcing the standard.
The bill was originally introduced in 2022 but stalled amid pushback from digital rights and other advocacy groups, who argued the legislation could push platforms to censor the content available to teens. A revised version, meant to address some of those concerns, was introduced last year, though the ACLU, EFF and other free speech groups still oppose the bill. In a statement last week, the ACLU said that KOSA would encourage social media companies “to censor protected speech” and “incentivize the removal of anonymous browsing on wide swaths of the internet.”
COPPA 2.0, on the other hand, has been less controversial among privacy advocates. An update to the 1998 Children’s Online Privacy Protection Act, it aims to revise the decades-old law to better reflect the modern internet and social media landscape. If passed, the law would prohibit companies from targeting advertising to children and from collecting personal data on teens between 13 and 16 without consent. It also requires companies to offer an “eraser button” that deletes children and teens’ personal information from a platform when “technologically feasible.”
The vote underscores how online safety has become a rare source of bipartisan agreement in the Senate, which has hosted numerous hearings on teen safety issues in recent years. The CEOs of Meta, Snap, Discord, X and TikTok testified at one such hearing earlier this year, during which South Carolina Senator Lindsey Graham accused the executives of having “blood on their hands” for numerous safety lapses.