With 95% of teens reporting use of a social media platform, legislators have taken steps to curb excessive use. With increasing digitalization and reliance on existing and emerging technologies, youth are quick to adopt new technologies and trends that can lead to excessive use, especially given the proliferation of mobile phones. The increasing use of social media and other related platforms, most recently generative AI, among minors has even caught the attention of the U.S. surgeon general, who released a health advisory about its effects on their mental health. Alongside increasing calls to action, big tech companies have introduced a multitude of parental supervision tools. While considered proactive, many of the same opponents of social media use have raised doubts about the efficacy of these tools, and a range of federal and state policies have emerged, which may or may not align with the goals of protecting minors when using social media.

Currently, regulation of social media companies is being pursued by individual states and the federal government, leading to an uneven patchwork of directives. For example, some states have already enacted new legislation to curb social media use among minors, including Arkansas, Utah, Texas, California, and Louisiana. However, individual state efforts have not gone unchallenged. Big tech companies, such as Amazon, Google, Meta, Yahoo, and TikTok, have fought recent state legislation, and NetChoice, a lobbying organization that represents large tech firms, recently launched a lawsuit contesting Arkansas's new law, having also challenged California's legislation last year.

At the federal level, the Senate Commerce Committee voted out two bipartisan bills to protect children's internet use in late July 2023, both by a unanimous vote: the Kids Online Safety Act (KOSA) and an updated Children's Online Privacy Protection Act (COPPA 2.0). KOSA is intended to create new guidance for the Federal Trade Commission (FTC) and state attorneys general to penalize companies that expose children to harmful content on their platforms, including content that glamorizes eating disorders, suicide, and substance abuse, among other such behaviors. The other bill, COPPA 2.0, proposes to raise the age covered by the existing law from 13 to 16 years old and to ban companies from advertising to kids. NetChoice quickly responded to the movement of these bills, suggesting that companies, rather than bad actors, were being scrutinized as the primary violators. Still others, including civil liberties groups, have also opposed the Senate's legislative proposals, pointing to the growing use of parental tools to increase surveillance of children, content censorship, and the potential for age-verification methods to collect more, not less, information.

This succession of domestic activity emerges around the same time that the European Union (EU) and China are proposing regulation and standards that govern minors' use of social media. This month, for example, China's Cyberspace Administration published draft guidelines that restrict minors' use of social media from 10 p.m. to 6 a.m. and limit its use to two hours per day for youth ages 16 to 18 and one hour for those 8 to 15, with youth under 8 years old restricted to 40 minutes per day. China has become even more stringent in its policing of social media platforms, with recent suggestions of pressuring behavioral change among companies through fines and jail time for breaking the law.

There are also differences in how these measures define "minor": most proposals refer to individuals under the age of 18, but KOSA codifies minors as those under 17 years of age. How states specify guidelines, and which types of companies are subjected to such scrutiny, also vary among the state laws.