KOSPA bill: Child protection or censorship?
The Kids Online Safety and Privacy Act (KOSPA), passed by the US Senate on 30 July, establishes strict regulations for the design of online platforms with the aim of protecting minors from exposure to certain content. Critics warning that the law could severely undermine freedom of speech were quick to emerge.
The precedents: KOSA and COPPA
KOSPA is a reworking of two previous laws focused on protecting children and teenagers from addictions and certain content deemed dangerous in the digital environment: the well-known KOSA and COPPA 2.0. With a large majority in the Senate, the bill was approved in July and is expected to be revisited in September to decide on the details of its implementation before the November elections.
KOSA (Kids Online Safety Act) imposes obligations on digital platforms, such as limiting addictive features, allowing young people to reject algorithmic recommendations, and restricting access to their personal data.
COPPA 2.0 (Children and Teens’ Online Privacy Protection Act) updates a previous law in effect since 1998, with the aim of strengthening the protection of personal information for children and teenagers. It extends the protection age to 17, updates the definition of personal information to include biometric data, closes a legal loophole that allowed some companies to track minors, and grants teenagers more autonomy regarding consent for the use of their data. COPPA has long been criticised for allegedly achieving the opposite of its intention, by limiting minors’ freedom of expression.
The foundations of KOSPA
Building on these bases, KOSPA establishes a legal framework to protect young people from risks it deems inherent to digital platforms. The law imposes a “duty of care” on major online platforms, such as social media, video games, streaming services, and messaging apps, requiring them to take specific measures to prevent and mitigate potential harm to minors under the threat of substantial fines.
The platforms targeted by the law will be those aimed at minors or those with “actual knowledge” that the user is a child. This aspect underscores the need for age verification solutions to help comply with the regulations.
Among these obligations are implementing the highest levels of privacy by default; letting users opt out of personalised algorithmic recommendations (that is, choose whether a digital platform may show them content based on their interests, search history, or personal data); and developing robust parental control tools, such as limiting screen time and monitoring access to “inappropriate” content. However, a notable change in KOSPA compared with KOSA is that it allows minors to alter these settings without parental consent.
Other measures in the new privacy and safety act
The text reflects certain guidelines that address the addictive nature of some sites, such as social media. Platforms must identify and limit features that may encourage addictive behaviours, such as constant notifications or reward mechanisms that keep young users engaged for extended periods.
KOSPA also prohibits targeted advertising of age-restricted products to minors, including tobacco, alcohol, gambling, and other products unsuitable for young people.
Another significant aspect of the legislation is the requirement to incorporate reporting mechanisms, through which minors, parents, or schools can report “harm” to minors.
The scope of KOSPA is broad, covering a range of digital services popular among young people. However, not all online services are subject to the legislation. For example, email providers and educational institutions are excluded from KOSPA’s scope. The law focuses on platforms with a significant user base that offer services specifically designed to appeal to a younger audience.
A real threat to civil liberties
This and other laws, such as Spain’s Organic Law for the Protection of Minors in Digital Environments, which tightens existing 2022 regulations on age verification for accessing adult content, open up a debate about who controls information, who decides what is and is not harmful, and how impacts on minors are managed. KOSPA is the latest example of how a tool for manipulating mainstream discourse operates. We should examine the true dangers of these political decisions.
- Censorship: Citizen groups opposed to KOSPA argue that the law will effectively censor a massive amount of content on controversial topics. Authorities might treat content warning about the dangers of addiction in the same way as content promoting addiction. KOSPA’s rules could strip individuals of decision-making power, treating all internet users as if they needed shielding, because the definition of what is harmful or dangerous is broad enough to encompass a wide range of content.
- Political persecution: Entrusting prosecutors and officials with the power to decide what content should be shown or removed creates a dangerous precedent by allowing blocks against content disliked by certain groups, thereby impacting freedom of expression and encouraging polarisation.
- Chilling effect: The threat of being sued or investigated under KOSPA might lead internet platforms to adopt excessively restrictive policies to avoid potential penalties. This could result in a chilling effect where content creators and users self-censor out of fear of legal repercussions, reducing the diversity of opinions and creativity online.
- Invasive filters: To comply with KOSPA, platforms may be compelled to implement age verification systems and invasive content filters. These mechanisms may not only fail to protect minors but also threaten the privacy of all users. If not handled with the utmost care, collecting personal data for age verification and filtering content introduce new security risks and opportunities for the misuse of information.
- Compliance costs: Large corporations can absorb the costs of complying with these laws more easily. Consequently, smaller businesses might be discouraged from pursuing their projects and taking risks. The outcome could be less innovation and greater control of discourse by the same large power groups as always.
While KOSPA’s intentions may seem benign, it is crucial to question how this “protection” of minors is interpreted and what the real motivations are behind steering parental behaviour regarding their children. It is essential for legislators and society at large to carefully consider the implications of these regulations, ensuring that rights and freedoms are not sacrificed in the name of a misguided notion of protection.