Securing Children Online Through Parental Empowerment (“SCOPE”) Act: Effective September 1, 2024, the SCOPE Act requires digital service providers, such as companies that own websites, apps, and software, to protect minor children (under 18) from harmful content and data collection practices. The law primarily applies to digital services that provide an online platform for social interaction between users and that: (1) allow users to create a public or semi-public profile to use the service, and (2) allow users to create or post content that can be viewed by other users of the service. This includes digital services such as message boards, chat rooms, video channels, or a main feed that presents users with content created and posted by other users.
Read the full text of the Securing Children Online Through Parental Empowerment Act.
Overview of the SCOPE Act
This overview is for informational purposes only and is not legal advice. Please consult your attorney if you have specific legal questions. Texas law prohibits the Office of the Attorney General from providing legal advice, opinions, or representation to private individuals.
Digital Service Providers Operating In Texas Must Comply With The Requirements Of The Securing Children Online Through Parental Empowerment (“SCOPE”) Act Including:
Duty to Register Age of User
- A digital service provider must register the age of the person creating an account for a digital service and prevent the person from later altering their age. A person is a minor if their registered age is under 18, or if the minor’s parent or guardian notifies the digital service provider of the minor’s age or successfully disputes the registered age.
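The age-registration duty can be sketched as a simple account flow. This is an illustrative sketch only; the class and method names below are assumptions, not terms from the Act:

```python
class AccountRegistry:
    """Sketch of the SCOPE Act age-registration duty:
    record age at signup, block later self-alteration,
    and allow a verified parent or guardian to correct it."""

    ADULT_AGE = 18

    def __init__(self):
        self._ages = {}  # user_id -> registered age

    def register(self, user_id, age):
        # The user may not alter their registered age after signup.
        if user_id in self._ages:
            raise PermissionError("registered age cannot be altered by the user")
        self._ages[user_id] = age

    def parent_dispute(self, user_id, corrected_age):
        # A verified parent or guardian may notify the provider of the
        # minor's age or successfully dispute the registered age.
        self._ages[user_id] = corrected_age

    def is_minor(self, user_id):
        return self._ages[user_id] < self.ADULT_AGE
```

In this sketch, only the parent-dispute path can change a registered age; the ordinary `register` path refuses any second write for the same account.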
Duties Relating to Agreements with Minors
- A digital service provider must limit the collection and use of a minor’s personally identifiable information and cannot share or sell that information. Minors are prohibited from making purchases or conducting other financial transactions through the digital service. A digital service provider may not collect the minor’s geolocation data or display targeted advertising to the minor.
Duty to Prevent Harm
- A digital service provider must also develop and implement a strategy to prevent a minor’s exposure to harmful material and other content that promotes, glorifies, or facilitates suicide, self-harm, eating disorders, substance abuse, stalking, bullying, harassment, grooming, trafficking, child pornography, or other sexual exploitation or abuse. The provider must apply filtering technology to monitor for and regularly block harmful material, and must proactively maintain a database of this harmful language and material to keep pace with trends, such as purposeful misspellings that attempt to evade the filtering technology.
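One common way to handle purposeful misspellings is to normalize text before matching it against a maintained blocklist. This is a minimal sketch under stated assumptions: the blocklist and character map below are illustrative examples, not the Act’s prescribed method or any provider’s actual database:

```python
import re

# Illustrative blocklist; a real provider would maintain a
# regularly updated database of harmful terms and variants.
BLOCKED_TERMS = {"selfharm", "proana"}

# Map common character substitutions (e.g. "3" for "e") back to letters.
LEET_MAP = str.maketrans("01345$@", "oieassa")

def normalize(text: str) -> str:
    text = text.lower().translate(LEET_MAP)
    # Drop separators used to split a blocked word, e.g. "s.e.l.f h.a.r.m".
    return re.sub(r"[^a-z]", "", text)

def contains_blocked(text: str) -> bool:
    flat = normalize(text)
    return any(term in flat for term in BLOCKED_TERMS)
```

Normalizing first means a single blocklist entry catches many evasive spellings, e.g. `contains_blocked("s3lf-h4rm tips")` matches the entry `"selfharm"`.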
Use of Algorithms
- A digital service provider must clearly disclose (1) how it uses algorithms to provide information and content to minors, (2) how the algorithms promote, rank, or filter information or content, and (3) what personal identifying information the algorithms use. This information must be disclosed in the provider’s terms of service, privacy policy, or similar document tied to the user agreement.
Duty to Create Parental Tools
- A digital service provider will be required to create and give access to parental tools that allow a parent or guardian to supervise the minor’s use of the service. The parental tools must allow a parent or guardian to control the minor’s privacy and account settings, restrict the minor’s ability to make purchases or engage in financial transactions, and monitor and limit the amount of time the minor spends using the service. The digital service provider must verify the identity of a parent or guardian who seeks to use these tools on behalf of the minor. The verified parent or guardian may review, download, or delete the minor’s personal identifying information collected or processed by the digital service provider.
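The required parental tools can be pictured as a small settings object attached to a minor’s account. Again a hedged sketch: the field names and defaults are assumptions for illustration, not requirements of the Act:

```python
class ParentalControls:
    """Sketch of SCOPE Act parental tools: restrict purchases
    and monitor/limit time spent on the service."""

    def __init__(self, daily_limit_minutes=60, purchases_allowed=False):
        # Settings controlled by the verified parent or guardian.
        self.daily_limit_minutes = daily_limit_minutes
        self.purchases_allowed = purchases_allowed
        self._minutes_today = 0

    def record_usage(self, minutes):
        # Monitoring: accumulate the minor's time on the service.
        self._minutes_today += minutes

    def may_use(self):
        # Limiting: deny further use once the daily limit is reached.
        return self._minutes_today < self.daily_limit_minutes

    def attempt_purchase(self):
        if not self.purchases_allowed:
            raise PermissionError("purchases disabled by parental controls")
```

In practice these settings would only be writable through a parent/guardian account that the provider has verified, as the duty above requires.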
Duties Regarding Advertising and Marketing
- A digital service provider must work to prevent advertisers on the platform from targeting minors with ads that promote or offer any product, service, or activity that is unlawful for a minor in Texas.
Duty as to Harmful Material
- A digital service provider that knowingly publishes or distributes content that is harmful or obscene must use a commercially reasonable age verification method to verify the age of the user seeking access to the content to ensure the user is 18 years of age or older.
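The 18-or-older gate itself is straightforward once a date of birth has been verified. This sketch assumes the date of birth arrives from a separate, commercially reasonable verification step (for example, a government-ID check by a vendor); the function names are illustrative:

```python
from datetime import date
from typing import Optional

def is_adult(dob: date, today: Optional[date] = None) -> bool:
    """Return True if a person born on `dob` is 18 or older on `today`."""
    today = today or date.today()
    # Subtract one year if this year's birthday has not yet occurred.
    years = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return years >= 18

def serve_restricted(verified_dob: Optional[date], content: str) -> str:
    # `verified_dob` is assumed to come from a commercially reasonable
    # age-verification method; this gate only enforces the 18+ rule.
    if verified_dob is None or not is_adult(verified_dob):
        raise PermissionError("access denied: user must be verified as 18 or older")
    return content
```

Note the birthday comparison handles the boundary case: a user turns 18 on their birthday, not at the start of that calendar year.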
Key Definitions
- “Personal identifying information” means any information, including sensitive information, that is linked or reasonably linkable to an identified or identifiable individual. The term includes pseudonymous information when that information is used by a controller or processor in conjunction with additional information that reasonably links the information to an identified or identifiable individual. The term does not include deidentified information or publicly available information.
Exemptions
The SCOPE Act contains numerous exemptions including the following:
- State agencies;
- Small businesses as defined by the Small Business Administration (SBA);
- Financial institutions or data subject to Title V of the Gramm-Leach-Bliley Act; covered entities or business associates governed by federal laws such as HIPAA and the HITECH Act; and institutions of higher education;
- Digital service providers who process user data for express purposes of employment or education services;
- A digital service provider’s facilitation of e-mail or direct messaging services as long as the digital service only provides those services; or
- A digital service provider’s facilitation of access to news, sports, commerce, or content primarily generated or selected by the digital service provider, where any chat, comment, or other interactive functionality is incidental to the digital service.
- Internet service providers, search engines, and cloud service providers are exempt unless they are responsible for the creation of harmful material or other content described by Section 509.053(a) of the Act. For example, when an internet service provider, search engine, or cloud service provider merely supplies internet access, allows downloads, or provides access to software or another service for a website, it is generally not liable because it typically has no control over the harmful content in question.
Enforcement
A violation of the SCOPE Act is a deceptive trade practice enforceable only by the Consumer Protection Division of the Office of the Attorney General of Texas, which may seek injunctive relief, civil penalties of up to $10,000 per violation, and attorneys’ fees. The Act does not confer a private right of action, but it allows parents and guardians of known minors to file suit to obtain a declaratory judgment against a digital service provider. A court may not certify a case brought under the Act as a class action.
EFFECTIVE DATE: September 1, 2024.