EU moves forward with sweeping online services regulations that impose fines of up to 6% of a company’s annual turnover

The Digital Services Act will require online services (including social media platforms, search engines and marketplaces) to implement policies and procedures aimed at increasing transparency and tackling illegal products and content on online platforms.

First proposed in December 2020, the Digital Services Act (“DSA”) was agreed in principle at the end of last month. The European Parliament (one of the EU’s legislative bodies) reached an agreement in principle with the various EU member states to advance the process of finalizing the DSA.

According to the EU, the DSA will set new standards of accountability and fairness that online platforms, social media platforms and other internet content providers must meet. The DSA sets out a range of regulatory obligations that will apply to entities on a tiered basis, depending on the size of the entity, its societal role and its impact on people’s lives.

Broadly, the DSA will combat the sale of illegal products and services in online marketplaces and the spread of illegal and harmful content on online platforms such as social media. The DSA also aims to increase the transparency and fairness of online services.

In the same vein as other comprehensive EU legislation, the DSA puts individuals in control through transparency requirements that entities must meet. New judicial or alternative dispute resolution mechanisms should be put in place to allow individuals to challenge content moderation decisions or seek legal redress for alleged damages caused by an online platform. The DSA will also require some transparency into the algorithms that entities use to recommend content or products to individuals (i.e., targeting).

Context and EU legislative process

The DSA is one piece of a larger package of laws and regulations that have slowly made their way through the EU legislative process. Another element of this package, the Digital Markets Act (“DMA”), was already agreed in March 2022. The DMA focuses on the regulation of anti-competitive and monopolistic behavior in the technology and online platform industries. The DMA is at the forefront of a trend, both globally and in the United States, of turning to antitrust law as a way to regulate technology companies and online services.

With the main agreement in place, the DSA will now move through the final steps of the EU’s co-legislative process. This means the DSA must be formally approved by the Council of the EU, which is made up of representatives from each EU member state (France, Germany, etc.). At the same time, the European Parliament will submit the DSA to a formal vote.

Once the European Parliament and the full Council of the EU have approved the DSA, the text will be finalized and will enter into force. As proposed, the DSA will begin to apply 15 months after its entry into force or on January 1, 2024, whichever is later.

However, very large online platforms (as defined below) will have a shortened timeframe and will need to comply with the DSA within 4 months of the effective date of the DSA.

There is still a lack of detail about what might end up in the final version of the DSA, and the final scope and impact of the new law will not be known until the final text is released. However, the EU has provided some general ideas and principles that will guide the final text. This information can help the multitude of entities that need to prepare for the DSA.

Spectrum of applicability

The DSA applies to “digital services,” a category that broadly encompasses online services. These range from online infrastructure, such as search engines; to online platforms, such as social networks and online marketplaces; down to smaller websites.

Further, the DSA will apply regardless of where an entity is established. If an entity provides an online service that operates in the EU, it must comply with the DSA. However, as mentioned above, the applicability of specific requirements will depend on the size of the service and its impact on individuals’ daily lives.

There are four categories of online services under the DSA: (1) intermediary services; (2) hosting services; (3) online platforms; and (4) very large platforms. Each subsequent online service category is also considered a subcategory of the preceding online service type, which means that the requirements imposed on intermediary services are also imposed on hosting services, and so on.

Intermediary services include entities that provide network infrastructure, with some examples including Internet service providers and domain registrars. Hosting services include cloud and website hosting services. Online platforms include online marketplaces, app stores, business platforms, and social media sites. Finally, very large online platforms are online platforms that reach more than 10% of consumers in the EU (around 45 million people).
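For a rough sense of where that threshold comes from, the sketch below works the arithmetic, assuming an EU population of roughly 450 million; the population figure is an approximation introduced here for illustration, not a number taken from the DSA.

\[
% Assumed EU population of approximately 450 million (illustrative only)
0.10 \times 450~\text{million} = 45~\text{million consumers}
\]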

DSA Requirements

The DSA requirements are cumulative and will depend on the size and impact of a given business. Although details and specifics are currently lacking, entities can begin to prepare for the types of policies and procedures that will be required. The requirements that the EU has set for the proposed DSA are broken down below by specific categories of online services.

1. Intermediary Services

If an entity is considered an intermediary service, it must implement policies and procedures related to the following areas: (1) transparency reporting; (2) terms of use/terms and conditions that take into account defined EU fundamental rights; (3) cooperation with national authorities; and (4) specific points of contact and, where required, designated legal representatives.

2. Hosting Services

If an entity qualifies as a hosting service, the entity must comply with the intermediary service requirements above.

In addition, a hosting service must: (1) report criminal offenses (likely related to selling illegal products and services, or posting illegal and harmful content); and (2) increase transparency to its consumer base through notice and choice mechanisms that fairly inform the individual consumer.

3. Online Platforms

If an entity is considered an online platform, it will need to comply with the intermediary service and hosting service requirements above.

In addition, an online platform should implement policies and procedures that address: (1) complaint and redress mechanisms (including both legal remedies and alternative dispute resolution); (2) the use of “trusted flaggers”; (3) abusive notices and counter-notices, as well as deceptive design practices (e.g., dark patterns); (4) verification of third-party vendors in online marketplaces (including through the use of random or spot checks); (5) the prohibition of targeted advertising to children and of targeting based on sensitive characteristics (e.g., race, ethnicity, political affiliation); and (6) transparent information about targeting and recommendation systems.

Specifically, the obligations imposed regarding “trusted flaggers” will require online platforms to allow trusted flaggers to submit notices related to illegal content or products and services on the given online platform. To be considered a “trusted flagger,” an entity must meet certain certification requirements. Certification is granted only to those who: (1) have expertise in detecting, identifying and notifying authorities of illegal content; (2) are able to perform their responsibilities independently of the specific online platform; and (3) can submit notices to relevant authorities in a timely, diligent and objective manner.

4. Very Large Platforms

Very large platforms are the most regulated sub-category of online services under the DSA. Very large platforms must comply with the requirements laid down for intermediary services, hosting services and online platforms.

In addition, very large platforms should: (1) implement risk management and crisis response policies and procedures; (2) conduct internal audits and submit to external audits of their services; (3) implement opt-out mechanisms so that individuals can opt out of targeted advertising or user profiling; (4) share data with public authorities and independent researchers; (5) implement internal and external codes of conduct; and (6) cooperate with authorities in response to a crisis.

The transparency and audit requirements set for very large platforms will require annual risk assessments to identify any significant risks to the platform’s systems and services. The risk assessment must include reviews of the following: (1) illegal content, products and/or services; (2) negative effects on certain EU fundamental rights, in particular with regard to privacy, freedom of expression and information, anti-discrimination and children’s rights; and (3) manipulation of services and systems that could adversely affect public health, children, civic discourse, the electoral system, and national security.

In addition to the risk assessment, independent external auditors should carry out service and system assessments at least once a year. These auditors will need to produce a written report, and very large platforms will need to implement and maintain policies and procedures to address any identified issues.

Penalty for non-compliance

Each EU member state will have the freedom to implement specific rules and procedures relating to how penalties will be imposed under the DSA. In the most recent draft, the DSA called for “effective, proportionate and dissuasive” sanctions, meaning that sanctions could be imposed even in the absence of direct damages, or could exceed any direct damages.

As proposed, any entity that violates the DSA can face a penalty of up to 6% of its annual turnover.
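To illustrate the potential scale of that penalty, the following is a worked example assuming a hypothetical entity with an annual turnover of EUR 10 billion; the turnover figure is invented for illustration and does not come from the DSA or any real company.

\[
% Hypothetical annual turnover of EUR 10 billion (illustrative only)
0.06 \times \text{EUR } 10~\text{billion} = \text{EUR } 600~\text{million maximum penalty}
\]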

Looking forward

Once implemented and effective, the DSA will set the standard for fairness, transparency and accountability requirements that online services must meet. Entities that fall under any of the four categories of digital services within the scope of the DSA will need to begin investing resources in policies and procedures to address the various topics covered by the DSA.

As noted above, the DSA’s baseline compliance deadline is 15 months after its entry into force, or January 1, 2024, whichever is later. This gives a number of entities time to start their compliance efforts.

However, the baseline compliance timeframe is somewhat misleading, as a number of entities will likely fall into the very large platform category. These entities will have only 4 months after the effective date of the DSA to achieve compliance and cannot afford to wait for final approval of the DSA to start their compliance programs.

Veronica J. Snell