The EU’s Digital Services Act is one step closer to becoming law. How will it impact U.S. startups?

Engine
Jul 28, 2022

by Lauren Koop, Policy Fellow, Engine Foundation

Earlier this month, European officials advanced a watershed piece of legislation that will govern online intermediaries — including startups — operating in the EU. On July 5th, the European Parliament overwhelmingly voted to adopt the Digital Services Act (DSA), which will create new obligations for companies serving EU users, restrict some current practices, and increase the resources needed to serve the European market. After a few remaining legislative steps (likely to occur this fall), the DSA will enter into force, after which most companies will have around 15 months to come into compliance.

Background

What is the Digital Services Act?

The DSA builds upon the EU’s e-Commerce Directive (2000), the original legal framework governing digital services in the EU. The DSA covers an array of digital services, especially those that host or encounter user content — for example, podcasting sites, e-commerce services, websites with comment sections, and Internet infrastructure providers. The legislation aims to protect consumers and provide a more streamlined legal framework for companies by harmonizing national-level laws, though several provisions of the DSA are likely to increase barriers for businesses aiming to serve European users.

How did we get here?

Since the turn of the century, Internet usage has grown by 1,355%, facilitating the growth of new industries, creating new markets, increasing the flow of new ideas, and spurring innovation. American startups — including those serving EU users — have helped us to get goods faster, better enjoy our vacations, lead healthier lives, and promote accessibility. And every day we use services that were once startups to live, eat, work, play, and even sleep.

At the same time, bad actors who make up a small fraction of the Internet have reflected and helped to enable society’s worst qualities. In recent years, in response to challenges posed by mis- and disinformation, hate speech, terrorist content, child sexual abuse material, and more, EU policymakers developed the DSA, joining colleagues around the globe whose efforts to address those issues include the UK’s proposed Online Safety Bill, India’s Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, and SESTA-FOSTA in the U.S.

Who does the DSA apply to?

The DSA applies to intermediaries operating in the EU and includes additional obligations for very large online platforms (VLOPs) with over 45 million users, while exempting small enterprises with fewer than 50 employees from some obligations. Because U.S. startups that serve EU users tend to be more mature, they are likely to surpass the thresholds for the DSA’s small enterprise exemptions and may be responsible for complying with the majority of the new rules. One rough estimate places the number of entities to be regulated under the DSA in the hundreds of thousands.

Overview: What’s in the DSA & what does it mean for startups?

What obligations does the DSA create for content moderation?

A large portion of the DSA is dedicated to guidelines for new EU standards around moderating content. It builds upon the e-Commerce Directive’s existing knowledge-based framework, in which companies are responsible for taking action on illegal content they know about. To counteract the presence of illegal content online, the DSA adds avenues through which companies will be alerted to potentially problematic content.

For example, intermediaries must create mechanisms for users to notify companies of such content. They must cooperate with “trusted flaggers” in various member states to address allegedly illegal content. And intermediaries must be responsive to government requests to remove content. These processes will require companies to designate a “point of contact” to coordinate communication and appoint a “legal representative” in the Union, tasked with ensuring their compliance with EU law.

The DSA additionally creates new transparency and information disclosure requirements. When the DSA goes into effect, businesses must include a detailed breakdown of their content moderation practices in their terms and conditions, notify users of any changes to the terms, and publish an annual transparency report on their moderation practices.

And if a user’s content is removed or restricted, they must be informed about the removal and allowed to contest illegal content claims via an appeal mechanism — a process many companies will have to create in the wake of the DSA. If the user does not agree with the results of the appeal, they can move to an external out-of-court dispute resolution mechanism, which companies almost always have to pay for — except in cases where the user is found to have “acted manifestly in bad faith.”

What will it mean for startups?

To build out these new mechanisms and processes, companies will need to devote more resources to content moderation. Startups already spend disproportionately more per user on content moderation than larger companies, meaning the DSA may have some competitive effects, especially among the large group of intermediaries that are neither small entities (with fewer obligations) nor VLOPs (with additional obligations). Likewise, those increased costs affect access to the European market: the heightened barriers to serving EU users will likely deter or delay U.S. startups from entering what would otherwise be an attractive market for global expansion.

These obligations may have implications for expression, as well. Because of how the DSA assigns liability for content — companies lose immunity if they fail to expeditiously remove allegedly illegal content they are made aware of — companies are likely to lean toward over-removing content. This issue creates a unique tension for startups, which are both more sensitive to losing users whose content is improperly taken down and more vulnerable to the consequences of failing to remove content in the face of a complaint. For startups, customers are easier to lose, and new users are more costly to obtain. Policies that effectively force them to remove non-problematic content could therefore drive away users and hurt startups’ bottom lines.

Does the DSA prohibit general monitoring obligations?

Consistent with longstanding EU liability law, the DSA upholds a prohibition on general monitoring obligations, meaning companies are not required to actively monitor their services for potentially illegal activity by users. The intermediary liability framework established by the DSA also clarifies that a company’s independent efforts to detect and remove illegal content will not render it ineligible for liability exemptions.

Monitoring obligations have been a point of contention in intermediary law for the past several years — not only in the EU (with Article 17 of the Copyright Directive and the resulting challenges) but also in the U.S. (especially around alleged copyright infringement) and elsewhere. The DSA was no different: the issue was debated during negotiations between the EU legislative institutions over the DSA text. One version of the text initially suggested a carve-out to the general monitoring prohibition. That version was not agreed to, but the carve-out text — which would have enabled EU or national authorities to require that platforms perform specific monitoring for content that had been taken down or was “manifestly illegal” — later resurfaced and caused controversy for Members of the European Parliament and the technology ecosystem alike.

What will it mean for startups?

In the end, the controversial language was removed, but it is important to highlight because it would have opened the door to a stay-down requirement — a requirement that companies ensure any previously-removed content is never reposted. Stay-down requirements are problematic for many reasons. First, they necessitate the implementation of filtering technologies, because the only way to ensure a post does not contain previously removed content is to examine every user post, which is prohibitively expensive for startups. Current filtering technologies are also imperfect and limited in their ability to accurately identify content. Content moderation requires context and nuance, but filtering technologies can lead to the removal of misidentified content, or of content that was problematic in one post but legitimate and legal in another. Such improper removals have ramifications for user expression. Had general monitoring obligations been included in the DSA, they would have created a huge financial burden for startups and smaller intermediaries while also raising concerns for user expression online.
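To make the filtering problem concrete, here is a minimal, hypothetical sketch of the simplest possible stay-down filter: one based on exact content hashes. Nothing here is prescribed by the DSA; the function names and the hashing approach are assumptions for illustration. Even this toy version shows the core weakness: trivially modified content slips past exact matching, which is why real systems turn to fuzzy or perceptual matching, and those, in turn, can misidentify legitimate content.

```python
import hashlib

# Hypothetical illustration, not anything specified by the DSA: a naive
# "stay-down" filter that blocks re-uploads by comparing exact hashes of
# new posts against previously removed content.
removed_hashes: set[str] = set()

def record_removal(content: bytes) -> None:
    """Remember a removed post's hash so identical re-uploads can be blocked."""
    removed_hashes.add(hashlib.sha256(content).hexdigest())

def allow_post(content: bytes) -> bool:
    """Allow a post unless it exactly matches previously removed content."""
    return hashlib.sha256(content).hexdigest() not in removed_hashes

record_removal(b"previously removed post")
print(allow_post(b"previously removed post"))   # False: exact re-upload is caught
print(allow_post(b"previously removed post!"))  # True: one added character evades the filter
```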

What is the DSA’s impact on personalized ads?

The DSA bans serving personalized advertisements to minors and to individuals based upon certain sensitive data, which will broadly impact an avenue startups rely on to reach new customers. Policymakers had considered a full ban on targeted advertising, which would have entirely remade the digital economy in Europe and was extremely controversial. The extent of the ban on targeted ads was debated to the last minute during negotiations this spring and will likely reemerge in future EU legislation.

Under the DSA, companies are restricted from targeting advertising on the basis of sensitive data — a category from the General Data Protection Regulation (GDPR) that includes, e.g., racial or ethnic origin, political opinions, religious or philosophical beliefs, health, and sexual orientation. Because of its status in the GDPR, many industry players in the EU already avoid targeting ads using these data categories.

What will it mean for startups?

The ban on targeted ads to minors is more likely to have a significant impact across the Internet ecosystem — including for startups — in part because of the knowledge standard in the law. Companies are prohibited from serving personalized ads to users when they are “aware with reasonable certainty that the recipient of the service is a minor.” The DSA says companies will not have to collect additional data to assess a user’s age, and the EU Commission may provide additional guidance. Still, it is not hard to imagine a company’s level of awareness being at issue in an enforcement action or litigation when an ad is inevitably and inadvertently targeted to a user who is a minor. There are practical difficulties in ascertaining a user’s age, and technologies to help with the task are likely to be imperfect.
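To make that ambiguity concrete, here is a minimal, hypothetical sketch of how a platform might gate personalized ads on an inferred-age signal. The function, the age inference, and the 0.9 certainty threshold are all assumptions for illustration; the DSA specifies none of them, which is exactly the problem.

```python
# Hypothetical sketch: gating personalized ads on an inferred-age signal.
# The DSA does not define "reasonable certainty"; the 0.9 threshold below
# is an arbitrary assumption, which is precisely the compliance problem.
REASONABLE_CERTAINTY = 0.9  # assumed cutoff, not specified by the law

def may_serve_personalized_ad(inferred_age: int, confidence: float) -> bool:
    """Serve a personalized ad only if the platform is not 'aware with
    reasonable certainty' that the user is a minor."""
    aware_user_is_minor = inferred_age < 18 and confidence >= REASONABLE_CERTAINTY
    return not aware_user_is_minor

print(may_serve_personalized_ad(16, 0.95))  # False: confidently a minor, no personalized ad
print(may_serve_personalized_ad(16, 0.50))  # True: below the assumed certainty bar
```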

As a result of the partial ban on targeted ads, companies — especially startups — will find it tougher to utilize what is now an effective, low-cost marketing tool. This will be a particularly hard hit for startups that rely on targeted advertising as an affordable way to gain new customers. Targeted ads are often used to experiment and test ideas, enter markets (like breaking into the EU), and reach a wide array of consumers with a limited marketing budget, ultimately acting as a tool that helps startups compete and grow. By burdening a key method of growth for startups, the DSA undermines one of its stated goals: diversifying the digital services landscape.

The new legal framework also calls on companies to ensure online advertising transparency by requiring businesses to clearly identify what content is an ad, who is sponsoring that ad, and the parameters used to determine that a given user would receive a given ad. While businesses already follow some of these practices, the provisions amount to another burden upon critical entrepreneurial infrastructure.
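As a rough illustration only, the per-ad disclosures might be modeled as a simple record like the one below; the field names are assumptions for illustration, not terms from the regulation.

```python
from dataclasses import dataclass

# Hypothetical sketch of the per-ad disclosures the DSA calls for;
# field names are illustrative assumptions, not terms from the regulation.
@dataclass
class AdDisclosure:
    is_advertisement: bool           # the content must be identifiable as an ad
    sponsor: str                     # who is sponsoring the ad
    targeting_parameters: list[str]  # parameters used to select the recipient

disclosure = AdDisclosure(
    is_advertisement=True,
    sponsor="Example Co.",
    targeting_parameters=["location: Germany", "interest: cycling"],
)
print(disclosure)
```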

What’s Next?

Following the July 5th parliamentary adoption of the DSA, it is widely assumed that the legislation will be adopted by the Council of the European Union in the early fall and will enter into force shortly after. VLOPs will be held to an expedited compliance timeline, while other companies will be required to adhere to the DSA by January 1st, 2024, or 15 months after entry into force (whichever comes later).

Moving forward, the legislation will set the standard for a new age in European digital policy, focusing on consumer protections at the potential risk of restricting legitimate user speech, burdening startup innovation, and hampering market diversity. Meanwhile, startups operating in the EU will have to spend what limited time, money, and resources they have to adjust and adapt to the changing regulatory landscape.

Disclaimer: This post provides general information related to the law. It does not, and is not intended to, provide legal advice and does not create an attorney-client relationship. If you need legal advice, please contact an attorney directly.

Engine is a non-profit technology policy, research, and advocacy organization that bridges the gap between policymakers and startups. Engine works with government and a community of thousands of high-technology, growth-oriented startups across the nation to support the development of technology entrepreneurship through economic research, policy analysis, and advocacy on local and national issues.
