Access Partnership’s Héloïse Martorell and Mike Laughton discuss the Digital Services Act, which could be viewed as the EU’s next attempt to set a global standard for digital companies.
Emboldened by the GDPR, the EU is increasingly confident in its ability to set global standards for technology regulation. With the new European Commission in place, 2020 will be a year defined by the Digital Services Act (DSA). In Brussels and for much of the industry, the question is not whether to hold digital companies accountable for the actions of their users, but rather which parts of the digital sector to hold liable and to what extent. The process will take time, but the DSA will be the defining digital regulation of the decade, as it tackles subjects such as the rights of consumers, censorship, the free market and the responsibility of online platforms.
The danger for industry and users alike is that efforts to tackle the serious challenges associated with the propagation of harmful content online – through platforms owned by Facebook, Google and Twitter – will undermine the competitive landscape and catch businesses further removed from end-users within the scope of the rules.
The Digital Services Act is timely, as European countries have already pushed their own initiatives to regulate online platforms. Germany blazed the trail of content regulation with the Network Enforcement Act (NetzDG) in 2017 and is looking to expand provisions. France’s Senate is currently debating its own hateful content regulation (Loi Avia). The U.K., now decidedly separate from the EU as of January 31, continues to provide inspiration for its erstwhile partners with the Online Harms White Paper, Age Appropriate Design Code and even through the now-abandoned age-gating proposals of the Digital Economy Act 2017, currently being explored as a model in Paris.
A pan-EU initiative is welcome, at least in the sense that it will attempt to harmonize 27 regimes. It will regulate social media platforms, search engines, video gaming platforms and online marketplaces. First and foremost, it will upgrade liability and safety rules for digital platforms, services and products to incentivize companies to remove illegal and harmful content. Such content can range from hate speech to fraudulent products sold by third-party sellers. Current rules on intermediary liability lie within the eCommerce Directive. The 2000 rulebook adopted a “laissez-faire” approach, now outdated for today’s range of services, consumption rates and the known dangers of the proliferation of user-generated content. Under the DSA, platforms will be subject to further obligations in the form of “notice and take down” orders or a “duty of care,” potentially requiring companies to use upload filters. This must be balanced against the need to avoid internet intermediaries becoming, in effect, publishers – maintaining safe harbors as much as possible.
While we do not expect legislation to be complete in 2020, this year is largely when the lines around the initial proposals will be drawn. Businesses need to engage now to ensure that the new Commission understands the plethora of services it is due to regulate. While the work will be led by Internal Market Commissioner Thierry Breton, it will become a joint effort across the College.
With policy issues like consumer protection, disinformation, workers’ rights in the gig economy and competition also on the agenda, businesses will need to widen engagement efforts to the cabinets of Didier Reynders (Justice), Věra Jourová (Rule of Law), Nicolas Schmit (Employment) and Margrethe Vestager (Digital and Competition). Meanwhile, businesses must also be aware of the risk of the Digital Services Act becoming a belated Christmas tree bill, where policymakers in the Council and European Parliament can reopen old arguments concerning copyright or privacy. Of immediate concern to businesses is the expected consultation and communication on the scope of the DSA in the first quarter of 2020, followed by the first legislative proposals in the latter part of the year.
Both policymakers and the public have drawn the line after platforms failed to act responsibly out of moral conscience, rather than legal duty. The sector, having now exhausted all avenues for obtaining self- or co-regulatory regimes, should prepare for sweeping regulatory oversight. It would also be a mistake to assume this is merely a problem for the hyperscale online platforms. The proposals drawn up in the Digital Services Act will affect not only the entire technology sector, but potentially any service relying on user-generated content to any extent.
The pieces in this series have been extracted from a larger report by Access Partnership on the trajectory of tech policy in 2020. The next installment will discuss a potential framework for future privacy laws in the United States.