Europe's Digital Services Act: Opportunities and Challenges Ahead
In conversation with Oliver Marsh from AlgorithmWatch


In 2022, the European Union adopted the Digital Services Act (DSA), intending to make the use of online services more transparent and secure.

Starting 17 February 2024, the member states are obliged to ensure its implementation, among other things by appointing Digital Services Coordinators, who are supposed to take office by that date. Oliver Marsh, who leads the Landecker-funded project "Auditing Algorithms for Systemic Risks," explains in an interview why there are still problems with implementation, what obligations the DSA entails, and the role our partners at AlgorithmWatch play in this.


The DSA was adopted in 2022 and has been legally binding since January 2024. On 17 February, the national coordinators ensuring its enforcement in the individual EU member states are due to take up their powers. What does this mean, what should we expect, and why is this important for your work at AlgorithmWatch?

The Digital Services Act (DSA) is the EU’s new regulation against risks from online platforms and search engines. Although it came into force last year, the establishment of Digital Services Coordinators in each Member State will unlock a range of sections of the DSA which have been dormant until now – many of which are the parts giving powers to citizens, researchers, and civil society. Until now, parts of the DSA have been quite opaque – for instance, platforms have had to conduct risk assessments, but only the EU Commission has been allowed to read them. The Digital Services Coordinators, however, can now request access to platforms’ internal data – including, excitingly, on behalf of external researchers who are studying systemic risks in the EU. They will also set up systems like “out-of-court dispute settlements” and “trusted flaggers,” which will make complaining to platforms a fairer and more effective process.

At AlgorithmWatch, we fight for a world where technology does not weaken justice, democracy, and sustainability, but strengthens them. These new rights for citizens, and new powers for scrutinising platforms, are very welcome to us. However, despite the text of the DSA, it looks like many Digital Services Coordinators will not actually be ready on 17 February. In Germany, for example, we’re hearing that the Bundesnetzagentur will not legally be given its powers as the Digital Services Coordinator until April at the earliest. So, it is highly unclear how, and when, citizens and researchers can actually use these new opportunities. This is disappointing, and also concerning, given that these powers should be playing a vital role in European elections in the next few months.


2024 is a big election year around the world, including in the EU and several of its member states. Much of your work focuses on mitigating the risks that algorithms pose to democracies. What are these risks, and how can your work reduce the dangers and support fair and transparent elections?

There's a range of possible risks, some of which are better understood than others. Some people are worried that floods of fake or exaggerated posts on social media, maybe using AI, will deliberately "trick" people into believing things that affect their vote. I’m a little sceptical of that narrative - it risks patronising voters and ignoring their real concerns - but it's certainly possible, particularly in close-run elections. Another risk is that credible or sensible content will simply be drowned out by unhelpful or inaccurate messages; in our study of AI-driven Bing Chat responses during recent Swiss and German state elections, we saw that around one third of the answers were inaccurate.

A final risk, which I think is already apparent, is that elections bring a lot of polarising rhetoric - in mainstream media, in real-world conversations, and also online. Social media algorithms that prioritise engagement can boost this polarising content, which can encourage such behaviour in politics more generally.

A big problem is that, when any of these issues occur, we have been too reliant on platforms to "do the right thing" to deal with them. But they may not want to, they may not know how, or they may be badly prepared. The DSA forces platforms to conduct risk assessments and - in theory - gives outside parties much more power to monitor and support best practices.


What are you hoping your work at AlgorithmWatch can achieve in the coming months?

We have two main goals. Firstly, to use the DSA provisions to support our research work, and, when we struggle to use them, to push for changes. We'll be making a data access request to Microsoft to extend our Bing research. With more "inside information" we can better understand the scale of the problem and the effectiveness of mitigations - hopefully before the next German state elections. We're also building a "risk repository" of potential examples of systemic risks that people have observed since the DSA came into effect, to support the EU Commission in analysing and clarifying the concept of "systemic risk".

Secondly, to spread the word and help people use the DSA opportunities. I have a research background myself: I know how valuable these new data access provisions could be, but also how hard it is for researchers to keep up with policy changes. We want to help researchers, and others, understand how they could benefit from data access provisions or other parts of the DSA. If that’s you, please get in touch!


What do you wish people would better understand about the new DSA rules and algorithmic transparency?

That it's designed to ensure platforms conduct their business responsibly. It doesn't tell platforms and search engines what content they can and can't host (unless it's illegal). It just means it’s harder for them to sidestep their responsibilities to users and society. If you're a citizen who makes complaints to platforms, there are now more options to make sure they can't ignore you. If you're a researcher hitting walls when trying to access data, there are options to support you. And, like any policy, there’ll be a learning period. So, people exercising their rights now could help demonstrate the real needs and establish good processes for the future.


More about our cooperation with AlgorithmWatch here.
