After reporting on the ruling of the Hamburg Regional Court in our last AI Flash, "AI and copyright - who wins the battle for image rights", we would like to continue providing you with legal updates at regular intervals.
Today's topic: Effects of the DSA on providers of AI systems
The Digital Services Act (DSA) has applied in part to Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) such as Google Search, Facebook and Amazon since November 16, 2022; since February 17, 2024, it has been fully applicable to all digital services within the meaning of the DSA. The DSA aims to give users more control over their online experience, to protect minors and to increase the transparency and accountability of online platforms. Although the term “artificial intelligence” is not explicitly mentioned in the text of the regulation, AI nevertheless ranks high on the agenda when it comes to enforcing the DSA.
In March 2024, the European Commission sent a total of 20 formal requests for information to VLOPs and VLOSEs, including Facebook, Instagram, Snapchat, TikTok, YouTube, X, Bing and Google Search, concerning their obligations under Articles 34 and 35 of the DSA (risk assessment and risk mitigation). The aim of these requests is a risk analysis with a particular focus on potential gateways for influencing upcoming political elections. The companies concerned must, among other things, provide detailed information on the measures they take to mitigate risks associated with generative AI, including hallucinations, the viral spread of deepfakes and automated manipulation that could mislead voters. The Commission is also requesting information on the impact of generative AI on electoral processes, the dissemination of illegal content, the protection of fundamental rights, gender-based violence, the protection of minors, mental well-being, the protection of personal data, consumer protection and intellectual property.
So far, the Commission's requests for information have only concerned VLOPs and VLOSEs. All other service providers are subject to far less extensive rules, essentially basic due diligence obligations such as providing transparent terms of use and implementing mechanisms for reporting illegal content. They are supervised not by the Commission but by the national supervisory authorities, the Digital Services Coordinators (DSCs); in Germany, this is the Federal Network Agency (Bundesnetzagentur). It is therefore quite possible that comparable information orders will also be issued by the Federal Network Agency in the coming months.
In short: in addition to the much-discussed AI Act and the GDPR, the DSA should not be lost sight of in connection with AI applications.