New Obligations for Swiss Hosting Services and Online Platforms from February 2024?
30/01/2024
The most important facts in a nutshell:
- The DSA aims to create a trustworthy online environment in the EU and to facilitate the fight against illegal content.
- To this end, the DSA regulates providers who transmit, cache or store data for third parties. Cloud computing providers, web hosting providers, online platforms and search engines are particularly affected.
- The DSA applies to providers who are based or established in the EU or who focus their services on the EU. The latter must appoint a legal representative in the EU. Thus, the DSA is potentially also relevant for Swiss providers.
- The DSA imposes various obligations on providers. These range from obligations to cooperate with authorities and transparency and accountability obligations towards users to the obligation to remove illegal content stored on behalf of third parties as soon as the provider becomes aware of it. However, there is no general obligation to actively monitor content.
- The DSA does not define what constitutes illegal content. This is governed by the national law of the EU member states.
- The DSA grants providers a liability privilege if they take swift action against illegal content and activities as soon as they become aware of them. Otherwise, they expose themselves to liability.
- The sanctions for violations of the DSA are draconian, but their enforceability in Switzerland is questionable.
- Swiss providers without an establishment or registered office in the EU should check whether they are subject to the DSA and, in particular, whether they should appoint a legal representative in the EU.
What is the Digital Services Act and who is obligated?
The Digital Services Act (DSA) is a regulation of the European Union. It is the latest (and most important) in a series of EU legislative projects aimed at protecting consumers in the digital space. The aim of the DSA is to create a “safe, predictable and trustworthy online environment”. A special focus is therefore placed on preventing the spread of illegal online content and disinformation.
Which companies does the DSA apply to?
The DSA regulates providers of so-called intermediary services. Providers of intermediary services are:
- Providers of intermediary services in the strict sense: anyone who carries out the mere transmission of information provided by a user in a communication network, or who provides a caching service in which that information is automatically and temporarily stored at the user’s request in order to make its onward transmission more efficient. These include, for example, WLAN, ISP, VoIP or VPN providers or registrars of top-level domain names. A provider that additionally allows users to enter search queries in order to find and display results from (external) websites is considered an “online search engine”.
- Hosting service providers: anyone who stores information provided by a user at the user’s request. If that information is also disseminated to the public (and this is not merely an ancillary feature of another service), the hosting service provider is considered the provider of an “online platform”. Classic cloud computing and web hosting services are considered pure hosting service providers and not online platforms, because they merely provide the infrastructure or computing capacity for an internet-based application. Online platforms that allow the conclusion of distance contracts between traders and consumers are considered “marketplaces”.
- Additional requirements apply to very large online platforms (“VLOPs”) and very large online search engines (“VLOSEs”). This threshold is reached at an average of 45 million active users per month in the EU. We won’t go into these requirements in more detail in this article.
Certain parts of the DSA already apply. Since 17 February 2023, online platforms and search engines have been required to publish transparency reports and the number of their active users (Art. 24 DSA). The remaining essential parts of the DSA will now apply from 17 February 2024.
Does the DSA also apply to providers of intermediary services in Switzerland?
The DSA applies not only to providers based in the EU, but also to foreign (e.g. Swiss) providers who target the EU with their services or offer them there. For there to be an “offering” within the meaning of the DSA, the intermediary service must either have a significant number of EU users (relative to the population of the member state concerned) or target its activities towards the EU. Indications of such “targeting” can be, for example, the use of an EU currency or language, deliveries to EU countries, the presence of the app in the app store of an EU country, or potentially also the fact that the privacy policy takes the GDPR into account. The mere availability of a website in the EU, however, is not enough.
Swiss providers who do not wish to be subject to the DSA should take care to advertise their products and services only in Swiss francs, not to translate their websites into the languages of EU member states (other than the Swiss national languages), not to advertise in the EU, and generally not to make any references to the EU, whether on the company website or in other (online and offline) presences. According to the DSA, “all relevant circumstances” are decisive in each individual case. Anyone whose business depends significantly on EU customers, or who knows that they have a significant number of users in the EU, will not be able to escape the scope of the DSA.
What general obligations does the DSA impose on intermediary service providers?
The DSA distinguishes between general obligations, which apply to all providers of intermediary services, and additional obligations imposed on providers depending on their size and category.
The following obligations apply to all providers of intermediary services:
- Legal representative (Art. 13 DSA): Companies without an EU establishment must appoint an EU legal representative who is the point of contact for authorities and users. The legal representative may be held liable for violations of the DSA, regardless of the provider’s liability. In this way, companies from EU third countries (e.g. Switzerland) can also be held accountable directly and without mutual legal assistance proceedings. The legal representative must be empowered and equipped to cooperate effectively and promptly with public authorities and to implement their orders. The identity of the legal representative must be communicated publicly.
- In practice, Swiss providers without a registered office or branch in the EU will have to carry out a risk assessment. Since the legal representative can be held liable for breaches of duty under the DSA (Art. 13 para. 3 DSA), appointing such a representative will be significantly more expensive than, for example, appointing a representative under the GDPR. In addition, we assume that sanctions imposed in the EU under the DSA cannot be enforced in Switzerland via mutual legal assistance proceedings, as is already the case under the GDPR, due to the lack of an international treaty. However, as long as no legal representative has been appointed, the authorities of all member states can take action against a provider and sanction misconduct. Once a legal representative has been appointed, the member state in which that representative is established is responsible for sanctioning. A provider that, in a case of doubt, appoints no legal representative increases the risk of a material violation of the DSA. A provider that does appoint one reduces the risk of a DSA violation, but increases the practical enforcement risk, because it becomes easier for the authorities to sanction a Swiss provider.
- Contact points (Art. 11, 12 DSA): Authorities and users must have a central point of contact at the provider for quick communication (a classic chat bot does not meet these requirements). In addition to the information required under the Swiss imprint obligation (cf. Art. 3 para. 1 lit. s UWG), namely the provider’s identity (company name, physical address, e-mail address), the provider must under the DSA also indicate the languages in which communication can take place. These must include at least one official language of the member state in which the provider’s principal place of business or its legal representative is located. This must be taken into account when choosing the legal representative (see above).
- Compliance with official orders (Art. 9 DSA): Intermediary services must inform the authority that issued an order to act against illegal content whether and when they have implemented it. The users concerned must then be informed of the order by the intermediary service, and the measures taken must be justified.
- General Terms and Conditions (Art. 14 DSA): General terms and conditions (GTC) must be formulated in a clear and transparent manner and be easily accessible to users. If a service is also used by minors, the GTC must be written in language that minors can understand. Detailed information on content moderation must be provided, taking into account freedom of expression and freedom of the media.
- Transparency reporting obligation (Art. 15): Providers must publish an annual report on content moderation, including orders received from authorities, moderation carried out on their own initiative and the methods used. The reports must be easy to understand and publicly available.
What are the additional obligations for hosting services?
The following obligations also apply to hosting services:
- Notice and action procedure (Art. 16): Users must be able to report illegal content easily, ideally via an online form rather than merely an e-mail address. As long as the hosting provider has no knowledge of illegal content, it cannot be held accountable for it (liability privilege, Art. 6 DSA). However, the service loses its liability privilege as soon as it becomes aware of a report (and thus of the allegedly illegal content) and does not “take swift action to block access to the illegal content or remove it” (Art. 6 para. 1 lit. b DSA).
- Obligation to state reasons (Art. 17): Restrictions imposed on users (e.g. blocking a profile or deleting content) must be justified to the user concerned, for example by stating that the terms of use of the service have been violated or that illegal content was stored. The statement of reasons must also provide information about available remedies, such as internal complaint procedures or legal recourse to a court.
- Reporting of criminal offences (Art. 18): Suspicions of criminal offences involving a threat to life or physical safety (e.g. death threats or incitement to terrorism) must be reported to the competent law enforcement authorities.
- Liability: Hosting service providers are liable for illegal content if they have “actual knowledge” of the illegality of content or activities but fail to take swift action to block access to the illegal content or remove it. They are explicitly exempt from liability if they have no actual knowledge of illegal activity or illegal content and are also not aware of any circumstances from which such illegality is “obvious”. However, there is explicitly no obligation to monitor or investigate (Art. 8). The DSA does not define what counts as illegal activity or content and refers to the respective national laws. As a general rule, everything that is already illegal offline should also be illegal online. This isn’t really new for Swiss hosting services.
What are the additional obligations for online platforms?
The following additional obligations apply to online platforms:
- Internal complaint management system (Art. 20): Platforms must provide an easily accessible, user-friendly and free-of-charge complaint management system, in which complaints must be handled in a timely, non-discriminatory and diligent manner. If a complaint shows sufficient grounds, the platform must reverse its decision without delay and provide the complainant with a reasoned response.
- Out-of-court dispute resolution (Art. 21): Users affected by decisions have the right to turn to an independent and transparent dispute resolution body with the necessary expertise. These bodies must report on their activities and take their decisions within a certain period of time. The costs are borne by the provider of the online platform unless the user acts in bad faith.
- Trusted flaggers (Art. 22): Reports from trusted flaggers, such as public authorities, must be treated with priority. Annual reports on these notices must be published. In the event of misuse, trusted flagger status can be temporarily suspended.
- Measures and protection against misuse (Art. 23): Users who repeatedly provide illegal content will be temporarily blocked after a warning. The processing of reports and complaints from people who frequently submit unsubstantiated reports may be temporarily suspended.
- Transparency reporting obligations (Art. 24): Online platforms must publish information on the number of their active users every six months. This information must be provided to the competent national Digital Services Coordinator and the Commission upon request.
- Design and organisation of the online interface (Art. 25): Online interfaces must not be designed in a way that deceives or manipulates users or impairs their freedom of choice. The Commission may issue guidelines on the application of this provision.
- Advertising on online platforms (Art. 26): Advertising must be clearly identifiable as such and contain information about the advertiser and the main display parameters, in particular information about the person in whose name the advertisement is placed and the person who paid for the advertisement. Users must be able to indicate whether their content is commercial communication, and platforms must provide transparent labels. Advertising based on sensitive personal data is not allowed.
- Transparency of recommender systems (Art. 27): Platforms must clearly and comprehensibly set out the key parameters of recommender systems in their terms and conditions and give users the opportunity to influence these parameters.
- Online protection of minors (Art. 28): Platforms must take appropriate measures to protect the privacy and safety of minors. Advertising based on profiling using personal data of minors is strictly prohibited. Platforms are not, however, obliged to process additional personal data in order to verify users’ ages.
What additional obligations apply to marketplaces?
The following additional obligations apply to marketplaces:
- KYC (Art. 30): Marketplaces must collect and verify (KYC) the contact details, payment information and proof of identity of traders who offer their products or services on the platform. Traders who do not provide this information must be suspended from the platform.
- Transparency (Art. 30 para. 7, Art. 31, Art. 32): The trader’s name, address, telephone number and e-mail address, company identification number and a self-declaration by the trader must be made available to users. The platform design must ensure that traders provide consumers with the necessary information before a contract is concluded, e.g. product safety information. If the platform operator becomes aware that illegal products or services are being offered via the platform, it must inform the consumers concerned. Furthermore, marketplaces must not give the impression that they themselves, rather than the traders on the platform, are the contractual partner; otherwise they cannot rely on the liability privilege for content provided by users.
Is there an SME exception?
Yes, but only to a limited extent. Companies with fewer than 50 employees and less than EUR 10 million in annual turnover do not have to comply with the additional obligations for online platforms and marketplaces. However, at the request of the EU Commission or the competent authority, they must provide information on the average number of their active users in the EU. They are not exempt from the other obligations under the DSA.
What sanctions do providers face for violations of the DSA?
Users of intermediary services may claim compensation for damage or loss incurred as a result of a breach of the DSA. Such claims are governed by the national law of the member states.
Intermediary services that violate the DSA face fines of up to six percent of global annual turnover. In the event of incorrect, incomplete or misleading information, fines of a maximum of one percent of global annual turnover are possible. In addition, these companies can be banned from operating in the EU internal market.
By Vischer, Switzerland, a Transatlantic Law International Affiliated Firm.
For further information or for any assistance please contact switzerland@transatlanticlaw.com
Disclaimer: Transatlantic Law International Limited is a UK registered limited liability company providing international business and legal solutions through its own resources and the expertise of over 105 affiliated independent law firms in over 95 countries worldwide. This article is for background information only and provided in the context of the applicable law when published and does not constitute legal advice and cannot be relied on as such for any matter. Legal advice may be provided subject to the retention of Transatlantic Law International Limited’s services and its governing terms and conditions of service. Transatlantic Law International Limited, based at 42 Brook Street, London W1K 5DB, United Kingdom, is registered with Companies House, Reg Nr. 361484, with its registered address at 83 Cambridge Street, London SW1V 4PS, United Kingdom.