KMU-Magazin Nr. 4/5, April/Mai 2025
EU digital policy and its impact on Switzerland
The EU has taken on a significant role in various areas, such as AI and platform regulation. This also has an impact on Switzerland. This article highlights some of the EU's activities and their significance for Switzerland and Swiss companies, without claiming to be exhaustive.
Artificial intelligence (AI) is no longer a thing of the future – it is already influencing numerous areas of business, from process automation to customer data analysis. The EU is very active in digital policy. Examples include the Artificial Intelligence Act (AI Act), the Digital Services Act (DSA), and regulations in the area of cybersecurity and resilience.
The AI Act
With the Artificial Intelligence Act (AI Act), the EU has created the world's first transnational regulatory framework that sets priorities and standards for the use of artificial intelligence in order to promote ethical and transparent innovation. The AI Act provides for the classification of AI systems into different risk categories:
- AI systems that pose no particular risk: These include AI systems that, after careful assessment, do not pose a significant risk to the health, safety, or fundamental rights of individuals.
- Low-risk AI systems: These are AI systems that pose a low but identifiable risk that can be mitigated by general principles of data protection and cybersecurity.
- High-risk AI systems: AI systems that pose a high risk to the health, safety, or fundamental rights of individuals and are subject to strict security and transparency requirements. These include, for example, AI systems in healthcare, education, or the justice system.
- Prohibited AI systems: This category includes AI applications that are prohibited due to their inherent harmfulness or incompatibility with European values. Examples include social scoring systems or systems designed to manipulate behavior.
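The four tiers above can be sketched as a simple classification structure. This is an illustrative simplification only; the tier labels follow the article's wording rather than the Act's legal terminology, and the example use cases are our own assumptions, not legal classifications.

```python
from enum import Enum

class RiskTier(Enum):
    """Illustrative simplification of the AI Act's four risk tiers."""
    NO_PARTICULAR_RISK = "no particular risk"
    LOW_RISK = "low risk"
    HIGH_RISK = "high risk"
    PROHIBITED = "prohibited"

# Hypothetical mapping of use cases to tiers, for illustration only;
# classifying a real system requires a case-by-case legal assessment.
EXAMPLE_USE_CASES = {
    "spam filter": RiskTier.NO_PARTICULAR_RISK,
    "customer service chatbot": RiskTier.LOW_RISK,
    "CV screening in hiring": RiskTier.HIGH_RISK,
    "social scoring of citizens": RiskTier.PROHIBITED,
}
```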
The AI Act aims to strike a balance between innovation and security. There are several key requirements that companies must comply with:
- Transparency requirements: Companies must disclose when their AI systems are used in critical processes, such as in application procedures or when granting loans. In addition, affected users must be informed when AI contributes to decision-making.
- Data quality and security: High requirements apply to AI systems that work with sensitive data. The systems must be robust in order to minimize bias and prevent discrimination. This also includes regular audits and testing procedures to ensure fairness.
- Documentation and supervision requirements: Companies that use high-risk AI must maintain detailed technical documentation, document algorithms, and conduct regular risk assessments.
- Regulatory and sanctioning mechanisms: Violations of the AI Act can result in heavy penalties of up to €35 million or seven percent of a company's global annual turnover (whichever is higher). This is intended to ensure that companies take compliance seriously.
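The "whichever is higher" rule for the most serious violations can be illustrated with a small calculation; the function name and the sample turnover figures below are ours, chosen purely for illustration.

```python
def max_ai_act_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound of an AI Act fine for the most serious violations:
    EUR 35 million or 7% of global annual turnover, whichever is higher."""
    return max(35_000_000, 0.07 * global_annual_turnover_eur)

# For a company with EUR 1 billion turnover, 7% (EUR 70 million)
# exceeds the fixed EUR 35 million floor, so the higher figure applies.
```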
The AI Act also provides for a market surveillance procedure to ensure that companies comply with the new rules. National supervisory authorities will be tasked with monitoring compliance, punishing violations, and intervening in the event of risks.
The AI Act applies to AI systems placed on the market or put into service in the EU or the EEA, regardless of whether the providers are established or located in the EU or the EEA or in a third country. The AI Act also applies to operators based in third countries if they use an AI system whose results are used in the EU or the EEA. This means that the AI Act may also apply to companies based outside the EU or the EEA, such as in Switzerland.
The AI Act was finally adopted on June 13, 2024, and has been in force since August 1, 2024. In principle, its provisions apply only after a transition period of 24 months, i.e., from August 2, 2026. However, this does not apply to AI systems posing unacceptable risk, which have already been banned since February 2025, or to the provisions on general-purpose AI models, which take effect in August 2025. From then on, the AI Act's sanction mechanism will also apply.
On February 12, 2025, the Swiss Federal Council published a comprehensive analysis of AI regulation and made important decisions based on this analysis. Switzerland plans to ratify the Council of Europe's AI Convention and make the necessary adjustments to Swiss law. In addition, activities to regulate AI in specific areas such as healthcare and transport are to be continued. By the end of 2026, a consultation draft is to be prepared that implements the Council of Europe's AI Convention by laying down the necessary legal measures, particularly in the areas of transparency, data protection, non-discrimination, and supervision.
The Digital Services Act (DSA)
The Digital Services Act (DSA) is an EU regulation designed to make digital platforms and online services safer and more transparent. It has been in force since November 16, 2022, and has been fully applicable since February 17, 2024. It aims to better protect users on the internet and increase the responsibility of large platforms:
- Platforms such as social networks must provide clearer information about how they moderate content and why certain content is removed.
- They must respond quickly to illegal content and remove it.
- Advertising must be transparent, i.e., it must be clear who is paying for it and why it is being displayed to the user.
- Users are given more rights, for example the right to appeal against the removal of their content.
The DSA covers a wide range of online services and platforms operating within the EU (even if they are based outside the EU). These include online platforms such as social networks, online marketplaces, app stores, hosting services, and sharing services.
Special rules apply to very large online platforms and search engines.
The DSA provides for clear and in some cases very high penalties (in particular fines of up to six percent of global annual turnover) to effectively punish violations – especially by large platforms, but also by smaller providers.
Platform regulation is also planned in Switzerland. The Federal Council has been working on a corresponding bill for some time. However, the draft of the new Federal Act on Communication Platforms and Search Engines (KomPG) has been delayed.
Cybersecurity and resilience
In addition to the AI Act and the DSA, the EU has introduced further regulations that are important for the cybersecurity and resilience of important institutions and businesses:
- NIS 2 Directive: This is an EU-wide set of rules that strengthens cybersecurity in 18 critical sectors (such as energy, transport, health, and finance) by setting stricter requirements for risk management measures, incident reporting, and cooperation. EU member states had until October 17, 2024, to transpose the directive into national law.
- CER Directive (Critical Entities Resilience Directive): This aims to strengthen the resilience of critical entities (energy, transport, water, finance, health, etc.). EU member states are required to develop a national strategy to strengthen the resilience of critical entities and conduct a risk assessment at least every four years.
- Digital Operational Resilience Act (DORA): This law particularly affects the financial sector and sets strict requirements for cybersecurity and operational resilience. AI applications used in the financial sector must have robust security mechanisms and be subject to regular stress tests. Companies must also have contingency plans in place for cyber attacks and comply with reporting requirements. DORA was adopted in January 2023 and has applied in full since January 17, 2025.
- Cyber Resilience Act (CRA): This law aims to improve cybersecurity standards for digital products and connected devices. Companies that develop AI-enabled software or hardware must ensure that their products meet basic security requirements and are provided with security updates for a defined period of time. The CRA requires manufacturers and providers to promptly close security gaps and proactively minimize potential risks. The CRA entered into force on December 10, 2024. The most important obligations introduced will apply from December 11, 2027.
Although these EU regulations do not apply directly in Switzerland, they have a significant impact on Swiss SMEs, especially if they do business with the EU or export digital products and services to the EU market:
- DORA: IT service providers and fintech companies in Switzerland must implement enhanced security measures when working with European financial institutions. These include robust security measures and contingency plans to ensure operational resilience.
- CRA: Manufacturers of digital products or software providers in Switzerland must comply with high cybersecurity standards if their products are sold in the EU. This includes continuous security updates and risk mitigation measures.
In this context, reference should be made to the Information Security Act (ISG), which came into force on January 1, 2024. Since April 1, 2025, operators of critical infrastructure and providers of essential services (e.g., food retailers, media companies, but also IT companies) have been required to report cyberattacks within 24 hours.
Conclusion
Digitalization and the associated legal framework in the EU have implications for Switzerland and Swiss companies. Even though these regulations do not apply directly in Switzerland, their impact is significant for Swiss companies with EU connections. In particular, companies that offer digital services, IT services, AI applications, or platforms in the EU must prepare for complex but structured compliance requirements. These developments present Swiss SMEs with both challenges and opportunities to position themselves successfully and sustainably in the digital future and in the European market.