
The ‘roles’ in the AI Act
Who is a provider?

Philipp Fischer
(Translated by DeepL)
The European Regulation on Artificial Intelligence (AI Act) sets out specific obligations for the various actors involved in the different stages of the development, operation and use of a tool covered by the AI Act.
The two most important ‘roles’ in relation to these obligations are those of ‘provider’ and ‘deployer’. It is therefore important to determine the role that each company plays in relation to the artificial intelligence systems (AIS) or general-purpose AI models (GPAIM) it uses or develops (for the concepts of AIS and GPAIM, see Caballero Cuevas, cdbf.ch/1382), as this role determines the obligations applicable under the AI Act. This commentary focuses on the concept of provider within the meaning of the AI Act.
1. Who is considered a provider?
A provider is defined as ‘a natural or legal person, a public authority, an agency or any other body that develops or has developed an AI system or a general-purpose AI model and places it on the market or puts the AI system into service under its own name or trademark, whether for a fee or free of charge’ (Article 3(3) AI Act).
The constituent elements of this definition have several implications:
- Placing on the market: This concerns only the European Union (EU) market and refers to the first making available of an AIS or a GPAIM on that market.
- Putting into service: This also applies only to the EU. It means the supply of an AIS for first use, either directly to a deployer or for the provider’s own use. This second criterion does not apply to GPAIMs, as these models are considered only a preliminary stage of an AIS.
- Activities directed towards the EU: The activities must have been intentionally directed towards the EU. A mere ‘unintended spillover’ into the EU market is not sufficient (see Fischer, cdbf.ch/1397).
- Fee or free of charge: It is irrelevant whether the AIS is offered for a fee or free of charge.
The definition of ‘provider’ therefore primarily covers the IT service providers that originate an AIS or GPAIM. Swiss financial institutions should, in principle, not be affected.
That said, there is an alternative scenario in which a company may be classified as a ‘provider’. A company becomes a ‘provider’ within the meaning of the AI Act if one of the following alternative conditions is met:
- (i) it puts its name or trademark on a high-risk AIS that is already on the EU market;
- (ii) it substantially modifies such a high-risk AIS (which remains ‘high-risk’); or
- (iii) it modifies or uses an AIS on the EU market contrary to its intended purpose in such a way that it becomes a high-risk AIS (see Article 25(1) AI Act).
Case (iii) can be illustrated by the use of a general-purpose chatbot for a high-risk application, for example if a Swiss bank uses such a tool to establish the credit rating of a natural person domiciled in the European Union and communicates the results to that person.
In territorial terms, only entities that place an AIS on the market or put it into service in the EU are considered ‘providers’ within the meaning of the AI Act. However, the AI Act may also apply to providers outside the EU, in particular where the output of their AIS is intentionally used in the EU (see Fischer, cdbf.ch/1397). This provision aims to prevent circumvention and has the effect of extending the formal definition of provider (which is in principle focused on placing on the market or putting into service in the EU). In cases (i) to (iii) above, it is the company that reuses the AIS (including outside the EU) that may be subject to the AI Act as a ‘provider’, even if the system is not placed on the market or put into service in the EU. In such cases, it is sufficient that the output of the system is used within the EU.
2. What are the practical consequences of the provider role?
The obligations of providers in relation to high-risk AIS are discussed separately (see Caballero Cuevas, cdbf.ch/1406).
With regard to other AIS, the main obligations of providers can be summarised as follows:
Obligations relating to AIS (Art. 50 AI Act)
- Information: Individuals must be informed that they are interacting with an AIS, unless this is obvious to a reasonably informed user.
- Labelling: Text, audio, video and image content generated or modified by an AIS must bear a machine-readable label indicating the involvement of AI. This labelling is relatively straightforward for images (e.g. watermarks) but remains unclear for text. An exception exists for editing tools that do not make substantial modifications, which could benefit services such as DeepL.
Obligations relating to GPAIM (Art. 53 AI Act)
- Technical documentation: Detailed technical documentation of the model must be maintained and kept up to date for the supervisory authorities. Simplified documentation must also be made available to downstream users of the GPAIM.
- Representative in the EU: A legal representative established in the EU must be appointed where the provider is located outside the EU.
- Copyright: Internal rules must be put in place to comply with EU copyright law, including the rules on text and data mining (TDM) and the rights-reservation (opt-out) mechanism available to rightholders. EU copyright law applies to the training of GPAIMs offered in the EU, including those developed outside the EU. This rule aims to prevent models trained under more lenient rules from subsequently being commercialised in Europe.
- Training data: A publicly available summary of the content used to train the model must be provided.
The AI Act exempts GPAIM providers from the first two obligations provided that their model is published under a free and open-source licence. This licence must allow the model to be viewed, used, modified and distributed, and the parameters, including weights, model architecture and usage information, must be made publicly available. The last two obligations remain applicable in any event.
Additional requirements apply to providers of GPAIMs presenting ‘systemic risks’ (Art. 55 AI Act), i.e. very powerful models, assessed in particular by the computing power used for training. Large language models (LLMs) such as OpenAI’s GPT-4 could fall into this category, although the exact criteria have yet to be specified, and the AI Act leaves the door open to designating other models. The providers concerned must assess the risks associated with their model, put in place measures to mitigate them, report serious incidents to the authorities and ensure the security of the system.
3. What are the practical consequences for Swiss companies?
Most Swiss companies can consider that they do not assume the role of provider within the meaning of the AI Act as long as they do not place an AIS on the EU market or put one into service within the EU (see Fischer, cdbf.ch/1397). However, if they modify or use an AIS contrary to its original purpose, such that it becomes a high-risk AIS (see Art. 25(1) AI Act), and the output is used within the EU, they could, in our view, be assigned the role of ‘provider’ and would then have to comply with the obligations described above. Keeping an inventory of the AIS used is one way of managing this regulatory risk.