Customer Service Chatbot
Limited Risk

A text-based chatbot that answers customer questions about products and services. It uses a large language model for natural language understanding.
Classification Walkthrough
- Step 1 — Prohibited practices (Article 5): The chatbot does not perform social scoring, biometric identification, or any other prohibited practice. ✗ Not prohibited.
- Step 2 — GPAI: While the chatbot uses an LLM, the chatbot itself is not a general-purpose AI model — it is a specific application. The LLM provider has separate GPAI obligations. ✗ Not GPAI.
- Step 3 — High-risk Pathway A: The chatbot is not a safety component of a regulated product listed in Annex I. ✗ Not Annex I.
- Step 4 — High-risk Pathway B: Customer service does not fall under any Annex III high-risk category. ✗ Not Annex III.
- Step 5 — Limited risk: The chatbot interacts directly with natural persons. Under Article 50, users must be informed they are interacting with an AI system. ✓ Limited risk transparency obligation applies.
- Result: LIMITED RISK — The chatbot must clearly disclose to users that they are interacting with an AI system.
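The stepwise check above can be sketched as a simple decision function. This is an illustrative sketch only: the parameter names are assumptions for this example, not terminology from the Act, and a real assessment requires legal analysis of each condition.

```python
from enum import Enum

class RiskTier(Enum):
    PROHIBITED = "prohibited"
    HIGH = "high"
    LIMITED = "limited"
    MINIMAL = "minimal"

def classify(prohibited_practice: bool,
             annex_i_safety_component: bool,
             annex_iii_use_case: bool,
             interacts_with_persons: bool) -> RiskTier:
    """Walk the steps in order; the first condition that matches
    determines the tier (hypothetical helper for this walkthrough)."""
    if prohibited_practice:            # Step 1: Article 5
        return RiskTier.PROHIBITED
    if annex_i_safety_component:       # Step 3: Pathway A
        return RiskTier.HIGH
    if annex_iii_use_case:             # Step 4: Pathway B
        return RiskTier.HIGH
    if interacts_with_persons:         # Step 5: Article 50 transparency
        return RiskTier.LIMITED
    return RiskTier.MINIMAL

# The customer service chatbot: no prohibited practice, not Annex I/III,
# but it interacts directly with natural persons.
print(classify(False, False, False, True).value)  # limited
```

Note that the GPAI check (Step 2) is omitted from the sketch because it concerns the underlying model provider's obligations, not the risk tier of the deployed application itself.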