                    We already have rules for AI systems

Published 08 Nov 2021
The European Commission has published a draft AI (Artificial Intelligence) Regulation because it does not consider current laws and regulations sufficient to regulate AI systems. The draft takes a human-centered approach to AI. Developers of AI systems must assess for themselves which of the four risk groups their system falls into. The higher the risk of an AI system, the stricter the requirements for that system. It will still take years before the AI Regulation comes into effect, and there is a good chance that the draft will be adjusted in the meantime.
What does this draft mean for the AI systems that are currently being developed or used? Are there no rules that apply to them?
For several types of AI systems, there are already laws and/or regulations that must be complied with. For example:
- Medical Devices Regulation: for AI systems in medical devices;
- Constitutions and human rights treaties: for the protection of fundamental rights such as freedom of speech, privacy and self-determination;
- General Data Protection Regulation: when personal data is processed;
- Product safety regulations: when an AI system causes injury;
- Consumer protection law: when its information obligations apply;
- Codes of conduct: when a sector has established rules (a code of conduct) for AI systems; and
- Contracts: when parties have laid down rules for AI systems in an agreement.