07/04/2023

Corporate Governance and AI

The dynamic development of Artificial Intelligence (AI) gives rise to new requirements for the managers of companies.

Special legal requirements

Appropriate measures must be taken to ensure that the legal rules applicable to the company's use of AI are observed. In addition to data protection law (GDPR) and the Act on the Protection of Business Secrets (GeschGehG), these include sector-specific laws (e.g. the Act on the Federal Office for Information Security) and, at the European level, the Digital Operational Resilience Act (DORA, the Regulation on digital operational resilience in the financial sector).

In the near future, particular attention will have to be paid to the AI Regulation ("Artificial Intelligence Act"/"AI Act", the Regulation laying down harmonized rules for artificial intelligence), which is currently still at the draft stage. After the European Parliament published its final position on the Commission draft on June 14, 2023, all three positions (Commission, Council of the European Union and Parliament) are now on the table, so that negotiations can be conducted with the aim of reaching a compromise (the so-called trilogue). Once that process has been completed, the AI Regulation is expected to enter into force in 2024 and to define uniform framework conditions and protection standards for the use of AI for the first time. The Regulation pursues a horizontal, i.e. cross-sectoral, regulatory approach and covers the development, placing on the market and use of AI systems by private and state actors. The greater the risk posed by an AI system, the stricter the regulation (risk-based regulatory approach). According to the current drafts, the Regulation will provide for prohibitions, quality requirements for high-risk AI systems, labeling obligations and voluntary commitments, depending on the risk level.

Duties of care

Business managers must exercise the due care and diligence of a prudent and conscientious manager in the performance of their management duties (Section 43 of the German Limited Liability Companies Act (GmbHG); Section 93 of the German Stock Corporation Act (AktG)). With regard to AI, it is crucial that the managing director ensures within the company that the limited capabilities of AI are realistically assessed and that the results produced by AI are subject to critical monitoring and review by humans. Given the current state of the art, the managing director cannot simply rely on the results provided by AI systems, as these are essentially based "only" on statistical considerations and do not involve any reasoning of their own. In order to avoid or limit technical and legal risks, the managing director must in principle set up a compliance system; this also applies to the use of AI in the company (cf. MüKoGmbHG/Fleischer, 4th ed. 2023, GmbHG Section 43 marginal no. 195). The managing director must therefore identify IT risks (personally and through suitable employees) and, on that basis, take the necessary measures to prevent and eliminate digital risks. Examples include:

  • Definition of processes for the review of AI systems
  • Establishment of internal review panels
  • Exchange of information between experts, review panels and company management
  • Sufficient and ongoing testing of the systems
  • Externally directed defensive measures
  • Restricted access rights and access controls for AI systems
  • Mechanisms for shutting down AI systems

In addition, it is advisable to define processes for ruling out technical malfunctions and responding to unforeseen events related to AI (technical compliance) and for ensuring compliance with legal requirements (legal compliance).
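
Purely as an illustration of how some of these measures (restricted access rights, documented human review of AI output) might be reflected in a company's internal tooling, the following Python sketch wraps a hypothetical AI client in a simple governance layer. All names and interfaces in it (GovernedAIGateway, record_review, the generate method of the wrapped client) are assumptions made for the example and are not taken from the article or from any specific product.

    # Illustrative sketch only: names and interfaces are hypothetical.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone


    @dataclass
    class ReviewLogEntry:
        """One documented human review of an AI-generated result."""
        user: str
        prompt: str
        ai_output: str
        reviewer: str
        approved: bool
        timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


    class GovernedAIGateway:
        """Wraps an AI client with access control and a mandatory human review log."""

        def __init__(self, ai_client, authorized_users):
            self.ai_client = ai_client                  # any object exposing generate(prompt) -> str
            self.authorized_users = set(authorized_users)
            self.review_log = []                        # compliance file: list of ReviewLogEntry

        def generate(self, user, prompt):
            # Restricted access rights: only authorized users may invoke the AI system.
            if user not in self.authorized_users:
                raise PermissionError(f"{user} is not authorized to use the AI system")
            return self.ai_client.generate(prompt)

        def record_review(self, user, prompt, ai_output, reviewer, approved):
            # Critical monitoring: every AI result is reviewed by a human and the
            # review is documented before the result is used in the company.
            self.review_log.append(ReviewLogEntry(user, prompt, ai_output, reviewer, approved))

Such a gateway is only one conceivable building block; which technical measures are actually appropriate depends on the AI systems used and the risks identified in the individual case.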

Delegation

The managing director may delegate responsibility in the compliance area to specialized employees at subordinate levels (e.g. CSO, CCO). At the same time, however, the necessary know-how, the necessary processes and, where required, personnel responsibility for effective monitoring of the employees must also be ensured at the horizontal (management) level. Delegation does not release the managing director from his or her ongoing duty of supervision, in the context of which he or she must ensure that the person selected is professionally qualified and personally suited to the task assigned. In any event, a complete delegation of corporate decisions to AI systems is undoubtedly not permissible at present.

Consequences of breaches of duty

If the managing director culpably breaches his or her supervisory duties, he or she may face personal liability claims. In the case of regulatory violations within the company, the managing director is already deemed responsible (and can be fined) irrespective of personal fault if there is no proper compliance system or if, for example, measures under Article 32 GDPR are not adequately implemented (Section 130 of the German Act on Regulatory Offences (OWiG)). With regard to their activities in the area of AI, the managing director and the company should review the terms and any exclusions of the liability insurance applicable to them (Directors and Officers (D&O) insurance) particularly critically and discuss their activities and compliance systems with the insurer in order to avoid gaps in coverage.
