FCA Board Focuses on AI

The UK’s Financial Conduct Authority (FCA) has published its latest board minutes, which highlight its increasing focus on artificial intelligence (AI): the board “raised the question of how one could ‘foresee harm’ (under the new Consumer Duty), and also give customers appropriate disclosure, in the context of the operation of AI”. The publication indicates that AI continues to be a key area of attention within the FCA. It also demonstrates that the FCA believes its existing powers and rules already impose substantive requirements on regulated firms considering deploying AI in their services.

By Stuart Davis, Fiona M. Maclean, Gabriel Lakeman, and Imaan Nazir

In particular, the FCA’s new Consumer Duty imposes broad “cross-cutting” obligations on firms, including the obligation to avoid causing foreseeable harm to consumers, and the FCA has enforcement powers to ensure firms comply with these requirements. Although the board minutes state that the FCA “considered it was important to discuss opportunities for achieving good outcomes for customers, integrity in markets, as well as efficiencies in firms,” the clear focus of the FCA’s discussion was consumer protection.

This position sits within the context of UK regulators’ continuing attention to the impact of AI on financial services and its consequences for consumers, and follows two other significant developments in this area:

  • the FCA’s August 2022 Discussion Paper with the Bank of England and the Prudential Regulation Authority, which considered the benefits of AI in financial services and the relevance of regulatory requirements in mitigating associated risks; and
  • the July 2023 speech by Nikhil Rathi, the FCA’s Chief Executive, setting out the FCA’s regulatory approach to Big Tech and AI.

The board minutes reference the July 2023 speech, which also emphasised that the FCA regards existing UK regulatory frameworks as addressing many AI-associated risks, in that case highlighting both the Consumer Duty and the Senior Managers & Certification Regime (SMCR), which requires senior managers to be ultimately accountable for a firm’s activities.

The FCA has in the past used its board minutes to flag its high-level regulatory approach. The current board minutes are therefore significant: they send a clear signal that the FCA expects firms to take steps to manage the risks of using AI under the existing regulatory regime, even while discussions continue about the extent to which new rules will be required to address specific AI issues.

Latham & Watkins
