EverythingThreads

Is AI giving you financial advice it's not qualified to give?

AI tools routinely cross the line from information to regulated financial advice. The FCA is watching. Here is how to check.

The regulatory landscape has shifted

In June 2025, the FCA launched its AI and Algorithmic Trading (IAAT) programme, signalling active regulatory scrutiny of AI in financial services. Under Consumer Duty rules, firms must ensure that AI-generated communications do not mislead retail consumers — and that includes outputs from general-purpose models like ChatGPT being used internally or surfaced to clients.

OpenAI's own usage policies now ban personalised professional advice, including financial advice. Yet ChatGPT will still generate specific investment recommendations, risk assessments, and portfolio suggestions if prompted. It does so with no suitability assessment, no risk profiling, and no regulatory disclosure.

What Financial AI Scope detects

Financial AI Scope analyses AI-generated financial text against FCA-specific risk flags: suitability bypass, hallucinated figures, unauthorised investment advice, missing risk warnings, and Consumer Duty disclosure gaps. Each flag maps to specific FCA guidance.
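To make the flag categories concrete, here is a minimal sketch of what rule-based screening of this kind can look like. The patterns, flag names, and `flag_text` function are illustrative assumptions for this post, not Financial AI Scope's actual detection logic, which the tool does not publish.

```python
import re

# Illustrative rule set only -- these names and patterns are hypothetical
# examples of the flag categories described above, not the product's rules.
RISK_FLAGS = {
    "unauthorised_advice": r"\byou should (buy|sell|invest in)\b",
    "suitability_bypass": r"\bbased on your (goals|situation), I recommend\b",
    "hallucinated_figure": r"\bguaranteed\b[^.]*\d+(\.\d+)?\s*%",
}

# A crude proxy for the presence of a standard risk warning.
RISK_WARNING = re.compile(r"capital is at risk|past performance", re.IGNORECASE)

def flag_text(text: str) -> list[str]:
    """Return the names of risk flags triggered by a piece of financial text."""
    flags = [name for name, pattern in RISK_FLAGS.items()
             if re.search(pattern, text, re.IGNORECASE)]
    # Advice-like text with no risk warning is treated as a disclosure gap.
    if flags and not RISK_WARNING.search(text):
        flags.append("missing_risk_warning")
    return flags
```

Real compliance screening is far more involved (context, figures checked against source data, FCA handbook mapping), but the shape is the same: each category of failure becomes a named, auditable flag rather than a vague "this looks risky".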

Whether you are a compliance team reviewing AI-assisted client communications, a financial adviser checking AI-drafted reports, or an individual wondering whether to trust what ChatGPT told you about your pension — paste the text and get an immediate risk breakdown.