We recently hosted a webinar called Truth, Trust and Tech: Making Financial Decisions in the Age of AI, alongside Mike Chatterton from the AI Accelerator in North Greenwich. It was a great conversation — honest, practical, no hype — and it clearly struck a chord. So I thought it was worth turning the key takeaways into a blog for anyone who missed it, or who’s been thinking about this since.
Because this isn’t going away. If anything, it’s accelerating.
What’s actually changed with AI?
AI as a concept has been around since the 1950s. It’s been quietly powering things like spam filters, medical imaging and sat navs for years. But around 2017, something shifted. Computer scientists cracked some big problems around language — and that led to what we now call large language models. ChatGPT, Copilot, Claude, Gemini. You’ve probably used at least one of them by now.
The thing to understand is this: these tools don’t know things the way a textbook does. They predict what a good answer might look like, based on patterns in the enormous amount of data they were trained on. As Mike put it during the webinar, think of it as a magic box. You put a question in and get something impressive out — but you’re not always 100% sure why it gave you that answer.
And if you’re trusting it with your finances, that matters quite a lot.
Why are so many people using AI for money?
The numbers are striking. Lloyds Banking Group’s Consumer Digital Index, published in late 2025, found that over 28 million UK adults — that’s 56% of us — have used AI to help manage their money in the past year.¹ One in three people now use it weekly for money matters, more often than for health advice, shopping or travel planning.
People are using it for budgeting, savings planning, investment research and even pension questions. ChatGPT is the most popular tool, used by six in ten of them. And users reckon they’re saving an average of £399 a year from the insights they get.¹
So it clearly has its uses. But here’s the thing: 80% of those same users worry about getting inaccurate or outdated information, and 83% are concerned about data privacy.¹ There’s a trust gap — and it’s a sensible one.
Where can AI actually help with your finances?
We’re not anti-AI at Ginkgo. Far from it. We use it ourselves to cut through admin, improve how we prepare for meetings and speed up the paperwork that financial services is famous for. It means we can spend more time doing the bit that actually matters — sitting with you, understanding your situation, and helping you make good decisions.
For you at home, there are some genuinely useful things AI can help with. We covered them in the webinar and I’d stand by the list: budgeting, savings planning and general research are all sensible places to start.
Where does it start to go wrong?
This is where we need to be careful. During the webinar, Mike and I talked about the fact that AI doesn’t know it’s wrong. It gives you an answer with total confidence — even when it’s made something up. The technical term is “hallucination,” but in plain English, it’s guessing without telling you it’s guessing.
Tax calculations are a good example. A general-purpose AI tool is trained on data with a cutoff date, so it rarely knows the current rates, and with Rachel Reeves changing things so regularly there’s a real risk of getting outdated figures. It won’t know your personal allowances, your pension contributions or whether you’ve already used certain reliefs. It doesn’t know you.
Comparing investments is another area where it struggles. It can overweight a small detail and underweight something important. It doesn’t understand nuance the way a qualified adviser does — someone who’s seen hundreds of different client situations and knows what questions to ask.
And here’s something that’s got worse since we ran the webinar: AI-powered scams are rising sharply. Deepfake fraud attempts in the UK nearly doubled in 2025, rising 94%.² Criminals are using AI to clone voices, create fake video calls and build convincing investment apps that are nothing more than elaborate cons. UK consumers lost an estimated £9.4 billion to AI-related scams in the nine months to November 2025.³ If a financial app or tool looks too good to be true, it almost certainly is.
What about the big financial decisions?
This is where I’m really clear. AI should not be making your pension decisions, your retirement plans or anything that’s irreversible.
We had someone ask during the webinar Q&A whether they should trust what ChatGPT told them about their pension. My honest answer? No — not on its own. Cross-check it with another AI tool if you like (they’re actually quite good at catching each other out). But for anything that involves real money and real consequences, you need a real person who’s qualified, regulated and accountable.
If you take your tax-free cash based on what an AI told you and it turns out to be wrong, there’s nobody to complain to. No comeback. No compensation. The AI company will not be picking up the bill.
If you’re already a Ginkgo client, you know this. You’ve experienced what it’s like to have someone who knows your family, your goals and your circumstances sitting across the table from you — someone who’ll ask the right questions, challenge assumptions and make sure you’ve thought things through properly. That’s the judgement piece. And it’s something no algorithm can replicate.
Since we ran the webinar in January, things have moved on. The FCA has launched a major review into the long-term impact of AI on retail financial services, led by Sheldon Mills, looking ahead to 2030 and beyond.⁴ They’re asking how AI could reshape the relationship between advisers and clients, what the risks are as the technology gets more capable, and whether existing rules need to adapt.
What they’ve been clear about is this: they won’t be introducing AI-specific regulations. Instead, they’re relying on existing frameworks, such as the Consumer Duty and the Senior Managers and Certification Regime, to hold firms accountable for how they use it. Their approach is outcomes-focused. In other words, they don’t mind how you use the technology, but they do mind if it leads to poor outcomes for clients.
The FCA’s chief executive, Nikhil Rathi, has also said something interesting: if an adviser tells you they’re not using AI, that should raise a flag.⁵ Not because AI replaces advice — but because any firm that isn’t embracing it risks falling behind on efficiency and client service.
Mike made the same point in the webinar, and I agree. We’re not using AI to replace expertise. We’re using it to free up time so we can spend more of it with you.
AI is a powerful tool. It’s here to stay, and it’s going to keep getting better. At Ginkgo, we’re making sure we stay on top of it — using it to work more efficiently, to prepare better and to give you a better experience.
But it’s not a replacement for human expertise and judgement. And when it comes to the decisions that really matter — your pension, your retirement, your family’s financial security — you need someone who knows you, who’s qualified and who’ll take responsibility.
If you’ve got questions about how we use AI, or if you’d like to talk through any of the topics we covered in the webinar, we’d be happy to chat.
Daren Wallbank
Chartered Financial Planner, Ginkgo Financial
You can watch the full webinar recording on our YouTube channel.
Sources
¹ Lloyds Banking Group, Consumer Digital Index 2025 (published November 2025)
² Sumsub, Identity Fraud Report 2025–2026 (published November 2025)
³ The News International, reporting on UK AI fraud losses, February 2026
⁴ FCA, “The FCA’s long-term review into AI and retail financial services: designing for the unknown,” Sheldon Mills speech, 28 January 2026
⁵ FCA, “AI and the FCA: our approach,” September 2025; Nikhil Rathi remarks at FT Global Banking Summit, December 2025
The value of investments can go down as well as up and you may get back less than you invest. Past performance is not a reliable indicator of future results.
Ginkgo Financial Ltd is an appointed representative of Quilter Financial Planning Limited which is authorised and regulated by the Financial Conduct Authority.
Approver Quilter Financial Services 12/2/26.