More or Less - Did AI researchers let AI hallucinations into scientific papers?

AI can make mistakes – and AI chatbots like ChatGPT warn you about that whenever you ask them anything.

These mistakes sometimes involve confidently stating entirely fictitious information – made-up claims known as “hallucinations”.

Whether these hallucinations matter depends on what you’re using AI for, and whether they are spotted and corrected.

The team on More or Less were slightly surprised to read a headline in Fortune magazine claiming that a top academic AI conference had accepted research papers containing 100 AI-hallucinated citations.

You might think that the top AI researchers in the world would be careful about using AI to write their research papers.

Alex Cui, CTO and co-founder of GPTZero – whose company discovered the hallucinations – explains what’s going on.

CREDITS:
Presenter and producer: Tom Colls
Sound mix: James Beard
Production co-ordinator: Brenda Brown
Editor: Richard Vadon


Published on Saturday, 21st February 2026.
