AI hallucinations can’t be stopped — but these techniques can limit their damage. Jones, N. Nature, 637(8047):778–780, January 2025.
Developers have tricks to stop artificial intelligence from making things up, but large language models are still struggling to tell the truth, the whole truth and nothing but the truth.
@article{jones_ai_2025,
	title = {{AI} hallucinations can’t be stopped — but these techniques can limit their damage},
	volume = {637},
	copyright = {2025 Springer Nature Limited},
	issn = {1476-4687},
	url = {https://www.nature.com/articles/d41586-025-00068-5},
	doi = {10.1038/d41586-025-00068-5},
	abstract = {Developers have tricks to stop artificial intelligence from making things up, but large language models are still struggling to tell the truth, the whole truth and nothing but the truth.},
	language = {en},
	number = {8047},
	urldate = {2025-01-22},
	journal = {Nature},
	author = {Jones, Nicola},
	month = jan,
	year = {2025},
	note = {News Feature},
	publisher = {Nature Publishing Group},
	keywords = {Computer science, Machine learning, Technology},
	pages = {778--780},
}
