
Search Results: AI Hallucination

OpenAI's gpt-oss-20b Insists Biden Won 2024 Election in Hallucinatory Standoff

OpenAI's newly released open-weight model gpt-oss-20b is generating election disinformation by persistently claiming that Joe Biden won the 2024 US presidential election. Technical analysis points to a dangerous combination of knowledge-cutoff limitations, aggressive safety constraints, and model architecture flaws that together produce stubborn hallucinations.
ChatGPT Agent Put to the Test: One Brilliant Spark Amidst a Sea of Hallucinations

ZDNET's exhaustive 12-hour evaluation of OpenAI's ChatGPT Agent reveals a tool struggling with reliability, plagued by hallucinations and execution flaws across most tasks. While it stumbled on shopping comparisons, data scraping, and presentation design, a lone success in municipal-code analysis hints at its transformative potential, if it can overcome fundamental accuracy hurdles.