{"id":232741,"date":"2026-05-11T19:32:36","date_gmt":"2026-05-11T19:32:36","guid":{"rendered":"https:\/\/yogaesoteric.net\/?p=232741"},"modified":"2026-05-11T19:33:26","modified_gmt":"2026-05-11T19:33:26","slug":"a-scientist-invented-a-fake-disease-ai-told-people-it-was-real","status":"publish","type":"post","link":"https:\/\/yogaesoteric.net\/en\/a-scientist-invented-a-fake-disease-ai-told-people-it-was-real\/","title":{"rendered":"A scientist invented a fake disease. AI told people it was real"},"content":{"rendered":"<p>Bixonimania doesn\u2019t exist except in a clutch of obviously bogus academic papers. So why did AI chatbots warn people about this fictional illness?<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-232745\" src=\"https:\/\/yogaesoteric.net\/wp-content\/uploads\/2026\/05\/2-e1778527869100-300x185.png\" alt=\"\" width=\"560\" height=\"345\" srcset=\"https:\/\/yogaesoteric.net\/wp-content\/uploads\/2026\/05\/2-e1778527869100-300x185.png 300w, https:\/\/yogaesoteric.net\/wp-content\/uploads\/2026\/05\/2-e1778527869100.png 442w\" sizes=\"auto, (max-width: 560px) 100vw, 560px\" \/><\/p>\n<p>Got sore, itchy eyes? You\u2019re probably one of the millions of people who spend too much time staring at screens, being bombarded with blue light. Rub your eyes too much and your eyelids might turn a slight, pinkish hue. So far, so normal. But if, in the past two years, you typed those symptoms into a range of popular chatbots and asked what was wrong with you, you might have got an odd answer: <em>bixonimania<\/em>.<\/p>\n<p>The condition doesn\u2019t appear in the standard medical literature \u2013 because it doesn\u2019t exist. 
It\u2019s the invention of a team led by Almira Osmanovic Thunstr\u00f6m, a medical researcher at the University of Gothenburg, Sweden, who dreamt up the skin condition and then uploaded two fake studies about it to a preprint server in early 2024.<\/p>\n<p>Osmanovic Thunstr\u00f6m carried out this unusual experiment to test whether large language models (LLMs) would swallow the misinformation and then spit it out as reputable health advice. \u201c<em>I wanted to see if I can create a medical condition that did not exist in the database<\/em>,\u201d she says.<\/p>\n<p>The problem was that the experiment worked too well. Within weeks of her uploading information about the condition, attributed to a fictional author, major artificial-intelligence systems began repeating the invented condition as if it were real.<\/p>\n<p>Even more troublingly, other researchers say, the fake papers were then cited in peer-reviewed literature. Osmanovic Thunstr\u00f6m says this suggests that some researchers are relying on AI-generated references without reading the underlying papers.<\/p>\n<p><strong>Fabricating an illness<\/strong><\/p>\n<p>Bixonimania didn\u2019t exist before March 15, 2024, when two blog posts about it appeared on the website <em>Medium<\/em>. Then, on April 26 and May 6 that year, two preprints about the condition popped up on the academic virtual communication network <em>SciProfiles<\/em> (see https:\/\/doi.org\/qzm5 and https:\/\/doi.org\/qzm4).<\/p>\n<p>The lead author was a phoney researcher named Lazljiv Izgubljenovic, whose photograph was created with AI. Osmanovic Thunstr\u00f6m says the idea to invent Izgubljenovic and bixonimania came out of studies on how large language models work. 
When she teaches her students how AI systems formulate their \u2018knowledge\u2019, she shows them how the <em>Common Crawl<\/em> database, a giant trawl of the Internet\u2019s contents, informs their outputs.<\/p>\n<p>She also shows students how prompt injection \u2013 giving an AI chatbot a prompt that shunts it outside of its safety guard rails \u2013 can manipulate the output.<\/p>\n<p>Because she works in the medical field, she decided to create a condition related to health and hit on the name bixonimania because it \u201c<em>sounded ridiculous<\/em>\u201d, she says. \u201c<em>I wanted to be really clear to any physician or any medical staff that this is a made-up condition, because no eye condition would be called \u2018mania\u2019 \u2013 that\u2019s a psychiatric term<\/em>.\u201d<\/p>\n<p>If that wasn\u2019t sufficient to raise suspicions, Osmanovic Thunstr\u00f6m planted many clues in the preprints to alert readers that the work was fake. Izgubljenovic works at a non-existent university called Asteria Horizon University in the equally fake Nova City, California.<\/p>\n<p>One paper\u2019s acknowledgements thank \u201c<em>Professor Maria Bohm at The Starfleet Academy for her kindness and generosity in contributing with her knowledge and her lab onboard the USS Enterprise<\/em>\u201d.<\/p>\n<p>Both papers say they were funded by \u201c<em>the Professor Sideshow Bob Foundation for its work in advanced trickery. 
This works is a part of a larger funding initiative from the University of Fellowship of the Ring and the Galactic Triad<\/em>\u201d.<\/p>\n<p>Even if readers didn\u2019t make it all the way to the ends of the papers, they would have encountered red flags early on, such as statements that \u201c<em>this entire paper is made up<\/em>\u201d and \u201c<em>Fifty made-up persons aged between 20 and 50 years were recruited for the exposure group<\/em>\u201d.<\/p>\n<p>Soon after Osmanovic Thunstr\u00f6m first posted information about the phoney condition, it started showing up in the output of the most commonly used LLM chatbots.<\/p>\n<p>On April 13, 2024, Microsoft Bing\u2019s <em>Copilot<\/em> was declaring that \u201c<em>Bixonimania is indeed an intriguing and relatively rare condition<\/em>\u201d, and on the same day, Google\u2019s <em>Gemini<\/em> was informing users that \u201c<em>Bixonimania is a condition caused by excessive exposure to blue light<\/em>\u201d and advising people to visit an ophthalmologist.<\/p>\n<p>On April 27, 2024, the <em>Perplexity<\/em> AI answer engine outlined its prevalence \u2013 claiming that one in 90,000 people was affected \u2013 and that same month, OpenAI\u2019s <em>ChatGPT<\/em> was telling users whether their symptoms amounted to bixonimania.<\/p>\n<p>Some of those responses were prompted by questions about bixonimania by name; others answered queries about hyperpigmentation on the eyelids from blue-light exposure.<\/p>\n<p>Such answers by LLMs have alarmed some experts. \u201c<em>If the scientific process itself and the systems that support that process are skilled, and they aren\u2019t capturing and filtering out chunks like these, we\u2019re doomed<\/em>,\u201d says Alex Ruani, a doctoral researcher in health misinformation at University College London. 
\u201c<em>This is a masterclass on how mis- and disinformation operates<\/em>.\u201d<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-232742\" src=\"https:\/\/yogaesoteric.net\/wp-content\/uploads\/2026\/05\/1-1-300x78.jpg\" alt=\"\" width=\"560\" height=\"146\" srcset=\"https:\/\/yogaesoteric.net\/wp-content\/uploads\/2026\/05\/1-1-300x78.jpg 300w, https:\/\/yogaesoteric.net\/wp-content\/uploads\/2026\/05\/1-1.jpg 767w\" sizes=\"auto, (max-width: 560px) 100vw, 560px\" \/><\/p>\n<p>Ruani says that the details of the fake-disease experiment might seem silly, but there\u2019s a bigger, more fundamental issue. \u201c<em>It looks funny, but hold on, we have a problem here<\/em>,\u201d she says.<\/p>\n<p>Since the fake papers came out, some versions of major LLMs have become sophisticated enough to express suspicion about bixonimania. When asked about the condition on March 11, 2026, for example, <em>ChatGPT <\/em>declared that the condition \u201c<em>is probably a made-up, fringe, or pseudoscientific label<\/em>\u201d.<\/p>\n<p>But a few days later, <em>ChatGPT <\/em>was less sceptical, saying: \u201c<em>Bixonimania is a proposed new subtype of periorbital melanosis (dark circles around the eyes) thought to be associated with exposure to blue light from digital screens<\/em>.\u201d<\/p>\n<p>In mid-March, Microsoft <em>Copilot<\/em> said that bixonimania \u201c<em>is not a widely recognized medical diagnosis yet, but several emerging papers and case reports discuss it as a benign, misdiagnosed condition linked to prolonged exposure to blue light sources such as screens<\/em>\u201d.<\/p>\n<p>And in January this year, <em>Perplexity<\/em> was describing bixonimania as \u201c<em>an emerging term<\/em>\u201d. When shown that response, a <em>Perplexity<\/em> spokesperson said: \u201c<em>Perplexity\u2019s central advantage is accuracy. 
We don\u2019t claim to be 100% accurate, but we do claim to be the AI company most focused on accuracy<\/em>.\u201d<\/p>\n<p>An OpenAI spokesperson said: \u201c<em>The models that power today\u2019s version of ChatGPT are significantly better at providing safe, accurate medical information, and studies conducted before GPT-5 reflect capabilities that users would not encounter today<\/em>.\u201d<\/p>\n<p>When asked about past responses from <em>Gemini<\/em> that treated bixonimania as a real condition, a Google spokesperson said such results reflected the performance of an earlier model.<\/p>\n<p>They added: \u201c<em>We have always been transparent about the limitations of generative AI and provide in-app prompts to encourage users to double-check information. For sensitive matters such as medical advice, Gemini recommends users consult with qualified professionals<\/em>.\u201d<\/p>\n<p>Microsoft did not respond to a request for comment.<\/p>\n<p>Part of the problem is that AI models can offer wildly different results depending on exactly what is asked and what kind of information they are drawing on. Search for \u201c<em>bixonimania<\/em>\u201d, and Google\u2019s AI overview might treat it as a legitimate condition. 
Ask it \u201c<em>Is bixonimania real?<\/em>\u201d and the same AI overview might tell you that it isn\u2019t legitimate.<\/p>\n<p>Mahmud Omar, a physician and researcher specializing in the applications of AI in health care at Harvard Medical School in Boston, Massachusetts, says the speed at which AI firms are rolling out new models makes it difficult to reach \u201c<em>a pipeline, a consensus or a methodology to automatically test each model<\/em>\u201d.<\/p>\n<p>The format of the fake-disease experiment \u2013 with its results presented as coming from an official source, namely an academic paper \u2013 might have been a key factor in its success.<\/p>\n<p>In a separate study of 20 LLMs, Omar found that the models are more prone to hallucinate and elaborate on misinformation when the text they\u2019re processing looks professionally medical \u2013 formatted like a hospital discharge note or clinical paper \u2013 than when it comes from posts on virtual communication networks (M. Omar <em>et al.<\/em> in <em>Lancet Digit. Health<\/em> 8, 100949; 2026). \u201c<em>When the text looks professional and written as a doctor writes, there\u2019s an increase in the hallucination rates<\/em>,\u201d says Omar.<\/p>\n<p>The experiment\u2019s reach has now spread into the published medical literature. The bixonimania research has been cited by a handful of researchers, including a study by researchers at the Maharishi Markandeshwar Institute of Medical Sciences and Research in Mullana, India, that appeared in <em>Cureus<\/em>, a journal published by Springer Nature, the publisher of <em>Nature<\/em> (S. Banchhor <em>et al.<\/em> in <em>Cureus<\/em> 16, e74625 (2024); retraction 18, r223 (2026)).<\/p>\n<p>(<em>Nature<\/em>\u2019s news team is editorially independent of its publisher.) 
That study cites one of the fake preprints and says: \u201c<em>Bixonimania is an emerging form of POM [periorbital melanosis] linked to blue light exposure; further research on the mechanism is underway<\/em>.\u201d<\/p>\n<p>The corresponding author did not respond to a request for comment on this story.<\/p>\n<p>After <em>Nature<\/em> contacted <em>Cureus<\/em> to ask for comment, the journal retracted the paper on March 30. The retraction notice says: \u201c<em>This article has been retracted by the Editor-in-Chief due to the presence of three irrelevant references, including one reference to a fictitious disease. As a result, the journal\u2019s editorial staff no longer has confidence in the accuracy or provenance of the work, thus requiring retraction. The authors disagree with the decision to retract<\/em>.\u201d<\/p>\n<p>Ruani says the problem goes beyond LLMs because the bixonimania experiment also hoodwinked humans who cited the fake research. \u201c<em>We need to protect our trust like gold<\/em>,\u201d she says. \u201c<em>It\u2019s a mess right now<\/em>.\u201d<\/p>\n<p>&nbsp;<\/p>\n<p><strong>yogaesoteric<br \/>\nMay 11, 2026<\/strong><\/p>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Bixonimania doesn\u2019t exist except in a clutch of obviously bogus academic papers. So why did AI chatbots warn people about this fictional illness? Got sore, itchy eyes? You\u2019re probably one of the millions of people who spend too much time staring at screens, being bombarded with blue light. 
Rub your eyes too much and your [&hellip;]<\/p>\n","protected":false},"author":5,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_uf_show_specific_survey":0,"_uf_disable_surveys":false,"footnotes":""},"categories":[1620],"tags":[],"class_list":["post-232741","post","type-post","status-publish","format-standard","hentry","category-the-threat-of-artificial-intelligence"],"_links":{"self":[{"href":"https:\/\/yogaesoteric.net\/en\/wp-json\/wp\/v2\/posts\/232741","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/yogaesoteric.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/yogaesoteric.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/yogaesoteric.net\/en\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/yogaesoteric.net\/en\/wp-json\/wp\/v2\/comments?post=232741"}],"version-history":[{"count":2,"href":"https:\/\/yogaesoteric.net\/en\/wp-json\/wp\/v2\/posts\/232741\/revisions"}],"predecessor-version":[{"id":232749,"href":"https:\/\/yogaesoteric.net\/en\/wp-json\/wp\/v2\/posts\/232741\/revisions\/232749"}],"wp:attachment":[{"href":"https:\/\/yogaesoteric.net\/en\/wp-json\/wp\/v2\/media?parent=232741"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/yogaesoteric.net\/en\/wp-json\/wp\/v2\/categories?post=232741"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/yogaesoteric.net\/en\/wp-json\/wp\/v2\/tags?post=232741"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}