"It’s not just a scam, it’s an industry..."
By Tyler Durden: It's yet another reminder of why blindly 'trusting the science' may not always be the best go-to move in the future.
The 217-year-old science publisher Wiley has reportedly "peer reviewed" more than 11,000 papers that were later determined to be fake, without ever noticing. The papers were referred to as "naked gobbledygook sandwiches", Australian blogger Jo Nova wrote on her blog last week.
"It’s not just a scam, it’s an industry," she said. "Who knew, academic journals were a $30 billion dollar industry?"
According to Nova's post, professional cheating services are employing AI to craft seemingly "original" academic papers by shuffling words around. For instance, "breast cancer" morphs into "bosom peril," and a "naïve Bayes" classifier becomes "gullible Bayes."
Similarly, in one paper, an ant colony was bizarrely rebranded as an "underground creepy crawly state."
The misuse of terminology extends to machine learning, where a 'random forest' is whimsically translated to 'irregular backwoods' or 'arbitrary timberland'.
Nova writes that, shockingly, these papers undergo peer review without any rigorous human oversight, allowing egregious errors, such as converting 'local average energy' to 'territorial normal vitality', to slip through.
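To illustrate how such synonym-mangled "tortured phrases" can be caught at all, here is a minimal sketch of a detector that scans a paper's text against a lookup table of known substitutions. The phrase table, function name and sample text are illustrative assumptions, not any publisher's actual tooling; the entries are simply the examples quoted above.

```python
# Minimal sketch (not any publisher's actual tooling): flag known
# "tortured phrases" -- synonym-mangled versions of standard terms --
# by scanning a paper's text against a small lookup table.
# The table below contains only the examples quoted in this article.

TORTURED_PHRASES = {
    "bosom peril": "breast cancer",
    "gullible bayes": "naive Bayes",
    "underground creepy crawly state": "ant colony",
    "irregular backwoods": "random forest",
    "arbitrary timberland": "random forest",
    "territorial normal vitality": "local average energy",
}

def flag_tortured_phrases(text: str) -> list[tuple[str, str]]:
    """Return (tortured phrase, likely original term) pairs found in text."""
    lowered = text.lower()
    return [(phrase, original)
            for phrase, original in TORTURED_PHRASES.items()
            if phrase in lowered]

if __name__ == "__main__":
    sample = ("We apply gullible Bayes and an irregular backwoods "
              "classifier to detect bosom peril.")
    for phrase, original in flag_tortured_phrases(sample):
        print(f"suspicious: '{phrase}' (likely '{original}')")
```

A fixed dictionary like this can only catch substitutions that have already been spotted and catalogued; the AI-based detectors mentioned below would need to generalize beyond such a list.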
The publisher Wiley has confessed that fraudulent activities have rendered 19 of its journals so compromised that they must be shuttered. In response, the industry is developing AI tools to detect these fakes, a necessary yet disheartening development. Nova writes:
The rot at Wiley started decades ago, but it got caught when it spent US$298 million on an Egyptian publishing house called Hindawi. We could say we hope no babies were hurt by fake papers, but we know bad science already kills people. What we need are not "peer reviewed" papers but actual live face-to-face debate. Only when the best of both sides have to answer questions, with the data, will we get real science:
In March, it revealed to the NYSE a US$9 million ($13.5 million) plunge in research revenue after being forced to “pause” the publication of so-called “special issue” journals by its Hindawi imprint, which it had acquired in 2021 for US$298 million ($450 million).
Its statement noted the Hindawi program, which comprised some 250 journals, had been “suspended temporarily due to the presence in certain special issues of compromised articles”.
Many of these suspect papers purported to be serious medical studies, including examinations of drug resistance in newborns with pneumonia and the value of MRI scans in the diagnosis of early liver disease. The journals involved included Disease Markers, BioMed Research International and Computational Intelligence and Neuroscience.
The problem is only becoming more urgent. The recent explosion of artificial intelligence raises the stakes even further. A researcher at University College London recently found more than 1 per cent of all scientific articles published last year, some 60,000 papers, were likely written by a computer.
In some sectors, it’s worse. Almost one out of every five computer science papers published in the past four years may not have been written by humans.
In Australia, the ABC has reported on this issue, reflecting concerns over diminishing public trust in universities, which are increasingly seen as businesses rather than educational institutions. This perception is fueled by incidents in which universities, driven by financial incentives, overlook academic fraud.
The corrosion at the core of the scientific community is exacerbated by entities like the ABC Science Unit, which, rather than scrutinizing dubious research, often shields it.
This ongoing degradation calls for a shift from traditional peer review to rigorous live debates, ensuring accountability by having people argue their cases in real time.
In December 2023, Nature reported that more than 10,000 papers had been retracted in 2023, a new record.
You can read Nova's full blog post here.