How to be a science sleuth | Careers


A growing network of science sleuths uncovering research misconduct is sending shockwaves through the entire scientific community. High-profile cases of scientific fraud have emerged, involving prominent figures at institutions including Stanford University and the Dana-Farber Cancer Institute in the US.

Despite the prominence of this issue, there are no set guidelines for identifying and reporting fraud. ‘There is no way of standardising [science sleuthing] and no rules saying that this person should be reprimanded for their misconduct,’ explains Nick Wise, a research integrity manager for publisher Taylor & Francis.

From retracted studies to dubious breakthroughs, the ability to discern reliable studies from suspicious claims is vital for researchers. This guide focusses on two key approaches: how can you spot fraudulent research, and what steps should you take to report any concerns?

Signs of scientific misconduct

Elisabeth Bik

In fields relying on microscopy, western blotting or other visual data, image manipulation is a common form of scientific fraud. Such manipulations are most often found in biomedical, nanotechnology, molecular biology and pre-clinical studies. ‘There are roughly three types of duplications’, explains Elisabeth Bik, a microbiologist turned science detective.

‘First, there are simple duplications, where two panels look identical. Second, panels might be repositioned, where panels show overlaps or are mirrored or rotated. The third type of duplication are those where parts of an image have been altered. In these photos, features such as cells, bands or groups of dots appear to be visible multiple times within the same photo or plot or have been taken from other panels to re-appear in a different panel.’ Software tools such as ImageTwin and Proofig can find these duplications. ‘But they are not very useful for gels and blots, so I still look at those by eye,’ says Bik.
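The first category Bik describes, panels that are exact copies, is the only one amenable to a naive check. A minimal sketch (the function name and panel labels are illustrative) flags bit-identical image files by hashing their bytes; repositioned, rotated or edited duplicates need perceptual tools such as ImageTwin or Proofig, or inspection by eye:

```python
import hashlib

def find_exact_duplicates(panels):
    """Flag bit-identical image panels by hashing their raw bytes.

    `panels` maps a panel label to its image file contents. This only
    catches exact copies; any crop, rotation or re-save defeats it.
    """
    seen = {}        # digest -> first panel carrying that content
    duplicates = []
    for label, data in panels.items():
        digest = hashlib.sha256(data).hexdigest()
        if digest in seen:
            duplicates.append((seen[digest], label))
        else:
            seen[digest] = label
    return duplicates
```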

With the rise of AI, image fraud is increasing. ‘The real problem, unfortunately, is the ability of AI to generate fake and unique images. The technology has gotten so good that we no longer can distinguish fake from real photos,’ adds Bik. ‘[Instead] we could think about ways to watermark real photos taken by a real microscope.’

Tortured phrases and paper mills

Dorothy Bishop

Describing breast cancer as ‘bosom peril’ or artificial intelligence as ‘counterfeit consciousness’ are examples of tortured phrases – awkward, unfamiliar terms that deviate from standard scientific language. Such phrases are often telltale signs of papers produced by paper mills, organisations that fabricate research articles to sell to individuals who need publications for academic or career advancement. While the average reader may not be hunting for paper mills per se, spotting these oddities can serve as a warning that the paper in question may not be credible and could warrant further scrutiny or even reporting.
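Because tortured phrases are fixed substitutions for standard terminology, the simplest screen is a lookup table. The sketch below seeds the table with the two examples from this article only; real screeners such as the Problematic Paper Screener maintain lists of thousands of such phrases:

```python
# Illustrative lookup table: tortured phrase -> expected standard term.
# The two entries are the examples quoted in the article.
TORTURED_PHRASES = {
    "bosom peril": "breast cancer",
    "counterfeit consciousness": "artificial intelligence",
}

def flag_tortured_phrases(text):
    """Return (tortured phrase, standard term) pairs found in the text."""
    lowered = text.lower()
    return [(phrase, standard)
            for phrase, standard in TORTURED_PHRASES.items()
            if phrase in lowered]
```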

Paper mills frequently rely on automated translation tools or purposefully altered language to evade plagiarism detection, resulting in papers with these convoluted expressions. ‘I devised the term “gobbledegook sandwich” for papers that looked like an undergraduate essay on almost any topic, with a highly technical section on machine learning/AI spliced into the middle,’ explains Dorothy Bishop, a retired psychologist and expert in paper mill fraud. ‘Other markers are false email domains that look like academic addresses but aren’t; remarkably international author lists; implausible methods; fake figures, statistics and tables that make no sense; formulaic template structure; “special issue” articles that are not related to the topic of the special issue; citations to arcane sources; citation-stacking’, she adds. ‘I use free bibliometric software like the Dimensions database to check out suspicious authors – you can easily check whether they have had an implausible spike in publications, and also who they publish with.’

Aside from selling authorship, paper mills also sell citations. To judge whether a paper’s citations are genuine, watch for unusual patterns in which it is cited by many papers that have no clear connection to its topic. ‘If your paper is cited by a lot of papers where it isn’t relevant, it may be that you bought those citations, and we are going to wonder whether you also bought your authorship,’ warns David Bimler, a retired psychology researcher.

Paper mills are increasingly prolific. ‘As soon as we find a way to detect them, they mutate and move on’, says Bishop. While publishers are now using AI tools to detect paper mill products, ‘at the end of the day, you need a human reader’, Bishop adds, reiterating the need to be vigilant while reading scientific research.

Data fabrication

Jack Wilkinson

Scrutinising inconsistencies in raw data files can reveal cases of data fabrication. With the increasing availability of AI, fraudsters can use tools like ChatGPT to generate datasets that look convincing on the surface yet fail to capture fundamental real-world constraints. Jack Wilkinson, a biostatistician at the University of Manchester, UK, and clinical trial fraud expert, recalls the inconsistencies he found in one clinical dataset: ‘You’ve got people with two right eyes, people with different birthdays for each eye, one person has five eyes – which is probably too many eyes.’

Fabricated data also tends to show an unusual preference for certain numbers, such as values ending in 0 or 5. In genuine fine-grained measurements, by contrast, terminal digits tend to be spread far more evenly.
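This terminal-digit check is easy to automate. The sketch below (function name illustrative) reports the share of values ending in 0 or 5; for genuine fine-grained data it should sit near 0.2, two of the ten possible final digits. A large excess is only a hint, since it can also reflect honest rounding in how the data were recorded:

```python
from collections import Counter

def zero_five_share(values):
    """Fraction of values whose final digit is 0 or 5.

    Works on integers (or anything whose str() ends in a digit);
    hand-typed fabricated numbers often over-use round endings.
    """
    digits = Counter(str(v)[-1] for v in values)
    return (digits["0"] + digits["5"]) / len(values)
```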

If data replicates look identical or there is an over-reliance on exact p-values throughout a study, it can be a sign of fraud. Bishop recommends the use of tools like granularity-related inconsistency of means (Grim) and sample parameter reconstruction via iterative techniques (Sprite) to check for mathematical consistency in studies.

Grim and Sprite: Tools for unmasking data inconsistencies

Grim checks whether the means reported in a dataset are mathematically possible given the sample size and the level of granularity, such as whole numbers or decimals. It quickly detects errors by identifying discrepancies between reported and achievable mean values, and is particularly useful for spotting inconsistencies in simpler or smaller datasets.
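The arithmetic behind Grim is simple: the mean of n integer responses must equal some whole number divided by n. A minimal sketch of that core check (not the published tool itself, and applicable only to integer data such as counts or Likert scores, not continuous measurements):

```python
import math

def grim_consistent(reported_mean, n, decimals=2):
    """GRIM-style check: can a mean reported to `decimals` places arise
    from n integer-valued observations, i.e. equal k/n for some integer k?
    """
    half_step = 0.5 * 10 ** -decimals
    target = round(reported_mean, decimals)
    # only integer sums in this window could round to the reported mean
    lo = math.floor((target - half_step) * n)
    hi = math.ceil((target + half_step) * n)
    return any(round(k / n, decimals) == target for k in range(lo, hi + 1))
```

For example, a mean of 5.19 from 28 whole-number responses is impossible: 145/28 rounds to 5.18 and 146/28 rounds to 5.21, so no integer sum produces it.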

Sprite, on the other hand, takes a more involved approach, iteratively reconstructing samples that could have produced the reported statistics and testing them for internal consistency. By recreating plausible distributions and comparing them with the reported data, it can surface discrepancies that suggest error or manipulation, and it is particularly valuable for larger or more complex datasets. Together, the two tools give researchers a powerful way to scrutinise study data for inconsistencies.
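The reconstruction idea behind Sprite can be illustrated with a toy search (this is not the published Sprite code): hold the sum, and hence the mean, of a candidate integer sample fixed, then nudge pairs of values apart or together until the sample standard deviation approaches the reported one. If no candidate sample on the measurement scale can match the reported statistics, they deserve a closer look:

```python
import random
import statistics

def sprite_search(n, mean, sd, lo, hi, tol=0.05, iters=50000):
    """Toy Sprite-style search for n integers in [lo, hi] with the
    reported mean and (sample) SD. Returns a candidate sample, or None.
    """
    total = round(mean * n)
    if not (lo * n <= total <= hi * n):
        return None  # the mean itself is impossible on this scale
    # start as flat as possible while preserving the required sum
    base, extra = divmod(total, n)
    values = [base + 1] * extra + [base] * (n - extra)
    for _ in range(iters):
        gap = abs(statistics.stdev(values) - sd)
        if gap <= tol:
            return values
        i, j = random.randrange(n), random.randrange(n)
        # move one unit between two entries: sum and mean are unchanged
        if i != j and values[i] < hi and values[j] > lo:
            values[i] += 1
            values[j] -= 1
            if abs(statistics.stdev(values) - sd) > gap:
                values[i] -= 1   # revert moves that widen the gap
                values[j] += 1
    return None
```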

Wilkinson reiterates the importance of evaluating raw datasets. If the data appears less significant than the reported p-value suggests, it may indicate manipulation, and recalculating the statistics from the reported data can expose the mismatch. ‘Question whether the p-value they’ve reported is even compatible with the summary data they’ve reported,’ insists Wilkinson. ‘Quite often the answer is no, because if somebody’s just sat down and typed some numbers into a table, they haven’t stopped to think “Oh, this all needs to be consistent.”’ In some cases, blocks of data are repeated in the spreadsheet, ‘to give the impression of there being more data […] and one possible explanation is that this is a crude attempt at fabrication’, adds Wilkinson.
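Checking that compatibility needs nothing beyond the summary statistics a paper reports. The sketch below (function name illustrative) recomputes a two-sided p-value for a difference in group means using a normal approximation, which is adequate for large groups; a t-distribution is more exact for small ones. If the result disagrees badly with the published p-value, the summary data and the p-value cannot both be right:

```python
from math import sqrt
from statistics import NormalDist

def approx_two_sample_p(mean1, sd1, n1, mean2, sd2, n2):
    """Two-sided p-value for a difference in means, recomputed from
    reported summary statistics via a normal (z) approximation."""
    se = sqrt(sd1**2 / n1 + sd2**2 / n2)  # standard error of the difference
    z = (mean1 - mean2) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))
```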

Verifying methodological feasibility

Scrutinising the methods section of papers could give insight into the authenticity of the results reported. ‘Consider the protocol you’ve just read. Could they have done this with the resources available to them, undertaking this barrage of assessments on 400 people in the space of two weeks, for instance,’ advises Wilkinson. ‘A lot of dodgy studies are flagged because they fall down on those kinds of things.’

It is easier to detect methodological inconsistencies in some fields than others. ‘It would be hard to commit fraud in astrophysics, because if you claimed something existed, everyone else would point their telescope at it, and it would quickly become clear if it was there or not,’ explains Bishop. ‘Fraud is easiest in experimental fields where studies are hard to do and may take some years, making it difficult for others to challenge them. I suspect this may make some areas of chemistry more susceptible than others.’

Reporting problematic papers

David Bimler

Most sleuths recommend reporting suspicious papers on PubPeer – a post-publication peer review website. ‘PubPeer frowns upon blunt accusations like “paper milling” and insists that you stick to objective scientific issues when you leave a comment in the discussion threads there, so it’s important to read their policies so they don’t have to moderate your comments,’ explains Bimler. ‘That first step saves time if you want to bring your concerns to the attention of publishers and journal editors – you can link to the PubPeer thread.’

Communicating scientific inconsistencies requires a balanced approach. ‘[The findings] should be carefully worded to avoid direct accusations of fraud, just laying out any facts regarding inconsistencies,’ advises Bishop. ‘In some cases I will call journals or publishers out on X (Twitter), or I might work together with journalists trying to alert a broader audience about these problems.’

Meanwhile, the Problematic Paper Screener platform serves as a resource for identifying papers exhibiting signs of misconduct, allowing researchers to check the integrity of studies before relying on them. Checking flagged papers on the Retraction Watch database can help track whether fraudulent studies have been withdrawn or revised.

If you are going to make a formal complaint of misconduct, talk to a lawyer who specialises in academic fraud first

While directly reporting to the research integrity team in journals often kickstarts the investigative process, ‘journal editors might explain that “our policies don’t allow us to retract a paper without the permission of the authors”, or “responsibility for the paper lies with the institutions where the authors work so we have to wait for the results of an institutional inquiry before we take any action” – and meanwhile the institution prefers to leave it up to the journal,’ says Bimler. ‘In general, lower your expectations before you embark on this research integrity hobby.’

Broaching the topic of fraudulent research can be tricky and ‘there are real risks to whistleblowers,’ adds Bishop. ‘But I recently heard an interesting talk from a lawyer specialising in research fraud: if you are going to make a formal complaint of misconduct, talk to a lawyer who specialises in academic fraud first. They can handle the complaint on your behalf if they think it is valid.’

Advice from publishers

Anna Pendlebury

Journals often adhere to guidelines established by bodies like the Committee on Publication Ethics, which provides clear protocols for handling allegations of misconduct. If you suspect scientific fraud in publications, Anna Pendlebury, a publishing ethics specialist at the Royal Society of Chemistry (RSC), suggests you ‘let the handling editor or the journal’s editorial team know as soon as possible. It is best if you can share as much information as possible about your concerns so that the editor can determine whether there is enough evidence to be able to take action’.

In line with these protocols, journals also ensure that authors are given the opportunity to respond to any allegations. ‘[Then] it may be necessary to publish an update to the scientific record in the form of a correction, expression of concern, or a retraction,’ adds Pendlebury. In certain circumstances, journals may even refer the matter to an author’s institution for further investigation.

It is common for research integrity teams from different publications to collaborate in addressing research fraud. ‘The RSC is working with other publishers to tackle paper mills and help identify potential paper mill submissions during the peer review process. We are also a signatory of United2Act, an international collaboration involving a range of stakeholders committed to address the challenge of paper mills.’

Resources to start your sleuthing
