Scientific Fraud: How Journals Detect Image Manipulation (Part 2)

  Jan 31, 2018   Enago Academy

In the first part of this series, we highlighted the challenges of image manipulation and the perspectives of authors and journals on the issue.

So who’s really in charge of making sure that journals do not publish manipulated images? Mike Rossner, for one. While at the Journal of Cell Biology (JCB), he discovered that “about 1% of accepted papers had manipulated images that affected their conclusions; another 25% had some sort of manipulation that violated guidelines.” After Rossner left JCB, he founded Image Data Integrity (IDI), which identifies biomedical data manipulations.

Although Rossner concurs with Bik that some image manipulations may not be strictly unethical, he counsels caution, since something as innocuous as “clean[ing] up unwanted background in an image” can have unintended consequences. “[W]hat may seem to be a background band or contamination may actually be real and biologically important,” Rossner warns, “and could be recognized as such by another scientist.”

Change from Within

Outsourcing image screening to a company like IDI is one approach to identifying altered scientific images. Another solution is that of Germany’s EMBO Press, where in-house image detective Jana Christopher is a full-time image screener. “I check to see if micrographs, photographs, and data … are duplicated or illicitly manipulated,” Christopher explains. “I use tools … [that] allow you in a semi-automated procedure to adjust the contrasts and settings to highlight flaws. It’s a simple process, but you need to know where to look. For example, it is much harder to spot when the blank background on a gel has been cloned—it takes an experienced eye to spot those patterns. I’m very aware of the limitations. If someone really tries to cheat and they cheat well, it’s unlikely that we would see that. But most of what we find are genuine mistakes, which we prevent from entering the scientific literature.”

Forensic Technologies

The forensic tools Christopher and others rely on include Photoshop droplets, as well as Adobe Bridge and ImageJ. Droplets are particularly useful when screening for adjustments of light or dark areas and comparing two images. Bridge is suited to rapid processing of large image batches, and the highly versatile ImageJ freeware can produce quantitative scans of gel bands and display multiple images simultaneously.
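The semi-automated screening Christopher describes rests on two simple ideas: stretching a narrow band of background intensities so faint tampering becomes visible, and hunting for regions that repeat exactly, since real camera or scanner noise never duplicates itself. The tools named above are not scriptable here, but the underlying logic can be sketched in plain Python with NumPy. This is an illustrative sketch only; the function names and parameter choices are our own assumptions, not the actual workflow used at EMBO Press.

```python
import numpy as np

def stretch_background(img, low_pct=1, high_pct=20):
    """Rescale a narrow low-intensity band to the full [0, 1] range.

    Exaggerating contrast in the near-background band can reveal
    faint seams or cloned regions that are invisible at normal
    display settings. Percentile bounds are illustrative defaults.
    """
    lo, hi = np.percentile(img, [low_pct, high_pct])
    out = (img.astype(float) - lo) / max(hi - lo, 1e-9)
    return np.clip(out, 0.0, 1.0)

def find_duplicate_blocks(img, block=8):
    """Flag pairs of pixel-identical non-overlapping blocks.

    Exact repeats are a telltale sign of clone-stamping, because
    genuine sensor noise essentially never duplicates itself.
    (A perfectly flat synthetic background would trigger many
    matches, so this check is only meaningful on noisy images.)
    """
    h, w = img.shape
    seen, dupes = {}, []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            key = img[y:y + block, x:x + block].tobytes()
            if key in seen:
                dupes.append((seen[key], (y, x)))
            else:
                seen[key] = (y, x)
    return dupes
```

In practice, screeners run far more sophisticated, perceptually tuned versions of these checks, but as Christopher notes, the harder part is knowing where to look and recognizing suspicious patterns by eye.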

As the technology for detecting image manipulation expands and improves, so will the technology for manipulating images. These software tools allow reviewers, editors, and scientists to identify lapses in image integrity, but are they also giving rise to unprecedented scientific misconduct?

For Better or for Worse?

“Research error and misconduct have probably always existed,” Bik points out. “Even scientific luminaries such as Darwin, Mendel, and Pasteur have been accused of manipulating or misreporting their data.” The low resolution of older illustrations likely made manipulation more difficult to discern, but Bik still believes that “the widespread availability and usage of digital image modification software in recent years may have provided greater opportunity for both error and intentional manipulation.”

Bik’s solution places responsibility with authors: “One possible mechanism to reduce errors at the laboratory level would be to involve multiple individuals in the preparation of figures for publication. The lack of correlation between author number and the frequency of image duplication suggests that the roles of most authors are compartmentalized or diluted, such that errors or misconduct are not readily detected.”

What’s Your Take?

Should authors follow Bik’s model for conducting rigorous due diligence before submitting papers? Would involving multiple scientists in fact-checking increase or decrease the likelihood of error?

Should journals and publishing houses conduct due diligence against image manipulation internally? If so, should peer reviewers be part of that initial process, or should specialist editors screen for manipulated images in accepted papers only?

Or do you most trust a company like IDI to provide impartial, high-quality screening and detection services?

Please share your thoughts in comments! Then take a moment to find out more about how journals respond when a problematic image does make it into a published paper and how to guard against scientific misconduct in your own work.

Still not convinced that manipulation of images always constitutes research misconduct? Read on for tips on how to edit your images ethically, the acceptable and unacceptable categories of image manipulation, and specific techniques for staying on the right side of science.
