Monday, January 16, 2006

Reuters on what journals should do in reviewing frontier science

By Maggie Fox, Health and Science Correspondent
WASHINGTON (Reuters) - Meticulous tests like those done to confirm that disgraced Korean scientists legitimately cloned a dog while faking human data may have to be used to validate scientific claims in the future, U.S. scientists said on Tuesday.

A panel at Seoul National University concluded that two reports claiming that human embryos had been cloned to provide stem cells had been completely fabricated, but also found that the same team's claims to have cloned a dog were in fact true.

Dr. Elaine Ostrander of the National Human Genome Research Institute said her team helped confirm that "Snuppy," created last year by Hwang Woo-suk of Seoul National University and colleagues, genetically matched the adult Afghan hound he was cloned from.

"We saw a complete match between the donor Afghan Tai and the putative clone Snuppy," Ostrander told reporters in a telephone briefing.

But it was not a 100 percent match -- and that small mismatch was itself evidence that a method called somatic cell nuclear transfer, or SCNT, was used to clone the animal, just as Hwang had claimed, Ostrander said.

In SCNT, the nucleus of the adult cell to be cloned is removed and placed in a hollowed-out egg cell. Chemical or electrical triggers are used to start the egg growing as if it had been fertilized by a sperm.

The resulting animal will carry DNA from the adult cell, but also a tiny amount of DNA, called mitochondrial DNA, from the donor egg. This is what Ostrander's team found with Snuppy.
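The inference the team made can be sketched in a few lines of code. This is a hypothetical illustration of the reasoning described above, not the team's actual analysis; the marker names and haplotype labels are invented for the example.

```python
# Hypothetical sketch: a true SCNT clone matches its donor's nuclear DNA
# markers but carries mitochondrial DNA from the egg donor, not the donor.
# All marker and haplotype values below are invented for illustration.

def consistent_with_scnt(donor, egg_donor, putative_clone):
    """Return True if the clone's DNA profile fits SCNT cloning."""
    nuclear_match = putative_clone["nuclear"] == donor["nuclear"]
    mito_from_egg = (putative_clone["mito"] == egg_donor["mito"]
                     and putative_clone["mito"] != donor["mito"])
    return nuclear_match and mito_from_egg

tai = {"nuclear": ["A1", "B7", "C3"], "mito": "haplotype-X"}
egg = {"nuclear": ["A4", "B2", "C9"], "mito": "haplotype-Y"}
snuppy = {"nuclear": ["A1", "B7", "C3"], "mito": "haplotype-Y"}

print(consistent_with_scnt(tai, egg, snuppy))  # True: fits a genuine clone
```

A naturally bred or faked "clone" would fail one of the two checks: an ordinary pup would not match the donor's nuclear markers, and an identical twin or split embryo would carry the donor's own mitochondrial DNA.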

Ostrander said her team's methods were inexpensive and might become more common after the Hwang scandal.

"In 20-20 hindsight you would say, yes, this sort of validation should be done on this kind of scientific breakthrough," Ostrander said.

"Obviously as a result of this strict interrogation, this is something that will arise," she added.

Science, the journal that published the fabricated papers on embryonic stem cells, said it would withdraw them and would examine how it had been duped.

"I have pointed out in the past that even unusually rigorous peer review of the kind we undertook in this case may fail to detect cases of well-constructed fraud," said Dr. Donald Kennedy, editor in chief of Science.


Experts inside and outside the journal, which is published by the American Association for the Advancement of Science, would be asked what extra safeguards might prevent future scandals, including affidavits from each team member, Kennedy said.

"We are implementing improved methods of detecting image alteration, although it appears improbable that they would have detected problems in this particular case," he said.

Dr. Curt Civin, editor in chief of another journal, Stem Cells, said one of the photographs used in Hwang's Science paper had appeared in his journal in 2003, as part of a paper about human stem cells that had not been created using cloning technology.

"We have asked the authors to explain this," Civin said in a telephone interview. Hwang did not work on the 2003 paper, but members of his team did.

Scientific fraud is rare and the scandal has upset researchers whose field was already under scrutiny because some people oppose doing work with human embryos at all.

"We are all agonizing over this," Civin said. "We always assume a scientist is trying his or her hardest to present the data as truthfully as possible. Now I think every reviewer and every journal is going to look harder at papers," he added.

"In the past, if you made a big claim, you had to have big proof. Now the proof is going to have to be bigger."

Most medical and scientific research is published in journals because of the peer review process. Experts in a field are asked to look at each study, to confirm that it was conducted properly and that the data support the claims.

It is a highly competitive field, with some journals charging subscribers tens of thousands of dollars a year, and with researchers vying to get into the most prestigious publications. Science and Nature, rivals that published the three Hwang papers, are considered among the most prestigious.


Discussion of reviewing by Newsweek:

Whatever the reasons, the whims of the editors at Science and Nature loom large for many scientists. When either magazine is considering a paper for publication, the authors are told not to speak to the press lest they risk rejection. "Every scientist hates them and loves them," says a prominent scientist who would not speak for attribution for fear of offending the editors. "We hate them because it's so political to get an article in them. Frankly I'm astonished at some of the things they accept, and some of the things they reject."

Whether the clamor to appear in these journals has any bearing on their ability to catch fraud is another matter. The fact is, fraud is terrifically hard to spot. Consider the process Science used to evaluate Hwang's 2005 article. Science editors recognized the manuscript's import almost as soon as it arrived. As part of the standard procedure, they sent it to two members of its Board of Reviewing Editors, who recommended that it go out for peer review (about 30 percent of manuscripts pass this test). This recommendation was made not on the scientific validity of the paper, but on its "novelty, originality, and trendiness," says Denis Duboule, a geneticist at the University of Geneva and a member of Science's Board of Reviewing Editors, in the January 6 issue of Science. (Editors would not comment for this story ahead of the completion of Seoul National University's investigation, which was released today. The panel found that Hwang had fabricated all of the evidence for research that claimed to have cloned human cells, but that he had successfully cloned the dog Snuppy.)

After this, Science sent the paper to three stem-cell experts, who had a week to look it over. Their comments were favorable. How were they to know that the data was fraudulent? "You look at the data and do not assume it's fraud," says one reviewer, anonymously, in Science. At a December news conference, editor-in-chief Donald Kennedy maintained that the paper, despite its importance, was not rushed to print. "Any important paper gets careful scrutiny, and I think our peer reviewers gave it that," he said. "It's very difficult for a peer-review process to detect mistakes that are not clearly evident or are deliberate misrepresentations."

In the end, a big scandal now and then isn't likely to do much damage to the big scientific journals. What editors and scientists worry about more are the myriad smaller infractions that occur all the time, and which are almost impossible to detect. A Nature survey of scientists published last June found that one-third of all respondents had committed some form of misconduct. These included falsifying research data and having "questionable relationships" with students and subjects—both charges leveled against Hwang. Nobody really knows if this kind of fraud is on the rise, but it is worrying.

Science editors don't have any plans to change the basic editorial peer-review process as a result of the Hwang scandal. They do have plans to scrutinize photographs more closely in an effort to spot instances of fraud, but that policy change had already been decided when the scandal struck. And even if it had been in place, it would not have revealed that Hwang had misrepresented photographs from two stem cell colonies as coming from 11 colonies. (Nature's Campbell would not answer questions about review policy or the status of Hwang's 2005 Nature paper on the cloning of Snuppy the dog, which Nature is investigating.) With the financial and deadline pressures of the publishing industry, it's unlikely that the journals are going to take markedly stronger measures to vet manuscripts. Beyond replicating the experiments themselves, which would be impractical, it's difficult to see what they could do to take science beyond the honor system.

