It’s been a tough week for the editors of the venerable journal Science.
First came the controversy surrounding their decision, at the request of government security officials, to omit key details in a paper describing how researchers had managed to create an aerosolized form of the deadly H5N1 bird flu virus. The government feared that such research could fall into the wrong hands and fuel a bioterror attempt.
Then, on Thursday, came the difficult decision to retract a 2009 paper that had linked a virus to chronic fatigue syndrome. The retraction wasn’t a complete surprise, however: the authors had already partially retracted some of the paper’s results, and many follow-up studies had cast serious doubt on its findings.
In the original 2009 paper, the 13 co-authors reported that a mouse virus known as xenotropic murine leukemia virus-related virus (XMRV) had consistently shown up in the blood of 67% of patients with chronic fatigue syndrome, but in only 4% of those without the condition. Led by senior author Judy Mikovits at the Whittemore Peterson Institute in Reno, the researchers maintained that they had isolated XMRV from chronic fatigue patients and that the virus was infectious and could be transmitted from person to person.
The finding provided some relief to the nearly 17 million people worldwide believed to be affected by chronic fatigue syndrome; while its symptoms of lethargy, pain and difficulty concentrating are real, doctors have struggled for years to find its cause. Initially, many doctors discounted sufferers’ complaints of generalized malaise as nothing more than stress or ordinary fatigue.
The problem with the finding was that no other labs could replicate it. Other researchers consistently failed to find the virus at the rates the original group reported, raising doubts about whether the connection between XMRV and chronic fatigue was real. In May, two new papers reported that the original findings were likely due to laboratory contamination of the blood samples. In September, a re-analysis of the original data sounded the death knell: nine separate labs, including Mikovits’ group, could not find evidence of XMRV in the original chronic fatigue patients who had previously tested positive for the virus. The authors issued a partial retraction.
A day after the re-analysis and partial retraction were published in Science, however, Mikovits defended her original results in front of an international conference of experts in chronic fatigue, arguing that the disease was caused by a virus “highly related” to XMRV.
It wasn’t enough to convince the experts or the editors at Science. On Thursday, the journal decided to retract the paper completely, saying that “Science has lost confidence in the report and the validity of its conclusions” and that the journal “regret[s] the time and resources that the scientific community has devoted to unsuccessful attempts to replicate these results.”
It’s always regrettable when science goes wrong, but it isn’t always avoidable. Journals like Science conduct rigorous reviews of the data submitted by prospective authors, consulting both experts and non-experts in the field, but short of re-doing every experiment, reviewers must at some point trust that the information in a submission is accurate. It’s not clear what went wrong with the XMRV work in question, but there is solace in one thing, both for scientists and for the patients who rely on the results of their work: science is a community of peers who rely on, replicate and share one another’s work, and that community can help reveal cases in which the data aren’t solid or, in the worst cases, are fraudulent.
In 2006, Science was forced to retract the results of South Korean scientist Woo Suk Hwang, who claimed to have perfected the cloning process in human cells. When other researchers alerted the editors to problems with the cell images in Hwang’s groundbreaking study, it emerged that Hwang’s “cloned” cells were fake. (They had in fact arisen from a well-established and familiar technique.)
Since that incident, editors at journals like Science and Nature have taken a hard look at the way they review submissions for publication, in particular those that are rushed into issues in an effort to be first to print. Science, especially good science, takes time.
Alice Park is a writer at TIME. Find her on Twitter at @aliceparkny.