A mysterious company’s coronavirus papers in top medical journals may be unraveling

A hydroxychloroquine study is being audited.


Science’s COVID-19 reporting is supported by the Pulitzer Center.

On its face, it was a major finding: Antimalarial drugs touted by the White House as possible COVID-19 treatments looked to be not just ineffective, but downright deadly. A study published on 22 May in The Lancet used hospital records procured by a little-known data analytics company called Surgisphere to conclude that coronavirus patients taking chloroquine or hydroxychloroquine were more likely to show an irregular heart rhythm—a known side effect thought to be rare—and were more likely to die in the hospital.

Within days, some large randomized trials of the drugs—the type that might prove or disprove the retrospective study’s analysis—screeched to a halt. Solidarity, the World Health Organization’s (WHO’s) megatrial of potential COVID-19 treatments, paused recruitment into its hydroxychloroquine arm, for example.

But just as quickly, the Lancet results have begun to unravel—and Surgisphere, which provided patient data for two other high-profile COVID-19 papers, has come under withering online scrutiny from researchers and amateur sleuths. They have pointed out many red flags in the Lancet paper, including the astonishing number of patients involved and details about their demographics and prescribed dosing that seem implausible. “It began to stretch and stretch and stretch credulity,” says Nicholas White, a malaria researcher at Mahidol University in Bangkok.

Today, The Lancet issued an Expression of Concern (EOC) saying “important scientific questions have been raised about data” in the paper and noting that “an independent audit of the provenance and validity of the data has been commissioned by the authors not affiliated with Surgisphere and is ongoing, with results expected very shortly.”

Hours earlier, The New England Journal of Medicine (NEJM) issued its own EOC about a second study using Surgisphere data, published on 1 May. The paper reported that taking certain blood pressure drugs including angiotensin-converting enzyme (ACE) inhibitors didn’t appear to increase the risk of death among COVID-19 patients, as some researchers had suggested. (Several studies analyzing other groups of COVID-19 patients support the NEJM results.) “Recently, substantive concerns have been raised about the quality of the information in that database,” an NEJM statement noted. “We have asked the authors to provide evidence that the data are reliable.”

A third COVID-19 study using Surgisphere data has also drawn fire. In a preprint first posted in early April, Surgisphere founder and CEO Sapan Desai and co-authors conclude that ivermectin, an antiparasitic drug, dramatically reduced mortality in COVID-19 patients. In Latin America, where ivermectin is widely available, that study has led government officials to authorize the drug—although with precautions—creating a surge in demand in several countries.

Chicago-based Surgisphere has not publicly released the data underlying the studies, but today Desai told Science through a spokesperson that he was “arranging a nondisclosure agreement that will provide the authors of the NEJM paper with the data access requested by NEJM.”

Meanwhile, the questions swirling around the Lancet paper have left leaders of the halted chloroquine trials weighing whether to restart. “The problem is, we are left with all the damage that has been done,” says White, a co-investigator on a trial of hydroxychloroquine for COVID-19 prevention that was halted at the request of U.K. regulators last week. Headlines proclaiming deadly effects will make it hard to recruit patients to key studies, he says. “The whole world thinks now that these drugs are poisonous.”

A striking observation

Desai’s co-authors on the Lancet paper were cardiologist Mandeep Mehra of Harvard University’s Brigham and Women’s Hospital (BWH), cardiologist Frank Ruschitzka of the University Hospital Zürich, and cardiac surgeon Amit Patel, who listed affiliations with the University of Utah and HCA Research Institute in Nashville, Tennessee. Their study described an analysis of Surgisphere-provided electronic health record data from patients already treated for COVID-19 at 671 hospitals on six continents.

According to the paper, the analysis included nearly 15,000 patients prescribed chloroquine or hydroxychloroquine, alone or in combination with a class of antibiotics that has been suggested to boost its effects. A control group consisted of more than 81,000 patients who hadn’t gotten the experimental drugs. After controlling for potentially confounding factors such as age, race, pre-existing disease, and COVID-19 severity, the researchers found that the risk of dying in the hospital was 9.3% for the control group versus 23.8% for those getting hydroxychloroquine alongside an antibiotic—apparently the riskiest of the treatment combinations. The results echoed a preprint published last month, based on a much smaller group of patients in U.S. Veterans Health Administration medical centers, which suggested an increased risk of death for patients who were prescribed hydroxychloroquine alone (though not in combination with an antibiotic).

In a 25 May media briefing, WHO Director-General Tedros Adhanom Ghebreyesus cited the Lancet results in announcing a “temporary pause” in Solidarity’s hydroxychloroquine arm. Regulators in France and the United Kingdom also instructed investigators, including White’s team, to halt enrollment in trials of the malaria drug. And Sanofi, which manufactures the branded hydroxychloroquine drug Plaquenil, said it would temporarily stop recruiting patients to its two clinical trials of the drug.

The Lancet authors acknowledged that their results needed confirmation from more rigorous randomized trials, but in an interview with TRT World, a Turkish channel for international news, Desai expressed confidence. “The real question is: With data like this, do we even need a randomized controlled trial?” he said.

Other researchers immediately took issue with the analysis. The study doesn’t properly control for the likelihood that patients getting the experimental drugs were sicker than the controls, says Matthew Semler, a critical care physician at Vanderbilt University. “If you have a physician sitting with two patients who have coronavirus, and the physician chooses to give one of them hydroxychloroquine, they’re doing it for a reason,” he says. The patient may be relying on high levels of supplemental oxygen, for example, or getting worse over time. But those kinds of details aren’t available about the patients in the Lancet study, he notes.

Other researchers were befuddled by the data themselves. Though 66% of the patients were reportedly treated in North America, the reported doses tended to be higher than the guidelines set by the U.S. Food and Drug Administration, White notes. The authors claim to have included 4402 patients in Africa, 561 of whom died, but it seems unlikely that African hospitals would have detailed electronic health records for so many patients, White says.

The study also reported more deaths in Australian hospitals than the country’s official COVID-19 death statistics, The Guardian reported. On 29 May, The Lancet issued a correction updating a supplemental table and saying that a hospital assigned to the study’s “Australasia” group should have been assigned to Asia. “There have been no changes to the findings of the paper,” the correction notice said.

Deepening skepticism

The brief response left some researchers frustrated. “This was very, very annoying, that The Lancet were just going to let them write this absurd reply … without addressing any of the other concerns,” says James Watson, a statistician at Mahidol who on 28 May published an open letter to the journal and the study’s co-authors, signed by more than 200 clinicians and researchers, that calls for the release of Surgisphere’s hospital-level data, an independent validation of the results, and publication of the peer review comments that led to the Lancet publication.

Today, many of the same researchers published an open letter to NEJM and the authors of the ACE inhibitor study, citing similar problems in that journal’s paper. The letter notes a discrepancy between the small number of hospitals in each country that are reported to have shared patient data with Surgisphere and the high proportion of those countries’ confirmed COVID-19 cases reported in the study. It also notes inconsistencies in the reported increases in the risk of COVID-19 death with increasing age of participants.

Mehra and Patel declined to speak to reporters about the various papers, referring inquiries to BWH, which released a statement on Mehra’s behalf this evening saying “independent of Surgisphere, the remaining co-authors of the recent studies published in The Lancet and The New England Journal of Medicine have initiated independent reviews of the data used in both papers after learning of the concerns that have been raised about the reliability of the database.” (Ruschitzka, who is on the Lancet paper, has not yet responded to Science’s requests for comment.)

Oddities also appear in the ivermectin study, says Carlos Chaccour of the Institute for Global Health in Barcelona, who knows the drug well because he’s studying its potential role in mosquito control. There’s evidence that ivermectin has antiviral properties, and a study from an Australian team published in Antiviral Research on 3 April showed that it inhibits SARS-CoV-2 in a test tube. A 6 April preprint co-authored by Patel, Desai, and Mehra, along with David Grainger of the University of Utah, used Surgisphere data reportedly collected at 169 hospitals around the world between 1 January and 1 March. It included three patients in Africa who received ivermectin—despite the fact that only two COVID-19 cases had been reported in all of Africa by 1 March, Chaccour and two colleagues note in a recent blog post.

Chaccour says after he inquired about the discrepancy, the authors posted a second, longer version of the manuscript on 19 April, containing data collected between 1 January and 31 March. (The first version was removed but Chaccour has archived it on his institute’s website.) The new manuscript contained data on 704 COVID-19 patients treated with ivermectin and 704 controls in 169 hospitals on three continents. It reported that ivermectin reduced the need for mechanical ventilation by 65% and slashed the death rate by 83%.

But the revision had other problems, Chaccour and his colleagues wrote in their blog post. For example, the mortality rate for patients who received mechanical ventilation but no ivermectin was just 21%, which is strikingly low; a recent case series from the New York City area found that 88% of COVID-19 patients who needed ventilation died. Also, the data shown in a figure were wildly different from those reported in the text. (Science also attempted to reach Grainger, but received no reply to an email and a phone call.)

The ivermectin study has had a significant impact in Latin America, where the drug is well known and often sold over the counter to treat scabies, Chaccour says. The Peruvian Health Ministry modified its COVID-19 treatment protocol to include ivermectin (as well as hydroxychloroquine) for mild and severe cases of COVID-19. Demand for the drug in Peru has surged, and in the San Martín de Porres district, police confiscated 20,000 bottles of veterinary ivermectin intended to be sold for human treatments. In Trinidad, Bolivia, the city government aimed to hand out more than 350,000 free doses of ivermectin after the country’s Ministry of Health authorized its use against COVID-19. Physicians in the Dominican Republic, Peru, and Chile, citing the test-tube study and the Surgisphere preprint, say they performed informal trials of ivermectin with COVID-19 patients and saw good outcomes.

(In a guest editorial in The American Journal of Tropical Medicine and Hygiene, Chaccour and three colleagues cautioned against the risks of using ivermectin without solid evidence and urged rigorous clinical trials. Eighteen such studies are ongoing, according to the website ClinicalTrials.gov, including one led by Chaccour in Pamplona, Spain.)

Surgisphere’s sparse online presence—the website doesn’t list any of its partner hospitals by name or identify its scientific advisory board, for example—has prompted intense skepticism. Physician and entrepreneur James Todaro of the investment fund Blocktown Capital wondered in a blog post why Surgisphere’s enormous database doesn’t appear to have been used in peer-reviewed research studies until May. Another post, from data scientist Peter Ellis of the management consulting firm Nous Group, questioned how LinkedIn could list only five Surgisphere employees—all but Desai apparently lacking a scientific or medical background—if the company really provides software to hundreds of hospitals to coordinate the collection of sensitive data from electronic health records. (This morning, the number of employees on LinkedIn had dropped to three.) And Chaccour wonders how such a tiny company was able to reach data-sharing agreements with hundreds of hospitals around the world that use many different languages and data recording systems, while adhering to the rules of 46 different countries on research ethics and data protection.

Desai’s spokesperson responded to inquiries about the company by saying it has 11 employees and has been developing its database since 2008. Desai, through the spokesperson, also said of the company’s work with patient data: “We use a great deal of artificial intelligence and machine learning to automate this process as much as possible, which is the only way a task like this is even possible.”

What next?

The potential of hydroxychloroquine for treating COVID-19 has become a political flashpoint, and the questions around the Lancet paper have provided new fodder to the drug’s supporters. French microbiologist Didier Raoult, whose own widely criticized studies suggested a benefit from the drug, derided the new study in a video posted today, calling the authors “incompetent.” On social media, some speculated that the paper was part of a conspiracy against hydroxychloroquine.

For scientists running randomized trials of hydroxychloroquine, an urgent question has been how to respond to the paper and the many questions raised about it. Some studies were not halted at all. A hydroxychloroquine trial known as ORCHID, funded by the U.S. National Heart, Lung, and Blood Institute, opted to keep running after its data and safety monitoring board (DSMB) reviewed safety data from already enrolled participants, says Semler, a co-investigator on the study.

WHO’s paused Solidarity trial is awaiting a similar review from its DSMB, says Soumya Swaminathan, the organization’s chief scientist. The pause will allow time for a review of published studies and interim data from Solidarity itself, she says. WHO paused the trial to show investigators and potential study participants that the agency takes safety issues seriously, she says. “We want to reassure people that the WHO didn’t make any kind of value judgment on the use of hydroxychloroquine.”

But some say WHO had a knee-jerk reaction to a questionable study. “This is a drug that has been used for decades. It’s not like we know nothing about its safety,” says Miguel Hernán, a Harvard epidemiologist and co-investigator on an ongoing trial of hydroxychloroquine in Spain and Latin America for COVID-19 prevention in health care workers.

The controversy has been an unfortunate distraction, Hernán adds. “If you do something as inflammatory as this without a solid foundation, you are going to make a lot of people waste time trying to understand what is going on.”

Chaccour says both NEJM and The Lancet should have scrutinized the provenance of Surgisphere’s data more closely before publishing the studies. “Here we are in the middle of a pandemic with hundreds of thousands of deaths, and the two most prestigious medical journals have failed us,” he says.

With reporting by Rodrigo Pérez Ortega, Charles Piller, and John Travis.

source: sciencemag.org