Increasingly, insurance company algorithms and controversial treatment guidelines require their presence as a condition for determining the medical necessity of a treatment.
But depending on how they're used, RCTs can be incredibly dumb, even as they remain the most important tool in a clinical researcher's toolkit.
The article has received a fair share of media attention, both mainstream and social.
In case you haven’t seen any of the coverage, the study tested “if using a parachute prevents death or major traumatic injury when jumping from an aircraft.” The study compared “(j)umping from an aircraft (airplane or helicopter) with a parachute versus an empty backpack.” Set up as a rigorous, methodologically unassailable RCT, the study concluded that “(p)arachute use did not reduce death or major traumatic injury when jumping from aircraft in the first randomized evaluation of this intervention.”

What!?!? Well, as the researchers report, they were “only able to enroll participants on small stationary aircraft on the ground.” Although they tried to recruit participants both during commercial flights and on the ground, telling them they would be randomized to either a parachute or an empty backpack before jumping from the plane, they succeeded in recruiting subjects only on the ground.
The researchers also said, “Opponents of evidence-based medicine have frequently argued that no one would perform a randomized trial of parachute use. We have shown this argument to be flawed, having conclusively shown that it is possible to randomize participants to jumping from an aircraft with versus without parachutes (albeit under limited and specific scenarios).” By the way, no participants actually deployed their parachutes; if you throw around square yards of fabric and feet of strings, somebody could get hurt.

The study is also a callback. Rewind to 2003, when the BMJ (known informally to some as the Limey Medical Journal) published an article entitled “Parachute Use to Prevent Death and Major Trauma Related to Gravitational Challenge: Systematic Review of Randomised Controlled Trials.” That write-up was a response to a long-held criticism of RCTs, namely, that you don’t need them to reach reasonable conclusions about certain effects of certain actions, such as jumping out of a plane without a parachute. Indeed, the 2003 paper’s objective, “To determine whether parachutes are effective in preventing major trauma related to gravitational challenge,” met with a hard landing. “We were unable to identify any randomised controlled trials of parachute intervention,” the authors admitted. They explained further: “As with many interventions intended to prevent ill health, the effectiveness of parachutes has not been subjected to rigorous evaluation by using randomised controlled trials.”

Most generally, it’s a powerful reminder not to trust headlines. When it comes to science, you really do have to read the sometimes dense Methods and Results sections. Otherwise, too many dumb RCTs will inform clinical decision making. Too often studies conclude with something like the following, from the “parachute study,” just because it’s the conclusion of an RCT: “Our findings should give momentary pause to experts who advocate for routine use of parachutes for jumps from aircraft in recreational or military settings.” So an RCT taken out of its total clinical context is more of a lead standard than a gold standard; RCTs can be dumb.