Review: On the problems of mixing RCTs with qualitative research: the case of the MRC framework and the evaluation of complex healthcare interventions.
Abstract
This is a timely and important contribution to thinking about the social production of evidence in health care. The authors note the recent tendency to incorporate qualitative research into process evaluation within RCTs, in order to acknowledge the meanings that social actors give to health care situations. However, they argue that the epistemological certainties of the positivism underpinning RCTs are incommensurable with the relativism and scepticism characteristic of interpretivism. The paper lays bare the naive conception of cause and effect at the heart of RCTs, and suggests that critical realism may provide a framework for evaluating complex healthcare interventions. Critical realism does not assume, as RCTs do, that a constant conjunction of events can be taken to imply cause and effect. This is because whatever underlying mechanisms may be at work, in particular contexts these mechanisms may be counteracted by other mechanisms, or a mechanism may be present without being activated. The authors therefore propose a model in which the results of positivist RCTs on the ‘closed’ systems of tightly controlled experimental designs are given qualified recognition, pending broader critical realist (‘realistic’) evaluation of the operation of the key mechanisms in the ‘open’ systems of complex real-life healthcare settings.
As a challenge to the naivety of traditional RCT work this article makes an important contribution, though the resolution (critical realism, rather than interpretivism, as a handmaiden to positivist RCTs) leaves me with a number of concerns.

First, there is no recognition that one of the contexts of RCTs is the social interest and power of medicine as a profession. As the work of Saks (1991) shows in the case of alternative medicine, what is first derided, then antagonistically challenged, by medical power can finally be incorporated. If ‘realistic evaluation’ ever features in required MRC standards for RCTs, then we can safely assume that this is what has transpired, as medics colonize new technologies of research for their own purposes.

Second, as Byrne (1998) has pointed out, scientific systems are not as closed as one might assume, and therefore have more in common with ‘open’ social systems than might at first appear to be the case. The limitation of the authors’ characterization of RCTs is that although they appear to be aware of the work of Thomas Kuhn (briefly: science is shaped by social interests and not by evidence alone, and science as actually carried out in scientific communities is not a ‘closed’ system), they do not apply these insights to the conduct of RCTs. Readers might also usefully look at Kerr et al. (1997) for an example of social interests at work when scientists contingently move the boundaries between what is considered a pure ‘closed’ system (science) and a politically contaminated ‘open’ system (health policy).

Third, the characterization of experiments as ‘closed’ has been argued to be misconceived (Latour, 2000): experiments are not so much about controlling variables as about according things maximum freedom from context. Rather than accept the positivism of RCTs, why not apply critical realism here too? RCTs reconceptualized as maximizing the freedom of patients to strike back would be a start.