Making trials work

Translation of research findings into clinical practice is a key consideration when running definitive trials. Irrespective of whether the intervention is complicated (detailed but predictable) or complex (detailed and unpredictable), an understanding of the range of factors that influence the adoption of evidence is critical. As highlighted by Glasziou & Haynes over ten years ago: “evidence-based practice should not just be concerned with clinical content but also with the processes of changing care and systems of care”.

Trialists adopt a positivist approach to evaluation: two or more identical systems are created, into which the intervention is introduced. Observations are then made of the outcome differences between experimental and control conditions. Other factors that could influence the outcome (known as “confounders”) are accounted for either a priori (e.g. stratification) or post hoc (e.g. regression). However, many complex interventions “involve the actions of people, are rarely linear, and comprise a series of steps or processes that interact, are prone to modification and exist in open and dynamic systems that change as the result of learning” (Pawson, 2013). In addition, “effect sizes do not provide policy makers with information on how an intervention might be replicated in their specific context, or whether trial outcomes will be reproduced” (Moore et al., 2015).
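As a minimal illustration of the a priori adjustment mentioned above, the sketch below simulates hypothetical trial data in which a pre-specified stratification factor (here, study site) influences the outcome, and then estimates the treatment effect as a weighted average of within-stratum differences. The data and function names (`simulate`, `stratified_effect`) are purely illustrative, not drawn from any NWORTH trial.

```python
import random

random.seed(0)

# Hypothetical trial data: each record = (arm, stratum, outcome).
# The stratum (e.g. study site) shifts the outcome, so we pre-specify it
# as a stratification factor and estimate the treatment effect within
# each stratum before pooling.
def simulate(n=2000):
    data = []
    for _ in range(n):
        stratum = random.choice(["site_A", "site_B"])
        arm = random.choice(["control", "intervention"])
        base = 10.0 if stratum == "site_A" else 14.0    # stratum effect
        effect = 2.0 if arm == "intervention" else 0.0  # true effect = 2
        data.append((arm, stratum, base + effect + random.gauss(0, 1)))
    return data

def stratified_effect(data):
    """Weighted average of within-stratum mean differences."""
    strata = {s for _, s, _ in data}
    total, weight = 0.0, 0
    for s in strata:
        treat = [y for a, st, y in data if st == s and a == "intervention"]
        ctrl = [y for a, st, y in data if st == s and a == "control"]
        diff = sum(treat) / len(treat) - sum(ctrl) / len(ctrl)
        n = len(treat) + len(ctrl)
        total += diff * n
        weight += n
    return total / weight

data = simulate()
print(round(stratified_effect(data), 1))  # close to the true effect of 2.0
```

The weighting by stratum size mirrors what a stratified randomisation and analysis plan does in practice: the between-site difference in outcome never contaminates the treatment comparison, because arms are only ever compared within the same stratum.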

NWORTH is a leading Clinical Trials Unit in pragmatic trials and trials of complex interventions. We are particularly interested in improving the external validity of what we do, i.e. ensuring that the trials we design and support lead to evidence that can be translated into practice. As a result, we link strongly with Implement@Bangor and are taking forward several methodological strands that link trial design with our internationally leading research in Implementation Science.

Our key areas of mutual interest are:

  • Patient and Public Involvement
  • Co-creation and co-production
  • Trial design and efficiency (including recruitment and retention)
  • Process evaluation
  • Impact and influence

Ongoing projects include:

  • Implementation science in trial design
  • Realist process evaluation alongside trials
  • Efficiency in trial design (linking with TrialForge)
  • Rapid Realist Reviews (oral disease in dependent older people, remuneration in primary care dentistry, and preventive interventions for children)
  • Experience Based Co-Design with stroke patients

See some of our published research here and here.

  • Glasziou P, Haynes B. EBN 2005;8(4):36-38.
  • Moore GF, Audrey S, Barker M, et al. BMJ 2015;350:h1258.
  • Pawson R. The Science of Evaluation. London: Sage Publications; 2013.
  • Treweek S, Altman DG, Bower P, et al. Trials 2015;16:261.