Addressing the myths behind RCTs 

01 Jul 19

Public sector policy professionals work passionately to implement policies and address shortcomings in health, education, welfare and poverty alleviation.

This work is often undermined because the methods used to evaluate the impact of policies and programs are frequently as outdated as the policies themselves, meaning many ineffective policies remain in place.

Government is now realising the benefit of conducting randomised controlled trials (RCTs), the gold standard method for assessing the true impacts of programs and policies. However, policy practitioners are still fighting to overcome resistance from senior executives, much of it rooted in a lack of education and understanding.

A study by Arnold Ventures revealed the most common myths associated with RCTs and why they’re largely inaccurate. 

Myth 1: RCTs are expensive and slow

Debunked: RCTs are no more expensive than other forms of evaluation. Expense can be incurred when tracking individuals or organisations after the RCT to monitor outcomes; however, data collection is an expense for any form of evaluation.

Many RCTs can be done cheaply because government agencies already track key data, meaning policy experts can ‘piggyback’ on that data without the expense of collecting the data themselves. 

Myth 2: RCTs are unethical

Debunked: Critics argue that if researchers believe an intervention will improve learning, health, living conditions and so on, then withholding it from a group could be seen as unnecessary and unethical. The report cites the example of the parachute, which has never undergone an RCT because it would be absurd to allow one group to jump out of a plane without one.

In reality, many social interventions are not as obvious as parachutes and occur in more complex environments where it is “impossible to infer causation through simple intuition, observation, or even before/after comparisons”.

Myth 3: RCTs are limited to narrow contexts or questions 

Debunked: RCTs are sometimes viewed as excellent methods of proving only very limited facts: that a particular program, in a certain environment, for certain people, at a certain time, made a difference.

However, it is possible to replicate RCTs in wider and more “diverse settings to determine whether and how the findings generalise.” Additionally, it is possible to undertake RCTs at multiple sites simultaneously, with appropriate consideration of other factors.

Myth 4: RCTs are a black box

Debunked: Some argue that RCTs are a ‘black box’ because all they can tell you is the effectiveness of a policy or treatment, that is, how well it works, but not why. This is true of any form of quantitative research.

However, RCTs can be designed to measure different variables and determine which mechanism is responsible for the outcome. The report argues “RCTs are superior to other methods of evaluation because they allow researchers to look at distinct mechanisms.”

Myth 5: RCTs can still be biased 

Debunked: Qualitative and observational studies are much more vulnerable to manipulation at the hands of the researcher, whereas an RCT is evaluated simply by comparing the average outcome across the randomly assigned treatment and control groups.
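To make that point concrete, here is a minimal, purely illustrative sketch in Python, using made-up numbers and a hypothetical programme effect, of how an RCT estimate reduces to a comparison of average outcomes between randomly assigned groups:

```python
import random

random.seed(42)  # fixed seed so this illustrative example is reproducible

# Hypothetical participants in a programme evaluation
participants = list(range(1000))

# Random assignment: each participant has an equal chance of treatment or control
treatment, control = [], []
for person in participants:
    (treatment if random.random() < 0.5 else control).append(person)

# Made-up outcomes: assume the programme adds roughly 2 points on average
def outcome(person, treated):
    baseline = random.gauss(50, 10)          # underlying variation across people
    return baseline + (2 if treated else 0)  # hypothetical programme effect

treated_outcomes = [outcome(p, True) for p in treatment]
control_outcomes = [outcome(p, False) for p in control]

# The RCT estimate is simply the difference in average outcomes between the groups
effect = (sum(treated_outcomes) / len(treated_outcomes)
          - sum(control_outcomes) / len(control_outcomes))
print(f"Estimated programme effect: {effect:.2f} points")
```

Because assignment is random, the two groups are comparable on average, so the difference in means reflects the programme itself rather than who happened to receive it, leaving little room for researcher discretion.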

Join a room of like-minded policy practitioners to discuss why RCTs are the gold standard for accurate assessment of policy impact and the key steps involved in designing and implementing an RCT. 

Criterion’s Applying Randomised Controlled Trials for Program & Policy Evaluation is coming to the QT Canberra on August 27 & 28, 2019. 

The two day masterclass is facilitated by Professor Michael J Hiscox from Harvard University’s Department of Government. Professor Hiscox was the founding Director of the Behavioural Economics Team in the Department of the Prime Minister and Cabinet between 2015 and 2017. He continues to serve as an adviser to BETA. 

This is your opportunity to rethink your current evaluation methodology and implement practices that deliver meaningful, actionable results, guided by an unparalleled leader in the field of RCTs.

Submitted by Criterion Content Team


This post has been written by the Criterion Conferences Content Team. Based in Sydney, we are an independent research organisation, producing over 90 conferences a year across a variety of industries. Our events, attended by thousands of senior delegates from the public and private sector, are designed to enrich, inspire and motivate. Our focus is on providing innovative, value-adding content via our conferences, and blogs like this are an extension of that principle. You can view our conferences by visiting our website http://www.criterionconferences.com/conferences.
