A survey from the Palmer College of Chiropractic reveals some interesting results about Chiropractic students' ability to understand the basic research concepts that provide the foundation of an Evidence-Based practice. The study invited fourteen institutions (two of which could not provide exact student numbers) from Australia, Canada, the US, Denmark and New Zealand, and over 650 students responded.
The survey noted that while the vast majority of respondents reported having access to medical/healthcare literature through the internet, only 11% read the literature every week; even more striking is that 21% did not read the literature at all.
When it came to how research evidence is used, 13% said that research evidence had LITTLE impact on chiropractic care. Approximately the same number of respondents felt that Evidence-Based practice was a “temporary fad”.
As for their educators, a fifth felt that their institution did not incorporate research evidence into chiropractic education well. This was reflected in the knowledge-question data, which demonstrated:
- About 60% reported confidence in assessing study design, but 48% failed to evaluate when critical analysis of information is needed.
- More than 53% believed that randomization in clinical trials was for the purpose of selecting a representative sample of patients for study, rather than obtaining treatment groups with comparable baseline prognoses.
- Only 27% were able to identify when to use a case-control study.
It should be noted that almost ONE-THIRD did not even bother to answer ANY of them. There were only five questions.
Despite students having quite a positive attitude towards EBP, 71% of participants felt they needed more training in EBP, and as part of the conclusion the researchers acknowledged that “based on the knowledge questions they may need further training about basic research concepts.”
The Cult of Chiropractic?
It was noted that in the US the promotion of unsupported beliefs and theories may be due to the deficient nature of the curricula, which emphasize “chiropractic philosophy” over research evidence. This has certainly been apparent to me, as I’ve had a number of Chiropractors (and those in training) go on about being “persecuted for their beliefs”, as if “belief” constituted a justification for performing non-evidence-based practice on people who are incorrectly told it is evidence-based.
In the past this has been coupled with the notion that they can be likened to Galileo Galilei, the physicist/mathematician/philosopher/astronomer who was persecuted by the Catholic Church for advocating the hypothesis that the Earth revolved around the Sun. This is hardly the same, given that Galileo was able to produce evidence to support his assertions.
News? No, not really.
To those who have been critical of Chiropractic care, these results are not surprising. Medical professionals and Skeptics (and I mean capital “S” skeptics) alike have been pointing to evidence that this was the case for years, with some being taken to court over their criticisms (e.g. Simon Singh) in an effort to silence them.
The best that can come from this is for Chiropractic practitioners to seek out that training, and I don’t mean in some bullshit context about how it “should apply” to the Chiropractic way; that’s pandering to pre-scientific chiropractic philosophy.
I mean proper lessons in Evidence-Based practice, the standard by which all other therapies are judged.
For those thinking about entering Chiropractic, it will do a world of good; nonsense concepts can be “bred” out of the stubborn old guard, opening the doors to greater scrutiny than is currently being applied. Those already practising Chiropractic may be open to this, though it may be hard for hardliners like Australia’s own anti-vaccination advocate Warren Sipser (You’re welcome for the SEO) to understand basic research concepts, such as... oh, I don’t know... the Dunning-Kruger Effect.
The Dunning-What effect?
It’s a psychological phenomenon – a cognitive bias. Essentially, when the unskilled make mistakes, their incompetence prevents them from recognising those mistakes.
How is this relevant?
The study discusses the difference between the results of a similar survey of allied healthcare professionals and the results generated by this survey:
“our chiropractic student respondents had greater confidence in their ability to assess research study design, generalisability, evaluate bias, sample size and statistical tests. However, student responses to very basic critical appraisal concepts revealed low levels of knowledge that did not match confidence levels.”
The study speaks volumes, really.
Props to whoever made the memes. Quite helpful.