Acute Sport Concussion Assessment Optimization: A Prospective Assessment from the CARE Consortium
Broglio S, Harezlak J, Katz B, Zhao S, McAllister T, McCrea M, CARE Consortium Investigators. Sports Medicine. doi: 10.1007/s40279-019-01155-0.
https://link.springer.com/article/10.1007/s40279-019-01155-0
Take-Home Message
A sideline concussion assessment that includes a symptom inventory, the Standardized Assessment of Concussion (SAC), and the Balance Error Scoring System (BESS, on a firm surface) offers the best diagnostic accuracy. Annual baseline assessments provide only slightly better diagnostic accuracy than post-injury assessments alone.
Summary
Sports medicine recommendations advocate for concussion evaluations that include at least symptom, balance, and neurocognitive assessments. However, it is unclear if there is an optimal set of concussion assessments or if we should compare these results to pre-season baseline evaluations collected each year. Therefore, the authors aimed to 1) examine the specificity and sensitivity of common concussion assessment tools individually and combined within 72 hours after a concussion, and 2) establish the optimal frequency and utility of baseline assessments among collegiate athletes. They used data from the Concussion Assessment, Research, and Education (CARE) Consortium, which consisted of 1,458 athletes suffering 1,640 concussions across 29 institutions. All athletes completed annual baseline assessments and similar assessments up to three times within 72 hours after an injury. Post-injury assessments were categorized as sideline (0 to 1.25 hours), post-event (1.26 to 24 hours), or clinic (25 to 72 hours). The authors examined post-injury assessments in three ways: 1) no baseline comparison, 2) same-season (i.e., annual) baseline comparison, and 3) previous season baseline comparison (i.e., bi-annual; only 770 concussions).
The authors found that the strongest diagnostic accuracy during sideline testing came from the SCAT symptom inventory, the SAC, and the BESS (firm surface), individually or combined, regardless of which baseline comparison they used. At the post-event and clinic evaluations, the symptom checklists performed best among the traditional concussion assessments; beyond the first day after injury, symptoms alone may be the most accurate measure. The Vestibular Ocular Motor Screening (VOMS), which was only conducted at 9 institutions, had the strongest sensitivity among all the tools. Computerized neurocognitive testing via ImPACT was among the weakest assessment tools. Diagnostic accuracy improved only slightly when post-injury assessments were compared with the athlete's annual, pre-injury baseline.
Viewpoints
These findings are similar to a previous study from the CARE Consortium, indicating that the symptom inventory is the strongest sideline assessment tool. However, relying only on a symptom checklist can be risky because it lets patients choose whether to disclose their symptoms. The authors’ findings complement concussion guidelines that recommend using multiple assessments (e.g., symptoms, balance) to evaluate a concussion. The VOMS is a promising tool, but it previously lacked diagnostic accuracy studies to evaluate its utility. The authors’ results indicate that the VOMS has some of the strongest diagnostic accuracy among assessment tools to date and may become a new consensus-recommended tool in the toolbox. However, we should be cautious because the VOMS was only tested on a subset of the participants, and it is unclear how it would perform among a larger group of athletes at numerous institutions. The finding of relatively weak computerized neurocognitive diagnostic accuracy may seem surprising but is consistent with previous research indicating that the sensitivity and specificity of ImPACT are both 52% (2% better than flipping a coin). Lastly, these findings suggest that conducting annual baseline assessments may be ideal; however, the added value equates to only 7% greater specificity and 17% greater sensitivity. Clinicians should critically evaluate the time and resources required for baseline assessments to determine whether the total cost of baseline testing is worth the modest gain.
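For readers less familiar with diagnostic accuracy statistics, a brief note on what those 52% figures mean (these are the standard definitions, not calculations specific to this study): sensitivity is the proportion of truly concussed athletes a test correctly flags, and specificity is the proportion of non-concussed athletes it correctly clears.

$$
\text{Sensitivity} = \frac{TP}{TP + FN}, \qquad \text{Specificity} = \frac{TN}{TN + FP}
$$

where TP, FN, TN, and FP are true positives, false negatives, true negatives, and false positives. With both values near 52%, ImPACT correctly classifies only about half of injured and half of uninjured athletes, which is barely better than chance.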
Questions for Discussion
Is VOMS currently an assessment you are familiar with? What are some barriers to implementing VOMS in your clinical practice? In your opinion, do you think these findings among collegiate athletes are translatable to younger athletes?
Written by: Landon B. Lempke, MEd, LAT, ATC
Reviewed by: Jeffrey Driban
Related Posts
The Diagnostic Value of Concussion Assessment Tools and its Individual Components
The Effectiveness of Computerized Neurocognitive Testing
Concussion Assessment: How the Setting Affects the Tools Used by Athletic Trainers
Should Athletic Trainers Add Anxiety Surveys to Preseason Baseline Testing?
Hi,
From my experiences as a student-athlete and an athletic training student, I agree with the statement that only using a symptom checklist isn’t a very reliable test for concussions, because it is up to the discretion of the athlete. Athletes want to be able to play and will lie about how they are feeling in order to do so. Also, I have never heard of VOMS as an assessment for baseline concussion testing. I’ve seen a combination of the ImPACT testing, SCAT 5, and BESS test used as a baseline for concussion testing. I think that one of the barriers to implementing VOMS in clinical practice is the fact that it is unclear how VOMS would perform with a large group of athletes. This would pose a problem at colleges and universities that have to put hundreds of athletes through concussion baseline testing every year. I believe that these findings are transferable to younger athletes. However, it is difficult to say because younger athletes typically don’t go through any type of baseline concussion testing. The first time I ever went through baseline concussion testing was when I played at a university. I believe that baseline concussion testing should be implemented yearly for younger athletes, especially those that play contact sports.
Tallie,
Thanks for providing your clinical insight on the topic. I believe you’re right that VOMS shows promise, but we likely need a larger sample before clinical implementation. VOMS was completed at baseline among approximately 300 athletes in this study, compared with approximately 1,500 for the recommended assessments (SCAT, BESS, computerized neurocognitive testing). However, one caveat is that the article found baseline testing added minimal improvement to diagnosing concussion and suggests baselines might not be worth the time, money, and coordination. Ultimately, it depends on each setting and the resources available. This proves challenging in younger athletes, such as high-school or middle-school populations, where resources are often limited.
I found this study to be quite interesting because it tied together concussion practices that I am very familiar with and a concussion tool that I had never heard of before in one article. As an AT student, I have seen the ImPACT test at baseline and post-concussion, the BESS, and a symptom sheet at the time of the injury and in the days following. We are taught throughout school that the SCAT 5 is the gold standard in evaluation of a concussion. I have now read up on VOMS and believe that it is something we can look into adding in the future for the population I am working with. Because the study linked to “VOMS assessment” in this article was done on 14- and 15-year-olds and was deemed a valid method to assess young patients with concussions, this may be something to research further. Some barriers that I can see with the VOMS method are the limited variety of subjects studied, and the ATs that I work alongside may be reluctant to change because what they have been doing has been working for concussion assessment and return to play. One problem I have with the VOMS method is that within the 5 domains it states that peripheral vestibular deficits do not affect saccades, which seems to contradict what this article is stating. Overall, I will start the conversation about VOMS with my colleagues.
Olivia,
Thanks for providing your insight on the topic and facilitating discussion with your colleagues! I agree that VOMS is a promising tool and would not be surprised if it became a consensus-recommended assessment tool as research continues. Though not linked in the blog post above, VOMS has been examined across youth, high-school, and collegiate athletes with overall reliable findings for all. As you mentioned, though, the challenge with new assessments is their adoption into clinical practice. I’m hopeful that the spotlight placed on concussion assessment and management in recent years will accelerate evidence-based practice.
I am an AT student at LIU and have begun applying some of the coursework regarding concussion protocols at my clinical sites. The tests used in this research are also what we are learning in class as tools in our toolbox for post-injury concussion testing (including ImPACT, despite its low grade). Recently, for baseline testing, I was exposed to SWAY at my clinical site, an app used instead of the BESS as another acute measure of balance after a concussion. I am curious as to where SWAY would fall here.
VOMS was not a measure I was familiar with until this article and its hyperlinks; however, I have been exposed to cranial nerve testing, which includes some of the movements used in VOMS.
I don’t see many barriers with VOMS because it is quick and easy. The one barrier that comes to mind, though, is requiring the athlete to disclose their symptoms to me honestly. I am also not sure what preceding factors (like an illness or tiredness) VOMS takes into account.
In my opinion, these findings will be translatable to youth athletes.
Dan,
Thanks for sharing your experiences on the post. Concussion assessments are a rapidly evolving market, with new tools developed, marketed, and employed every year. SWAY balance has been around for a few years, and I believe it is another great “tool in the toolbox.” My only concern with SWAY, and with every new tool, is the unknown diagnostic accuracy of the assessment. As a practicing clinician and researcher, I am always hesitant to use any product that requires additional time or monetary resources without knowing the value added by that tool. In my opinion, more research is needed on SWAY before further clinical implementation is warranted. I’m glad VOMS was a new assessment for you to be exposed to. To my understanding, VOMS is fairly robust to confounding factors, but as you alluded to, symptom disclosure is still at the root of the problem.
We have established an outreach AT program for youth hockey and for the past 5 years we have conducted baseline testing using SCAT 5 and VOMS on our youth athletes; our sideline assessment uses both of these tools as well (modified VOMS) and we have found both to be extremely valuable diagnostic tools in our toolbox.
Valerie,
It’s great to hear that a SCAT5 and VOMS combination is working well in your clinical practice. Though both tools indicate strong diagnostic accuracy, I would caution against relying on only these two measures, as both depend on subjective symptom reporting from the patient. Implementing a motor task assessment (e.g., BESS) and a more thorough cognitive assessment than the SCAT5 (e.g., computerized neurocognitive testing) can ensure an evidence-based and multi-modal concussion assessment.