
E-156 Brainomix eASPECTS software improves interobserver agreement and accuracy of neurologists and neuroradiologists in interpretation of ASPECTS score and outperforms human readers in prediction of final infarct
  1. W Brinjikji1,
  2. J Benson1,
  3. N Campeau1,
  4. C Carr1,
  5. P Cogswell1,
  6. J Klaas2,
  7. G Liebo1,
  8. J Little1,
  9. P Luetmer1,
  10. S Messina1,
  11. A Nagelschneider1,
  12. K Schwartz1,
  13. C Wood1,
  14. D Nasr1,
  15. S Braksick1,
  16. D Kallmes1
  1. Radiology, Mayo Clinic, Rochester, MN
  2. Neurology, Mayo Clinic, Rochester, MN


Introduction There has been increasing interest in the use of artificial intelligence-based software packages for the evaluation of neuroimaging in patients with acute ischemic stroke. We performed an inter-rater agreement and accuracy study to determine whether the Brainomix eASPECTS software improved interobserver agreement and accuracy in detecting affected ASPECTS regions in anterior circulation large vessel occlusion (LVO).

Methods We included 60 consecutive patients with anterior circulation LVO who had TICI 3 revascularization within 60 minutes of their baseline CT. A total of 16 readers participated: 6 senior neuroradiologists, 6 junior neuroradiologists, and 4 vascular neurologists. Readers interpreted the CT scans on an independent workstation, assigned a total ASPECTS score, and recorded whether each individual ASPECTS region was affected. Two months later, the readers re-evaluated the CT scans, this time with the assistance of the eASPECTS software. We assessed the intraclass correlation coefficient (ICC) for total ASPECTS, and interobserver agreement with Fleiss' kappa for each ASPECTS region, with and without assistance of the eASPECTS software. We also assessed reader accuracy with and without eASPECTS assistance; ground truth was the 24-hour CT, interpreted by two neuroradiologists, in this cohort of patients with prompt and complete revascularization.
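The per-region agreement statistic used above, Fleiss' kappa, compares observed agreement among multiple raters against agreement expected by chance. A minimal sketch of the computation is below; the rating counts are purely illustrative and are not the study's data.

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for multiple raters.

    counts: array of shape (n_subjects, n_categories); each entry is the
    number of raters who placed that subject in that category. Every row
    must sum to the same total number of raters.
    """
    counts = np.asarray(counts, dtype=float)
    n_subjects = counts.shape[0]
    n_raters = counts[0].sum()
    # Proportion of all assignments falling in each category
    p_j = counts.sum(axis=0) / (n_subjects * n_raters)
    # Per-subject agreement: fraction of rater pairs that agree
    P_i = ((counts ** 2).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    P_bar = P_i.mean()          # mean observed agreement
    P_e = (p_j ** 2).sum()      # agreement expected by chance
    return (P_bar - P_e) / (1 - P_e)

# Illustrative (hypothetical) data: 4 scans, each rated by 16 readers
# as "affected" vs. "unaffected" for a single ASPECTS region.
ratings = [
    [16, 0],   # all readers call the region affected
    [0, 16],   # all call it unaffected
    [12, 4],
    [5, 11],
]
print(round(fleiss_kappa(ratings), 3))
```

Kappa is 1.0 under perfect agreement and 0 at chance level, which is why improvements such as 0.38 to 0.67 in the results below indicate substantially better consistency among readers.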

Results The intraclass correlation coefficient for total ASPECTS was 0.395 (fair agreement) without eASPECTS assistance versus 0.574 (good agreement) with eASPECTS assistance (P<0.01). Inter-rater agreement improved significantly with eASPECTS assistance for each individual region with the exception of the M6 and caudate; for example, kappa improved from 0.60 to 0.83 for the M1, from 0.38 to 0.67 for the M2, and from 0.35 to 0.57 for the insula. Overall reader accuracy improved with eASPECTS for every region with the exception of the caudate; for example, accuracy improved from 80.7% to 87.1% for the M1, from 69.0% to 83.6% for the M2, and from 85.1% to 95.0% for the internal capsule. The eASPECTS software alone had higher accuracy than the overall cohort of readers (with and without eASPECTS assistance) for every region except the caudate.

Abstract E-156 Table 1

Conclusions The use of Brainomix eASPECTS software resulted in significant improvements in the inter-rater agreement and accuracy of ASPECTS evaluation in a large group of neuroradiologists and neurologists. Interestingly, the eASPECTS software alone was more predictive of final infarct/ASPECTS than the overall group of readers interpreting the CT scans, with or without eASPECTS assistance.

Disclosures W. Brinjikji: None. J. Benson: None. N. Campeau: None. C. Carr: None. P. Cogswell: None. J. Klaas: None. G. Liebo: None. J. Little: None. P. Luetmer: None. S. Messina: None. A. Nagelschneider: None. K. Schwartz: None. C. Wood: None. D. Nasr: None. S. Braksick: None. D. Kallmes: None.
