(2025) Evaluating the Performance of Agreement Metrics in a Delphi Study on Chemical, Biological, Radiological and Nuclear Major Incidents Preparedness Using Classical and Machine Learning Approaches. Journal of Contingencies and Crisis Management. p. 20. ISSN 0966-0879
Full text not available from this repository.
Abstract
Delphi studies in disaster medicine lack consensus on expert agreement metrics. This study examined various metrics using a Delphi study on chemical, biological, radiological, and nuclear (CBRN) preparedness in the Middle East and North Africa (MENA) region. Forty international disaster medicine experts evaluated 133 items across ten CBRN Preparedness Assessment Tool (PAT) themes using a 5-point Likert scale. Agreement was measured using Kendall's W, the Intraclass Correlation Coefficient, and Cohen's Kappa. Statistical and machine learning techniques were used to compare metric performance. The overall mean agreement score was 4.91 ± 0.71, with an average agreement of 89.21%. Kappa emerged as the most sensitive metric in both the statistical and machine learning analyses, with a feature importance score of 168.32. The Kappa coefficient showed variations across CBRN PAT themes, including medical protocols, logistics, and infrastructure. The integrated statistical and machine learning approach provides a promising method for understanding expert consensus in disaster preparedness, with potential for future refinement by incorporating additional contextual factors.
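The full text and analysis code are not available here, but the three agreement metrics named in the abstract can be illustrated on an items × raters matrix of Likert ratings. The sketch below is not the authors' method: the libraries (NumPy, SciPy, scikit-learn), the random toy data, the use of averaged pairwise Cohen's Kappa, and the ICC(2,1) variant are all assumptions made for illustration only.

```python
# Minimal sketch (assumptions, not the study's actual code) of computing
# Kendall's W, an ICC, and Cohen's Kappa on an items x raters rating matrix.
import numpy as np
from scipy.stats import rankdata
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)
# Hypothetical data: 133 items rated by 40 experts on a 1-5 Likert scale.
ratings = rng.integers(1, 6, size=(133, 40))
n_items, n_raters = ratings.shape

# Kendall's W: concordance of the item rankings produced by each rater
# (tie correction omitted for brevity).
ranks = np.apply_along_axis(rankdata, 0, ratings)   # rank items within each rater
rank_sums = ranks.sum(axis=1)                        # total rank per item
s = ((rank_sums - rank_sums.mean()) ** 2).sum()
kendall_w = 12 * s / (n_raters ** 2 * (n_items ** 3 - n_items))

# Cohen's Kappa: chance-corrected agreement for each pair of raters,
# averaged over all pairs (one common multi-rater adaptation).
pair_kappas = [
    cohen_kappa_score(ratings[:, i], ratings[:, j])
    for i in range(n_raters) for j in range(i + 1, n_raters)
]
mean_kappa = float(np.mean(pair_kappas))

# ICC(2,1): two-way random effects, absolute agreement, single rater,
# computed from the standard ANOVA mean squares (Shrout & Fleiss).
grand_mean = ratings.mean()
ms_rows = n_raters * ((ratings.mean(axis=1) - grand_mean) ** 2).sum() / (n_items - 1)
ms_cols = n_items * ((ratings.mean(axis=0) - grand_mean) ** 2).sum() / (n_raters - 1)
sse = ((ratings - ratings.mean(axis=1, keepdims=True)
        - ratings.mean(axis=0, keepdims=True) + grand_mean) ** 2).sum()
ms_err = sse / ((n_items - 1) * (n_raters - 1))
icc_2_1 = (ms_rows - ms_err) / (
    ms_rows + (n_raters - 1) * ms_err + n_raters * (ms_cols - ms_err) / n_items
)

print(f"Kendall's W: {kendall_w:.3f}")
print(f"Mean pairwise Cohen's Kappa: {mean_kappa:.3f}")
print(f"ICC(2,1): {icc_2_1:.3f}")
```

The abstract does not specify which Kappa variant or ICC model was used, nor how the feature importance score of 168.32 was derived, so those choices above should be read as placeholders rather than a reproduction of the study's analysis.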
| Field | Value |
|---|---|
| Item Type | Article |
| Keywords | agreement analysis; Delphi study; disaster medicine; expert's opinion; MENA; reporting guidelines; Business & Economics |
| Page Range | p. 20 |
| Journal or Publication Title | Journal of Contingencies and Crisis Management |
| Journal Index | ISI |
| Volume | 33 |
| Number | 2 |
| Identification Number | https://doi.org/10.1111/1468-5973.70044 |
| ISSN | 0966-0879 |
| Depositing User | Ms Nahid Ziaei |
| URI | http://eprints.mui.ac.ir/id/eprint/31342 |