4. Real-time Perceptually-enabled Task Guidance in the Extremes
DARPA Project
This is a DARPA-funded project on the use of augmented reality for providing knowledge-task guidance in extreme situations. The research challenge is to create a data-to-knowledge "pipeline" from wearable devices to an intelligent agent that can determine, in real time, what information or guidance to provide to human operators.
4.1 Project overview
This project investigates augmented reality (AR) technologies for providing real-time knowledge-task guidance in extreme environments, including military helicopter pilot checklist completion and Army field-medic care for soldier amputee cases. The research applies formal task-modeling approaches, including object-action identification with contingency states. User-state modeling is conducted with real-time physiological measures and machine learning methods for state classification. User models are linked to AR interfaces to trigger presentation of task-relevant guidance during high-fidelity simulator trials. Guidance is based on real-time execution of task simulation models synchronized with the experiment protocols being performed by actual human operators. Results will be used to support Army applications of AR devices.
My responsibility in this project is to apply feature-engineering and machine learning (ML) techniques to predict operator cognitive workload from real-time physiological signals collected by wearable devices.
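As an illustration of this kind of pipeline, the sketch below shows one plausible workflow: windowed physiological signals are reduced to simple per-channel statistical features, and a classifier is cross-validated to predict a binary workload state. All data here are synthetic placeholders, and the feature set and model (scikit-learn random forest) are assumptions for illustration only, not the project's actual methods.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for windowed wearable-sensor data:
# 200 windows, 4 channels, 64 samples per window.
n_windows, n_channels, n_samples = 200, 4, 64
signals = rng.normal(size=(n_windows, n_channels, n_samples))
labels = rng.integers(0, 2, size=n_windows)  # low/high workload (synthetic)

def extract_features(windows):
    """Per-channel mean, standard deviation, and linear trend per window."""
    mean = windows.mean(axis=2)
    std = windows.std(axis=2)
    t = np.arange(windows.shape[2])
    # Fit a line to each (window, channel) time series; keep the slope term.
    flat = windows.reshape(-1, windows.shape[2]).T          # (samples, series)
    slope = np.polyfit(t, flat, 1)[0].reshape(windows.shape[:2])
    return np.concatenate([mean, std, slope], axis=1)       # (windows, 3*channels)

X = extract_features(signals)
clf = make_pipeline(StandardScaler(),
                    RandomForestClassifier(n_estimators=100, random_state=0))
scores = cross_val_score(clf, X, labels, cv=5)
print(X.shape, round(scores.mean(), 3))
```

In practice the features would be signal-specific (e.g., hemodynamic measures for fNIRS), and the window length and labels would come from the experiment protocol rather than random draws.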
4.2 Related papers
[C8]. Liu, Y., Grimaldi, N., Basnet, N., Zahabi, M., and Kaber, D. (2024). Classifying cognitive workload using machine learning techniques. IEEE International Conference on Human-Machine Systems. Accepted.
[C9]. Grimaldi, N., Liu, Y., McKendrick, R., Ruiz, J., and Kaber, D. (2024). Deep learning forecast of cognitive workload using fNIRS data. IEEE International Conference on Human-Machine Systems. Accepted.