Authors: Ehren Biglari, Marie Feng, John Quarles, Edward Sako, John Calhoon
DOI: 10.1007/978-3-319-20684-4_26
Keywords:
Abstract: In this paper, we present a haptics-enabled surgical training system integrated with deep learning to characterize how experienced surgeons perform particular procedures, so that medical residents-in-training can be guided by quantifiable patterns. The virtual reality prototype is built around specific steps of open-heart surgery and a biopsy operation. Two abstract scenarios are designed to emulate incision procedures. Using a deep learning algorithm (an autoencoder), the two scenarios were trained and characterized. Results show that a vector of 30 real-valued components can quantify both scenarios. These values can be used to compare how a resident-in-training performs differently from an experienced surgeon, so that corrective guidance can be provided.
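The abstract does not specify the network architecture or the input representation, so the following is only a minimal sketch of the general idea: an autoencoder whose 30-dimensional bottleneck stands in for the 30 real-valued components mentioned above, trained on flattened haptic/kinematic feature vectors and used to compare an expert trial with a resident trial. The input dimensionality, layer sizes, framework (PyTorch), and the distance-based comparison are all illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: autoencoder with a 30-dimensional latent code used to
# compare expert and resident trials. Sizes and distance metric are assumed.
import torch
import torch.nn as nn

INPUT_DIM = 256   # assumed length of a flattened haptic/kinematic feature vector
LATENT_DIM = 30   # matches the 30-component characterization in the abstract


class Autoencoder(nn.Module):
    def __init__(self, input_dim=INPUT_DIM, latent_dim=LATENT_DIM):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, input_dim),
        )

    def forward(self, x):
        z = self.encoder(x)          # 30-dimensional characterization
        return self.decoder(z), z


def train(model, data, epochs=50, lr=1e-3):
    """Fit the autoencoder to demonstration data of shape (N, INPUT_DIM)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        recon, _ = model(data)
        loss = loss_fn(recon, data)  # reconstruction error
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model


def latent_deviation(model, expert_trial, resident_trial):
    """Euclidean distance between latent codes as a simple skill-gap proxy."""
    with torch.no_grad():
        _, z_expert = model(expert_trial)
        _, z_resident = model(resident_trial)
    return torch.norm(z_expert - z_resident, dim=-1)
```

In such a setup, a large latent-space distance between a resident's trial and the expert reference would flag the components along which the resident deviates most, which is one plausible way to derive the corrective guidance the abstract refers to.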