Surgical Activity Recognition Using TD-CNN-LSTM Model
- Nanthini Narayanan
- Sam Lander Capocyan
- Dr. Adam Charles
- Jayanta Dey
Abstract:
Activity recognition is one of the most essential and challenging tasks in computer vision. Developing a precise activity recognition algorithm for surgical data is particularly pertinent and beneficial, since it could contribute to the guidance of a surgical robot. This project aims to use deep learning methods to recognize surgical actions. We implemented a Time Distributed CNN-LSTM (TD-CNN-LSTM) model and trained it end-to-end on the SAR-RARP50 dataset, which consists of video segments recorded during 50 Robot-Assisted Radical Prostatectomies (RARP). Preliminary results on a subset of the data yielded accuracies above 90% for both 4-class and 8-class classification.
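The abstract names a Time Distributed CNN-LSTM architecture; the sketch below illustrates the general idea of such a model in Keras, assuming a per-frame CNN wrapped in `TimeDistributed` followed by an LSTM over the frame features. The clip length, frame size, layer sizes, and class count are illustrative assumptions, not the authors' exact configuration.

```python
# Hypothetical sketch of a Time Distributed CNN-LSTM classifier (not the paper's exact model).
# A small CNN extracts features from each frame via TimeDistributed, and an LSTM
# aggregates the per-frame features into a clip-level action prediction.
import tensorflow as tf
from tensorflow.keras import layers, models

SEQ_LEN, H, W, C = 16, 112, 112, 3   # assumed clip length and frame size
NUM_CLASSES = 8                      # e.g. the 8-class setting mentioned in the abstract

# Per-frame feature extractor
frame_cnn = models.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(H, W, C)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.GlobalAveragePooling2D(),
])

model = models.Sequential([
    layers.TimeDistributed(frame_cnn, input_shape=(SEQ_LEN, H, W, C)),  # apply CNN to every frame
    layers.LSTM(128),                                                    # temporal modeling
    layers.Dense(NUM_CLASSES, activation="softmax"),                     # action classification
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```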