Medical instrument detection with synthetically generated data

Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer-reviewed

Authors

Wiese, Leon Vincent; Hinz, Lennart; Reithmeier, Eduard
Details

Original language: English
Title of host publication: Medical Imaging 2024: Imaging Informatics for Healthcare, Research, and Applications
Editors: Hiroyuki Yoshida, Shandong Wu
Place of publication: San Diego
Publisher: SPIE
Number of pages: 9
Volume: 12931
ISBN (electronic): 9781510671676
ISBN (print): 9781510671669
Publication status: Published - 2 Apr 2024

Abstract

The persistent shortage of qualified personnel in operating theatres exacerbates the remaining staff's workload, and this increased burden can result in substantial complications during surgical procedures. To address this issue, this research project is developing a comprehensive operating theatre system that offers real-time monitoring of all surgical instruments in the operating theatre. The foundation of this endeavor is a neural network trained to classify and identify eight distinct instruments from four surgical instrument groups. A novel aspect of this study lies in the approach taken to select and generate the training and validation data sets: they consist of synthetically generated image data rather than real image data. Three virtual scenes were designed to serve as backgrounds for a generation algorithm, which randomly positions the instruments within these scenes and produces annotated rendered RGB images of the resulting arrangements. To assess the efficacy of this approach, a separate real data set was created for testing the neural network. Notably, neural networks trained solely on synthetic data performed well when applied to real data. This paper shows that it is possible to train neural networks with purely synthetically generated data and use them to recognize surgical instruments in real images.
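The core of the generation algorithm described above, randomly placing instruments in a scene and emitting annotations alongside each rendered image, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the instrument names, scene dimensions, box-size ranges, and the YOLO-style annotation format (`class_id cx cy w h`, normalized) are all assumptions made for the example.

```python
import random

# Hypothetical instrument classes; the paper uses eight instruments
# from four groups, but the actual names are not given here.
INSTRUMENTS = ["scalpel", "forceps", "clamp", "scissors"]

def place_instruments(n, rng):
    """Randomly place n instrument bounding boxes in a unit scene and
    return YOLO-style annotation lines: 'class_id cx cy w h' (normalized)."""
    lines = []
    for _ in range(n):
        cls = rng.randrange(len(INSTRUMENTS))
        # Assumed box-size range relative to the scene (5%-20% per side).
        w, h = rng.uniform(0.05, 0.2), rng.uniform(0.05, 0.2)
        # Keep the whole box inside the scene bounds.
        cx = rng.uniform(w / 2, 1 - w / 2)
        cy = rng.uniform(h / 2, 1 - h / 2)
        lines.append(f"{cls} {cx:.4f} {cy:.4f} {w:.4f} {h:.4f}")
    return lines

rng = random.Random(42)  # seeded for reproducibility
annotations = place_instruments(5, rng)
for line in annotations:
    print(line)
```

In the actual pipeline, each set of annotations would accompany a rendered RGB image of the corresponding virtual scene; the format above is what YOLOv8-style detectors typically consume during training.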

Cite this

Medical instrument detection with synthetically generated data. / Wiese, Leon Vincent; Hinz, Lennart; Reithmeier, Eduard.
Medical Imaging 2024: Imaging Informatics for Healthcare, Research, and Applications. Ed. / Hiroyuki Yoshida; Shandong Wu. Vol. 12931. San Diego: SPIE, 2024.


Wiese, LV, Hinz, L & Reithmeier, E 2024, Medical instrument detection with synthetically generated data. in H Yoshida & S Wu (eds), Medical Imaging 2024: Imaging Informatics for Healthcare, Research, and Applications. Vol. 12931, SPIE, San Diego. https://doi.org/10.1117/12.3005798
Wiese, L. V., Hinz, L., & Reithmeier, E. (2024). Medical instrument detection with synthetically generated data. In H. Yoshida, & S. Wu (Eds.), Medical Imaging 2024: Imaging Informatics for Healthcare, Research, and Applications (Vol. 12931). SPIE. https://doi.org/10.1117/12.3005798
Wiese LV, Hinz L, Reithmeier E. Medical instrument detection with synthetically generated data. In: Yoshida H, Wu S, editors. Medical Imaging 2024: Imaging Informatics for Healthcare, Research, and Applications. Vol. 12931. San Diego: SPIE; 2024. doi: 10.1117/12.3005798
Wiese, Leon Vincent; Hinz, Lennart; Reithmeier, Eduard. / Medical instrument detection with synthetically generated data. Medical Imaging 2024: Imaging Informatics for Healthcare, Research, and Applications. Ed. / Hiroyuki Yoshida; Shandong Wu. Vol. 12931. San Diego: SPIE, 2024.
BibTeX
@inproceedings{1e80f97a03414c2794ea4f01065a2d30,
title = "Medical instrument detection with synthetically generated data",
abstract = "The persistent need for more qualified personnel in operating theatres exacerbates the remaining staff{\textquoteright}s workload. This increased burden can result in substantial complications during surgical procedures. To address this issue, this research project works on a comprehensive operating theatre system. The system offers real-time monitoring of all surgical instruments in the operating theatre, aiming to alleviate the problem. The foundation of this endeavor involves a neural network trained to classify and identify eight distinct instruments belonging to four distinct surgical instrument groups. A novel aspect of this study lies in the approach taken to select and generate the training and validation data sets. The data sets used in this study consist of synthetically generated image data rather than real image data. Additionally, three virtual scenes were designed to serve as the background for a generation algorithm. This algorithm randomly positions the instruments within these scenes, producing annotated rendered RGB images of the generated scenes. To assess the efficacy of this approach, a separate real data set was also created for testing the neural network. Surprisingly, it was discovered that neural networks trained solely on synthetic data performed well when applied to real data. This research paper shows that it is possible to train neural networks with purely synthetically generated data and use them to recognize surgical instruments in real images.",
keywords = "Yolov8, surgical instrument detection, synthetic data, object detection, Multi-label classification, deep learning",
author = "Wiese, {Leon Vincent} and Lennart Hinz and Eduard Reithmeier",
year = "2024",
month = apr,
day = "2",
doi = "10.1117/12.3005798",
language = "English",
isbn = "9781510671669",
volume = "12931",
editor = "Hiroyuki Yoshida and Shandong Wu",
booktitle = "Medical Imaging 2024: Imaging Informatics for Healthcare, Research, and Applications",
publisher = "SPIE",
address = "United States",

}

RIS

TY - GEN

T1 - Medical instrument detection with synthetically generated data

AU - Wiese, Leon Vincent

AU - Hinz, Lennart

AU - Reithmeier, Eduard

PY - 2024/4/2

Y1 - 2024/4/2

N2 - The persistent need for more qualified personnel in operating theatres exacerbates the remaining staff’s workload. This increased burden can result in substantial complications during surgical procedures. To address this issue, this research project works on a comprehensive operating theatre system. The system offers real-time monitoring of all surgical instruments in the operating theatre, aiming to alleviate the problem. The foundation of this endeavor involves a neural network trained to classify and identify eight distinct instruments belonging to four distinct surgical instrument groups. A novel aspect of this study lies in the approach taken to select and generate the training and validation data sets. The data sets used in this study consist of synthetically generated image data rather than real image data. Additionally, three virtual scenes were designed to serve as the background for a generation algorithm. This algorithm randomly positions the instruments within these scenes, producing annotated rendered RGB images of the generated scenes. To assess the efficacy of this approach, a separate real data set was also created for testing the neural network. Surprisingly, it was discovered that neural networks trained solely on synthetic data performed well when applied to real data. This research paper shows that it is possible to train neural networks with purely synthetically generated data and use them to recognize surgical instruments in real images.

AB - The persistent need for more qualified personnel in operating theatres exacerbates the remaining staff’s workload. This increased burden can result in substantial complications during surgical procedures. To address this issue, this research project works on a comprehensive operating theatre system. The system offers real-time monitoring of all surgical instruments in the operating theatre, aiming to alleviate the problem. The foundation of this endeavor involves a neural network trained to classify and identify eight distinct instruments belonging to four distinct surgical instrument groups. A novel aspect of this study lies in the approach taken to select and generate the training and validation data sets. The data sets used in this study consist of synthetically generated image data rather than real image data. Additionally, three virtual scenes were designed to serve as the background for a generation algorithm. This algorithm randomly positions the instruments within these scenes, producing annotated rendered RGB images of the generated scenes. To assess the efficacy of this approach, a separate real data set was also created for testing the neural network. Surprisingly, it was discovered that neural networks trained solely on synthetic data performed well when applied to real data. This research paper shows that it is possible to train neural networks with purely synthetically generated data and use them to recognize surgical instruments in real images.

KW - Yolov8

KW - surgical instrument detection

KW - synthetic data

KW - object detection

KW - Multi-label classification

KW - deep learning

U2 - 10.1117/12.3005798

DO - 10.1117/12.3005798

M3 - Conference contribution

SN - 9781510671669

VL - 12931

BT - Medical Imaging 2024: Imaging Informatics for Healthcare, Research, and Applications

A2 - Yoshida, Hiroyuki

A2 - Wu, Shandong

PB - SPIE

CY - San Diego

ER -
