Computer scientist Gabriel Zachmann of Bremen and visceral surgeon Dirk Weyhe of Oldenburg share a fascination with overcoming everyday challenges in the operating room with new, often futuristic technologies. Their goal is to ease the workload of operating room staff, reduce personnel costs, and improve patient safety. The shortage of skilled workers, particularly nurses, is never far from their minds – a shortage that is felt in the operating room as well.
According to the German Nursing Council, the umbrella organization for over 20 professional and interest groups in nursing and midwifery, there will be a shortage of up to 500,000 nursing staff by 2034. This forecast is sobering. Demographic change is hitting the healthcare sector particularly hard. On the one hand, the number of people relying on care in hospitals or nursing homes is rising. On the other hand, fewer young people are entering these professions.
This outlook motivates Dirk Weyhe, a professor of visceral surgery at the University of Oldenburg and the director of the University Clinic for Visceral Surgery at Pius Hospital Oldenburg. He is researching ways to optimize operating room (OR) processes using high-tech solutions that are less taxing on the remaining medical and nursing staff while being safer for patients. Together with Prof. Dr. Gabriel Zachmann, a professor of computer graphics and virtual reality, as well as the director of the Center for Computing Technologies (TZI) at the University of Bremen, Weyhe is exploring innovative approaches. Currently, the computer scientist and the physician are exploring ways to project surgical device displays and control panels onto a surgical drape using projectors and operate them via gesture control. This could potentially reduce staffing requirements.
In the operating room, numerous buttons need to be pressed and many devices controlled, including the lighting settings in the room, the height and angle of the operating table, and neuromonitoring devices. However, not all of these devices are located in the sterile field, which is the immediate area around the operating table and is subject to the highest standards of sterility. This is why the surgical team at the table includes “sprinters.” These assistants are not permitted to enter the sterile zone, but they operate devices located outside the zone when called upon.
Zachmann and Weyhe now want to bring this control capability to the operating table, freeing up “sprinters” to perform other nursing tasks in the hospital. Together with their industry partner, the Bavarian medical device manufacturer Dr. Mach, they are conducting research on behalf of the Federal Ministry of Research, Technology and Space into hands-free interaction in sterile surgical areas. “If the people at the table can use certain devices themselves, the probability of potential communication errors drops to nearly zero – which improves patient safety,” says Weyhe.
The interdisciplinary team relies on virtual user interfaces projected by three different projectors. The green surgical drape covering the patient’s body during surgery serves as the projection surface, extending from the neck up toward the ceiling. Thanks to its attachment to a mounting device, the drape provides an almost vertical surface, similar to a screen. “We use multiple projectors because this prevents a person standing in the beam from casting a shadow on the image,” explains Andre Mühlenbrock, a research assistant working with Zachmann.
Ensuring that the image, composed of three individual frames, shows no visible double images is just one of the problems the Bremen-based computer scientists had to solve with clever programming. The varying angle of the drape, which is stretched differently each time, and the visible folds in the fabric made achieving a clear image difficult. “We programmed software that uses sensor data to create a 3D representation of the scene. Based on this, the software calculates how the projection needs to be adjusted so that the image is sharp,” explains the doctoral student. Up to three cameras track the doctors’ gestures, which the surgeons will eventually use to operate equipment in the OR.
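The correction Mühlenbrock describes – measuring the drape’s geometry with sensors and pre-warping the projector image to match – can be illustrated in simplified form. The sketch below is not the project’s actual software; all names and coordinates are illustrative. Assuming a locally flat patch of the drape, it fits a homography from four projector-to-surface point correspondences (as a depth camera might supply) and inverts it to obtain the pre-warp:

```python
import numpy as np

def fit_homography(src, dst):
    """Fit a 3x3 homography H mapping src points to dst points (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def apply_h(H, pt):
    """Apply homography H to a 2D point and dehomogenize."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Hypothetical calibration: where four projector pixels were observed
# landing on the tilted drape, e.g. measured via a depth camera.
projector_px = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
observed_on_drape = [(35, 18), (1890, 60), (1850, 1050), (20, 1005)]

H = fit_homography(projector_px, observed_on_drape)
# Pre-warping the image with the inverse of H makes it appear
# undistorted on the drape despite the tilt of the surface.
H_inv = np.linalg.inv(H)
```

A folded drape would be handled patch by patch rather than with a single plane, and the real system additionally blends three projectors to fill shadowed regions, but the core idea – measure the surface, invert the distortion, project the corrected image – is the same.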
In a living lab modeled on a real operating room, the Oldenburg team is investigating whether and how doctors use the system during a simulated operation. In addition to researching user-friendliness, the team is investigating the stress medical professionals experience when working with projected device interfaces. Data collected via sensors – such as brain waves, heart and respiratory rates, and sweat production – provide indications of stress and exertion. The team aims to ensure that new technologies do not cause additional strain in this way.
“This is all application-based information that we wouldn’t have at all without our Oldenburg partner,” says Zachmann. He is pleased to have found a clinical partner at the University Clinic for Visceral Surgery with a strong interest in research. Weyhe also greatly values collaborating with Zachmann and other computer science professors in Bremen. He has been conducting research with them for ten years to bring new technologies into the operating room. “Thanks to this collaboration, we are now far ahead in this field, even by international standards,” says Weyhe.
While the current project will run until the end of the year, the team is already thinking ahead. They want to explore whether the projected user interfaces can be controlled by eye movements alone. This would enable surgeons to use their hands exclusively for surgery.