
3 - Trusting Robots

Limiting Due Diligence Obligations in Robot-Assisted Surgery under Swiss Criminal Law

from Part I - Human–Robot Interactions and Substantive Law

Published online by Cambridge University Press:  03 October 2024

Sabine Gless, Universität Basel, Switzerland
Helena Whalen-Bridge, National University of Singapore

Summary

How is liability apportioned between the surgeon and the robot in robot-assisted surgery? In Switzerland, a surgeon’s criminal responsibility rests on an obligation of due diligence. It is generally assumed that due diligence and responsibility are divisible among team members and that each individual is responsible for their own actions. The principle of trust (“Vertrauensgrundsatz”) also establishes that a member of a team may trust other members to do their job. The issue discussed in this chapter is the degree of due diligence owed by surgeons who cooperate with robots. In Switzerland, the principle of trust is not applied to a robot assistant, if only because robots cannot be criminally liable themselves. Apart from complete robot failures, surgeons therefore bear the risk of patient injury from surgical robots, in order to avoid a responsibility gap in the law. However, given that surgical robots benefit patients and are becoming the expected standard of care in certain areas, the chapter argues that the principle of trust should be applied to limit the due diligence expected from a surgeon interacting with a robot, if the robot has been appropriately certified.

Type
Chapter
Information
Human–Robot Interaction in Law and Its Narratives
Legal Blame, Procedure, and Criminal Law
pp. 49–72
Publisher: Cambridge University Press
Print publication year: 2024
Creative Commons
This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC-BY-NC 4.0 https://creativecommons.org/cclicenses/

I Introduction

Surgeons have been using automated tools in the operating room for several decades. Even more robots will support surgeons in the future, and at some point, surgery may be completely delegated to robots. This level of delegation is currently fictional and robots remain mostly under the command of the human surgeon. But some robots are already making discrete decisions on their own, based on the combined functioning of programming and sensors, and in some situations, surgeons rely on a robot’s recommendation as the basis for their directions to the robot.

This chapter discusses the legal responsibility of human surgeons working with surgical robots under Swiss law, including robots that notify surgeons about a patient’s condition so the surgeon can take a particular action. Unlike in other jurisdictions, negligence and the related duties of care are defined in Switzerland not only by civil law,Footnote 1 but by criminal law as well.Footnote 2 This chapter focuses on the surgeon’s individual criminal responsibility for negligence,Footnote 3 which is assessed under the general concept of Article 12, paragraph 3 of the Criminal Code of Switzerland (“SCC”).Footnote 4 Under the SCC, the surgeon is required to carry out surgery in accordance with state-of-the-art due diligence.

In the general context of task sharing among humans, which includes surgeons working in a team, a principle of trust (Vertrauensgrundsatz) applies. The principle of trust allows team members to have a legitimate expectation that each participant will act with due diligence. The principle of trust also means that participants are for the most part only responsible for their own actions, which limits their obligations of due diligence. However, when the participant is a robot, even though the surgeon delegates tasks to the robot and relies on it in a manner similar to human participants, the principle of trust does not apply and the surgeon is responsible for what the robot does. Neither statute nor case law clearly states whether the traditional principle of trust applies to robots. However, at this point, the principle has only been applied to humans, and it is safe to assume that it does not apply to robots, mainly because a robot is currently not capable of criminal responsibility under Swiss law.Footnote 5 Applying the principle of trust to robots, with a corresponding limitation on the surgeon’s liability, would therefore create a responsibility gap.Footnote 6

In view of the important role robots play in a surgical team, one would expect governing regulation to apply traditional principles to the division of work between human surgeons and robots, but the use of surgical robots has not led to any relevant changes or to the introduction of special care regulations that either limit the surgeon’s responsibility or allocate it among other actors. This chapter explores an approach to limiting the surgeon’s criminal liability when tasks are delegated to robots. As the SCC does not provide guidance regarding the duties of care when a robot is used, other law must be consulted. The chapter argues that the principle of trust (Vertrauensgrundsatz) should be applied to limit the due diligence expected from a surgeon interacting with a robot. Robots are becoming integral to effective surgery, owing to the specialization that arises from the division of labor between humans and robots and to the increase in more precise and quicker medical-technical solutions for patients. Surgeons must rely to some degree on the expertise of the robots they use, and surgeons who make use of promising robots in their operating room should therefore be subject to a valid and practical approach to due diligence that does not unreasonably expand their liability. While the chapter addresses the need to limit the surgeon’s liability when working with robots, chapter length does not allow for analysis of related issues such as the connection to permissible risk, i.e., once the surgical robot is established in society, its possible risks are accepted because its benefits outweigh them. The chapter also does not address other related issues, such as situations in which a hospital instructs surgeons to use robots, issues arising from the patient’s perspective, or the liability of the manufacturer, except for situations where the robot does not perform as it should or simply fails to function.Footnote 7

The chapter proceeds by articulating the relevant concept of a robot (Section II). A discussion of due diligence (Section III) explains the duties of care and the principle of trust when a surgeon works without a robot (Section III.B), which is followed by a discussion of duties of care when a surgeon works with a robot (Section III.C). The chapter addresses in detail the due diligence expected when a surgical robot asks the human to take a certain action (Section III.C.3). Moving to a potential approach that restricts a surgeon’s criminal liability to appropriate limits, the chapter explores the principle of trust as it could apply to robots (Section III.D), and suggests an approach that applies and calibrates the principle of trust based on whether the robot has been certified (Section III.E). The chapter applies these legal principles to the first stage of surgical robots, which are still dependent on commands from humans to take action and do not contain complete self-learning components. The conclusion (Section IV) looks to the future and shares some brief suggestions about how to deal with likely developments in autonomous surgical robots.

II Terminology: Robots in Surgery

A standardized definition of a robot does not exist.Footnote 8 There is some agreement that a robot is a mechanical object.Footnote 9 In 1920, Karel Capek introduced the term, derived from “robota” (slavish or slave labor),Footnote 10 in his story about artificial slaves who take over humankind.Footnote 11 Thereafter, the term was used in countless other works.Footnote 12 The modern use of the term includes the requirement that a robot has sensors to “sense,” processors to “think,” and actuating elements to “act.”Footnote 13 Under this definition, pure software, which does not interact physically with the world, does not count as a robot.Footnote 14 In general, robots are partly intelligent, adaptive machines that extend the human ability to act in the world.Footnote 15

Traditionally, robots are divided into industrial and service robots. A distinction is also made between professional service robots such as restaurant robots, and service robots for private use such as robot vacuums.Footnote 16 The robots considered in this chapter come under the category of service robots, which primarily provide services for humans as opposed to industrial processes. Among other things, professional service robots can interact with both unskilled and skilled personnel, as in the case of a service robot at a restaurant, or with exclusively skilled personnel, as with a surgeon in an operating room.

In discussions of robots and legal responsibility, the terms “agents” or “autonomous systems”Footnote 17 are increasingly used almost interchangeably with the term robot. To avoid definitional problems, only the term “robot” will be used in the chapter. However, the chapter does distinguish between autonomous and automated robots, and only addresses automated robots over which the surgeon exercises some control, not fully autonomous robots. Fully autonomous robots would have significantly increased autonomy and their own decision-making ability, whereas automated robots primarily execute predetermined movement patterns.Footnote 18 Fully autonomous robots that do not require human direction are not covered in this chapter because innovations in the field of surgery have not yet reached this stage,Footnote 19 although the conclusion will share some initial observations regarding how to approach the liability issues raised by autonomous robots.

III Legal Principles Regarding Due Diligence and Cooperation

Generally applicable principles of law regarding due diligence and cooperation are found in Swiss criminal law. Humans must act with due diligence, and if they do not, they can be liable for negligence. According to Swiss criminal law, any person is liable for lack of care if he or she fails to exercise the duty of care required by the circumstances and commensurate with personal capabilities.Footnote 20 But while it is a ubiquitous principle that humans bear responsibility for their own behavior, we normally do not bear responsibility for someone else’s conduct. We must consider the consequences of our own behavior and prevent harm to others, but we are not our brother’s or sister’s keeper. The scope of liability can change if we share responsibilities, such as risk-prone work, with others.Footnote 21 And whether we act alone or in cooperation with others, we must exercise the care required by the circumstances and commensurate with our personal capabilities.

III.A Basic Rules with Examples Regarding the Due Diligence of Surgeons

Unlike other jurisdictions, Swiss law explicitly defines the basic rule determining criminal negligence. In Article 12, paragraph 3 of the SCC, a “person commits a felony or misdemeanour through negligence if he fails to consider or disregards the consequences of his conduct due to a culpable lack of care. A lack of care is culpable if the person fails to exercise the care that is incumbent on him in the circumstances and commensurate with his personal capabilities.”Footnote 22

Determining a person’s precise due diligence obligations can be a complex endeavor. In Swiss criminal law a myriad of due diligence rules underpin negligence and are used to specify the relevant obligations, including legal norms, private regulations, and a catch-all clause dubbed the risk principle (Gefahrensatz).Footnote 23 The risk principle establishes that everyone has to behave in a reasonable way that minimizes threats to the relevant legal interest as far as possible.Footnote 24 For example, a surgeon must take all reasonably possible precautions to avoid increasing a pre-existing danger to the patient.Footnote 25

To apply the risk principle, the maximum permissible risk must be determined.Footnote 26 For this purpose, the general risk range must first be determined, and this range is limited by human skill;Footnote 27 no one can be reproached for not being able to prevent the risk in spite of doing everything humanly possible (ultra posse nemo tenetur).Footnote 28 The risk range is therefore limited by society’s understanding of the permissible risk, and by the abilities possessed by a capable, psychologically and physically normal person; no superhuman performance is expected.Footnote 29 However, if a person’s ability falls short of what a situation requires, the person should refrain from performing the activity.Footnote 30 In the context of medical personnel, a surgeon who is not familiar with the use of robots may not perform such an operation.

As the law does not list the exact duties of care of a surgeon, it is left to the courts to specify in more detail the content and scope of the medical duties of care based on the relevant statutes and regulations. In that respect, it is not of significance whether the treatment is governed by public or private law.Footnote 31

III.B Due Diligence Standards Specific to Surgeons

Swiss criminal law is applied in the medical field, and every healthcare professional who hurts a patient intentionally or with criminal negligence can be liable.Footnote 32 Surgery is an activity that is, in principle, hazardous, and a surgeon may be prosecuted if he or she, consciously or unconsciously,Footnote 33 neglects a duty of care.Footnote 34 According to the Swiss Federal Supreme Court, the duty of care when applying conventional methods of treatment is based on “the circumstances of the individual case, i.e., the type of intervention or treatment, the associated risks, the discretionary scope and time available to the physician in the individual case, as well as his objectively expected education and ability to perform.”Footnote 35

This reference of the Swiss Federal Supreme Court to the physician’s educational background and ability to perform does not indicate that the standard is entirely subjective. Rather, the physician should be assessed according to the knowledge and skills assumed to be available to representatives of his specialty at the time the measures are taken.Footnote 36 This objective approach creates an ongoing obligation for the further education of surgeons.

Part of a surgeon’s obligation is to provide the patient with a regime of treatment that complies with the generally recognized state of medical art (lex artis),Footnote 37 determined at the time of treatment. Lex artis is the guiding principle for establishing due diligence in an individual case in Swiss criminal law.Footnote 38 It encompasses the entire medical procedure, including the examination, diagnosis, therapeutic decision, and implementation of the treatment, and, in the case of surgeons, everything from preparing the operation to aftercare.Footnote 39 The standard is therefore not what is individually possible and reasonable, but the care required according to medical indications and best practice.Footnote 40 A failure to meet this medical standard leads to a breach of the duty of care. Legal regulation, such as the standards of the Medical Professions Act (“MedBG”),Footnote 41 especially Article 40 lit. a, may be used to determine the respective state of medical art. Together, the regulatory provisions provide for the careful and conscientious practice of the medical profession.Footnote 42

Doctors must also observe and not exceed the limits of their own competence. A surgeon must recognize when they are not able to perform a surgery and need to consult a specialist. This obligation includes the duty to cooperate with other medical personnel, because performing an operation without the required expertise is a breach of duty of care in itself.Footnote 43 As with other areas of medical care, the surgeon’s obligations do not exceed the human ability to foresee events and to influence them in a constructive way.Footnote 44

If there are no legal standards for an area of medical practice, courts may refer to guidelines from medical organizations.Footnote 45 In practice, courts usually refer to the private guidelines of the Swiss Academy of Medical SciencesFootnote 46 and the Code of Conduct of the Swiss Medical Association (“FMH”).Footnote 47 Additionally, general duties derived from court decisions, such as “practising the art of medicine according to recognized principles of medical science and humanity,” can be used in a secondary way to articulate a doctor’s specific due diligence obligation.Footnote 48

III.C Due Diligence of a Surgeon in Robot-Assisted Surgery

New technologies have long been making appearances in operating rooms. Arthrobot assisted for the first time in 1983; responding to voice command, the robot was able to immobilize patients by holding them steady during orthopedic surgery.Footnote 49 Arthrobots are still in use today.Footnote 50

The introduction of robots to surgery accomplishes two main aims: (1) they perform more accurate medical procedures; and (2) they enable minimally invasive surgeries, which in turn increases surgeon efficacy and patient comfort by providing a faster recovery. A doctor is, generally, not responsible for the dangers and risks that are inherent in every medical action and in the illness itself.Footnote 51 However, the surgeon’s obligation of due diligence applies when using a robot. The chapter argues that the precise standards of care should differ, depending on whether the surgeon has control of the robot’s actions or whether the robot reacts independently in the environment, and depending on the extent of the surgeon’s control, including the ability to intervene in a procedure.Footnote 52

The next section introduces and explains the functioning of several examples of surgical robots. These robots qualify as medical devices under Swiss law,Footnote 53 and as such are subject to statutes governing medical devices. Medical devices are defined as instruments, equipment, software, and other objects intended for medical use.Footnote 54 Users of medical devices must take all measures required by the state of the art in science and technology to ensure that they pose no additional risk. The lex artis for treatment incorporating robots under Swiss criminal law requires users to apply technical aids lege artis and operate them correctly. For example, when the robot is used again at a later time, its functionality and correct reprocessing must be checked.Footnote 55 A surgeon does not have to be a trained technician, but he or she must have knowledge of the technology used, similar to the way that a driver must “know” a car, but need not be a mechanic.

On its own, the concept of lex artis does not imply specific obligations, and the specific parameters of the obligations must be determined based on individual circumstances. According to Article 45, paragraph 1 of the Therapeutic Products Act (TPA), a medical device must not endanger the health of patients when used as intended. If a technical application becomes standard in the field, falling below or not complying with the standard (lex artis) is classified as a careless action.Footnote 56 Lack of knowledge of the technology, as well as a lack of control over a device during an operation, gives rise to liability for taking on a task beyond one’s competence (“Übernahmeverschulden”).Footnote 57

A final aspect of the surgeon’s obligations regarding surgical robots is that a patient must always be informedFootnote 58 about the robot before an operation, and the duty of documentationFootnote 59 must be complied with. Although the precise due diligence obligations of surgeons always depend on the circumstances of individual cases, the typical duties of care regarding two different kinds of robots that incorporate elements of remote-control, and the situation in which a robot provides a warning to the surgeon, are outlined below.

III.C.1 Remote-Controlled Robots

The kind of medical robot prevalent today is the remote-controlled robot, also referred to as a telemanipulation system in the medical literature. These robots are controlled completely and remotely by the individual surgeon,Footnote 60 usually from a short distance away via joysticks. An example of a remote-controlled robot is DaVinci, developed by the company Intuitive and primarily used in the fields of urology and gynecology. DaVinci does not decide what maneuver to carry out; it is completely controlled by the surgeon, who works from an ergonomic 3D console using joysticks and foot pedals.Footnote 61 The surgeon’s commands are thus translated directly into actions by the robot. In this case, the robot makes it possible for the surgeon to make smaller incisions and achieve greater precision.

What is the due diligence obligation of a surgeon making use of remote-controlled robots? Remote-controlled robots such as the DaVinci, which have no independence and are not capable of learning, do not present any ambiguities in the law. If injury has occurred, the general Swiss criminal law of liability for negligence holds the surgeon responsible. The robot’s arms are considered to be an extension of the hands of the surgeon, who remains in complete control of the operation.Footnote 62 In fact, the surgeon has always needed tools such as scalpels to operate. Today, thanks to technological progress, the tool has simply become more sophisticated. The surgeon’s duties of care remain the same with a remote-controlled robot as without, and can be stated as follows:Footnote 63 the surgeon must know how the robot works and be able to operate it. Imposing full liability on the surgeon is appropriate here, as the surgeon is in complete control of the robot.

According to Dr. med. Stephan Bauer, a surgeon needs training with DaVinci to work the robot, including at least 15 operations with the console control to become familiar with the robot, and 50 more to be able to operate it correctly.Footnote 64 The surgeon must also attend follow-up training and regular education in order to fulfil his or her duty of care. This degree of training is not currently specified in any medical organization’s guideline, but it is usually recommended by the manufacturer. The surgeon must also be able to instruct and supervise his or her surgical team sufficiently, and should not use a remote-controlled robot if there is insufficient knowledge of the type of operation it will be used in. Lastly, the surgeon must be able to complete the operation without the robot. These principles are basic aspects of any kind of medical due diligence in Switzerland, and they must apply in any kind of modern medicine such as the use of surgical robots.Footnote 65

Medical doctors who do not fulfil the duty of care and supervision for a remote-controlled robot can be held criminally responsible to the same degree as if they had used a scalpel directly on a patient’s body. If, however, injury occurs due to a malfunction of the robot, such as movements that do not comply with the surgeon’s instructions or a complete failure during the operation, the manufacturer,Footnote 66 or the person responsible for ensuring the regular maintenance of the device,Footnote 67 could be held criminally responsible.

III.C.2 Independent Surgical Robots

Some surgical robots in use today have dual capabilities. The responsible surgeon plans and programs the robot’s motion sequences in advance, and the robot carries out those steps without further instruction from the surgeon; in addition, the robot can perform certain tasks independently, based on the combined functioning of its sensors and its general programming. These robots are referred to here as “independent robots,” to indicate that their abilities are not limited to remote-controlled actions, and to distinguish them from fully autonomous robots capable of learning.

An example of an independent robot with dual capabilities is the Smart Tissue Autonomous Robot (STAR),Footnote 68 which carries out pre-programmed instructions from the surgeon, but which can also automatically stitch soft tissue. Using force and motion sensors and cameras, it is able to react to unexpected tissue movements while functioning.Footnote 69 In 60 percent of cases, it does not require human assistance to do this stitching; in the remaining cases, it needs only minimal input from the surgeon.Footnote 70 Although the stitching currently requires more time than the traditional technique by a human, it delivers better results.Footnote 71 Another example, the Cold Ablation Robot-guided Laser Osteotome (CARLO),Footnote 72 is able to cut bones independently after receiving the surgeon’s instructions, and it can also use sensors to check whether the operation is going smoothly.Footnote 73 According to the manufacturer Advanced Osteotomy Tools (AOT),Footnote 74 CARLO is thus the “world’s first medical, tactile robot that can cut bone … with cold laser technology. The device allows the surgeon to perform bone operations with unprecedented precision, and in freely defined, curved and functional sectional configurations, which are not achievable with conventional instruments.”Footnote 75 In summary, CARLO’s lasers open up new possibilities in bone surgery.

Independent robots have the advantage of extreme precision, and they do not suffer from human deficits such as fatigue, stress, or distraction. Among other benefits, use of these robots decreases the duration of hospitalization, as well as the risks of infection and pain for the patient, because the incision and the injury to the tissue are minimal. When independent robots function as intended, surgery time is usually shortened, accidents due to hand trembling of the surgeon are reduced, and improved 3D visualization can be guaranteed.

As noted above, a surgeon is fully responsible for injury caused by a remote-controlled robot, in part because the surgeon has full control over the robot, which can be viewed as an extension of the surgeon’s own hands. What are a surgeon’s due diligence obligations when using an independent surgical robot? When independent surgical robots use their ability to make decisions on their own, should criminal responsibility be transferred to, or at least shared with, say, the manufacturer, particularly in cases where it was not possible for the surgeon to foresee the possible injury?

To the extent that independent robots are remote-controlled, i.e., simply carrying out the surgeon’s instructions, surgeons must continue to comply with the duties of care that apply when using a remote-controlled robot, including the accurate operation, control, and maintenance of the robot. A surgeon’s obligations regarding a careful operation while using an independent robot include, prior to the operation, the correct definition of the surgical plan and the programming of the robot. The surgeon must also write an operation protocol, disinfect the area, and make the first incision.Footnote 76 In addition, further duties arise under Swiss law because of the independence of the robot in carrying out the instructions the surgeon provided earlier, i.e., non-contemporaneous instructions.Footnote 77 During the operation, the surgeon must observe and monitor the movements of the robot so that he or she can intervene at any time upon realizing that harm may occur. According to the manufacturer AOT,Footnote 78 CARLO “allows the surgeon full control over this … osteotomy device at any time.” This standard of supervision is appropriate, because the surgeon’s supervision is needed to prevent injury, but as reviewed below, there are limits to what can be expected of a surgeon supervising a robot.

Even if a surgeon complies with the obligations to take precautions and carry out surveillance of the surgery while it is ongoing, a surgical robot may still make a mistake, e.g., cutting away healthy tissue. If it is established that a cautious and careful surgeon in the same position would not have been able to regain control of the robot and avoid the injury, the surgeon is deemed not to have violated his or her duty of care or to have acted in a criminally negligent manner.Footnote 79 If this occurs, no criminal charges will be brought against the surgeon. This standard is also appropriate, because proper supervision could not have prevented the injury.

III.C.3 Due Diligence after a Robot Warning

Under the lex artis, a surgeon using any kind of surgical robot is required to be knowledgeable regarding the functionality of the robot, including its emergency and safety functions and its messages and warning functions.Footnote 80 A human surgeon using a robot for surgery cannot blindly trust the technology, and current law requires the surgeon to supervise the robot and check whether intervention is required and whether a change of plan is necessary. In the event that the robot fails, or issues a warning signal, the human must complete the surgery without the assistance of the robot. If the robot issues an alert, the human surgeon must always be capable of checking whether the notification is correct and of reacting adequately.Footnote 81 If the human surgeon is not capable of taking over, Swiss law imposes liability according to a sort of organizational negligence, the “Übernahmeverschulden,” the principle that a person who assumes a task he or she cannot handle properly, and thereby causes harm, has acted negligently.Footnote 82 If an alert is ignored because the surgeon does not understand its significance or is not monitoring adequately, the surgeon also acts in a criminally negligent manner.

If the surgeon perceives the robot’s alert but assesses that the robot’s advice is wrong, the surgeon may override it. There is a saying in Switzerland that also applies to a surgeon who relies, though never completely, on a surgical robot: “Trust is good, verification is better.” In a clearly established cooperation between a surgeon and a robot, if the surgeon decides not to follow an alert from the robot, the surgeon needs a valid justification. For example, if CARLO notifies the surgeon that the bone cannot be cut in a certain way and the surgeon decides to proceed anyway, there would need to be a documented justification for his or her decision to overrule the robot.

While the current requirement of surgeon supervision of robots is justified generally, the law needs some adjustment. There must be a limit to a surgeon’s obligation to constantly monitor and question robot alerts, because otherwise a surgeon–robot cooperation would be unworkably inefficient. It would also result in unjustifiable legal obligations, based on a superhuman expectation that the surgeon monitors every second of the robot’s action. Surgeons are considered to be the “guarantors of supervision,”Footnote 83 which means that they are expected to control everything that the robot does. But when it is suitably established that robots perform more accurately than the average human medical professional in the field, the human must be allowed to step out of the process to some degree. For example, a surgeon would always need to go through the whole operating plan to be sure that robots such as STAR or CARLO are functioning properly. However, this obligation to double-check the robot should not apply to every minute movement the robot makes, as an obligation like this would be contrary to the purpose of innovative technology such as surgical robots, which were invented precisely for the purposes of greater accuracy and time-saving.

Additionally, when it is established that a surgical robot performs consistently without making unacceptable mistakes, there will be a point where it would be wiser for the surgeon not to second-guess the robot and, in the case of a warning or alert, to follow its directions. In fact, ignoring the directions of a surgical robot that is part of the medical state of the art and acts correctly to an acceptable degree is likely to lead to liability for negligence, if not for intentional conduct.

III.D Limiting the Surgeon’s Due Diligence Obligations regarding Surgical Robots through the Principle of Trust (Vertrauensgrundsatz)?

The surgeon’s obligation of supervision currently imposes excessive liability for the use of surgical robots: as discussed above, while surgeons rightfully have obligations to monitor the robot, they should not be required to check every movement the robot makes before it proceeds. The chapter argues that, in the context of robot supervision, variations of the principle of trust (Vertrauensgrundsatz) should apply to limit the surgeon’s criminal liability.

When a surgeon works with human team members, the legitimate expectation is that individuals are responsible only for their own conduct and not that of others. The principle of trust is a foundational legal concept, one that enables effective cooperation by identifying spheres of responsibility and limiting the duties of due diligence to those spheres. It relieves individuals from having to evaluate the risk-taking of every individual in the team in every situation, and allows for the effective division of expertise and labor. The principle of trust was developed in the context of road traffic regulation, but it has widespread relevance and is applied today in medical law as well as other areas.Footnote 84

The principle of trust has limits and does not provide a carte blanche justifying all actions. If there are concrete indications that trust is unjustified, one must analyze and address that situation.Footnote 85 An example regarding surgical robots might be the DaVinciFootnote 86 robot. It has been in use for a long time, but if a skilled surgeon notices that the robot is defective, the surgeon must intervene and correct the defect.

The limitations of due diligence arising out of the principle of trust are well established in medical law, an environment where many participants work together based on a division of expertise and labor. In an operating room, several different kinds of specialists are normally at work, such as anesthesiologists, surgeons, and surgical nurses. The principle of trust in this environment limits responsibility to an individual’s own area of expertise and work.Footnote 87

One way of understanding the division of labor in surgery is that the primary area is the actual task, i.e., the operation, and the secondary area is supervisory, i.e., being alert to and addressing the misconduct of others.Footnote 88 Supervisory responsibility can be imposed horizontally (surgeon–surgeon) or vertically (surgeon–nurse), depending on the position a person occupies in the operating room. An example of the horizontal division of labor in the medical context would be if several doctors are assigned equal and joint control, with all having an obligation to coordinate the operation and monitor one another. If an error is detected, an intervention must take place, and if no error is detected, the competence of the other person can be trusted.Footnote 89 With vertical division of labor, a delegation to surgical staff such as assistants or nursing professionals requires supervisory activities such as selection, instruction, and monitoring. The important point here is that whether supervision is horizontal or vertical, the applicability of the principle of trust is not predicated upon constant control.Footnote 90

So far, the principle of trust has only been applied to the behavior of human beings. This chapter argues that the principle of trust should be applied to surgical robots, when lex artis requires it. First, as a general principle, delegation of certain activities must be permitted. Surgeons cannot perform an operation on their own, as this would, in itself, be a mistake in treatment.Footnote 91 Second, regarding robots in particular, given the degree to which surgical robots offer better surgical treatment, surgeons should use them as part of the expected standard of medical treatment.

But can robots, even certified robots, be equated with another human in terms of trustworthiness? Should a surgeon trust the functioning of a robot, and in what situations is trust warranted? The chapter argues that a variation of the principle of trust should be applied to a surgeon’s use of surgical robots. Specifically, an exception to the non-application of the principle of trust for robots should be created for robots that have been certified as safe by a competent authority, referred to here as certification-based trust. Until the certification is awarded, the principle of mistrust (Misstrauensgrundsatz) should apply. This approach would also impose greater responsibility on the surgeon if, e.g., the robot used by the surgeon was still in a trial phase, or had a lower level of approval from the relevant authorities.Footnote 92

The concept of certification-based trust is supported by the principle of permissible risk. It is a fact that people die in the operating room, because medical and surgical procedures are associated with a certain degree of risk to health or life, but in Switzerland, this is included in the permissible risk.Footnote 93 There is no reason why this level of acceptable risk should not apply to surgical robots. According to Olaf Dössel:Footnote 94

[t]rust in technology is well founded if (a) the manufacturer has professionally designed, constructed and operated the machinery, (b) safety and reliability play an important role, (c) the inevitable long-term fatigue has been taken into account, and (d) the boundary conditions of the manufacturer remain within the framework established when the machinery was designed.

A certification-based trust approach is also consistent with other current practices, e.g., cooperating with newcomers in a field always requires a higher duty of care. When the reliability and safety of surgical robots becomes sufficiently established in practice, the principle of trust should then be applied, to establish the surgeon’s due diligence obligations within the correct parameters.

III.E Certified for Trust

This chapter argues that surgeons working with surgical robots can develop a legitimate expectation of trust consistent with principles of due diligence if the robot they use is certified. This approach to surgeon liability places increased importance on the process of medical device certification, which is discussed further below.

Certification of medical devices is a well-developed area. In addition to the TPAFootnote 95 and the Medical Devices Ordinance,Footnote 96 other standards apply, including Swiss laws and ordinances, international treaties, European directives, and other international requirements.Footnote 97 These standards define the safety standards for the production and distribution of medical devices.Footnote 98

Swiss law requires that manufacturers keep up with the current state of scientific and technical knowledge, and comply with applicable standards when distributing the robot.Footnote 99 Manufacturers of surgical robots must successfully complete a conformity assessment procedure in Switzerland.

A robot with a CE-certification can be placed on the market in Switzerland and throughout the European Union.Footnote 100 A CE-certification mark means that a product has been “assessed by the manufacturer and deemed to meet EU safety, health and environmental protection requirements.”Footnote 101 For the robot to be used in an operating room in Switzerland, a CE-certificationFootnote 102 must be issued by an independent certification body.Footnote 103 After introducing the robot to the market, the manufacturer remains obliged to check its product.Footnote 104

This chapter argues that a surgeon’s due diligence obligations when using a surgical robot should be limited by a principle of trust, and that the principle should apply when the robot is certified. A certification-based trust approach is consistent with Dössel’s suggestion that trust in technology is well-founded if, inter alia, the manufacturer has professionally designed, constructed, and operated the machinery.Footnote 105 It is currently not an accepted point of law that the CE-certification is a sufficient basis for the user to trust the robot and not be held criminally responsible, but the chapter suggests that as a detailed, well-established standard, the CE-certification is an example of a certification that could form the basis of application of the principle of trust.

If the principle of certification-based trust is adopted, the surgeon would still retain other due diligence obligations, including the duty to inform patients about the risks involved in a robot’s use.Footnote 106 This particular duty will likely become increasingly important over time, as the performance range of surgical robots increases.

IV Conclusion

Today, lex artis requires surgeons to ensure the performance of the robot assistant and comply with its safety functions. The human surgeon must maintain the robot’s functionality, monitor it during an operation, and be ready to take over if needed. Requiring surgeons to supervise the robots they use is a sound position, but surgeons should not be expected to monitor the robot’s every micro-movement, as that would interfere with the functioning of surgical robots and the benefits to patients. However, under current Swiss law, the surgeon is liable for all possible injury, unless the robot’s movements do not comply with the surgeon’s instructions or there is a complete failure of the robot during the operation.

Surgeons working with surgical robots are therefore accountable for robotic action to an unreasonable degree, even though the robot is used to enhance the quality of medical services. Thus, a strange picture emerges in Swiss criminal law. In a field where robotics drive inventions that promise to make surgery safer, surgeons who use robots run a high risk of criminal liability if the robot inflicts injury. Conversely, if the surgeon does not rely on new technology and performs alone an operation that could generally be performed better and more safely by a robot, the surgeon could also be liable. This contradictory state of affairs requires regulatory reform, with a likely candidate being the application of certification-based trust that confines the surgeon’s liability within appropriate limits.

This chapter has addressed issues raised by the robots being used today in operating rooms, including remote-controlled and independent surgical robots. The chapter has not addressed more advanced, self-learning robots. Given that the law already requires reform regarding today’s robots, even larger legal issues will be raised when it becomes necessary to determine who is responsible in the event of injury by autonomous robots,Footnote 107 those capable of learning and making decisions. In this context, it will be more difficult to determine whether a malfunction was due to the original programming, subsequent robot “training,”Footnote 108 or other environmental factors.Footnote 109 Surgeons may also find that robots capable of learning act in unpredictable ways, making harm unavoidable even with surgeon supervision. In the case of unpredictable robot action, a surgeon should arguably be able to rely on the technology and avoid criminal negligence, provided it has a CE-certification. Ever-increasing amounts of due diligence, such as constant monitoring, are not desirable with today’s or tomorrow’s robots, because the robot is supposed to relieve the surgeon’s workload and should be considered competent to do so if it is certified.

Footnotes

* The author owes great thanks for the outstanding support regarding this chapter to Prof. Dr. Sabine Gless and Assoc. Prof. Helena Whalen-Bridge.

1 Entscheid des Bundesgerichts (Decision of the Swiss Federal Court) BGE 133 III 121 E. 3.1; BGE 115 Ib 175 E. 2b; BGE 139 III 252 E. 1.5 (decisions of the Swiss Federal Court are cited as BGE, by volume and starting page; all decisions are available online at: www.bger.ch).

2 See e.g., Christopher Geth, Strafrecht Allgemeiner Teil (Criminal Law General Part) (Basel, Switzerland: Helbing Lichtenhahn Verlag, 2021) [Strafrecht Allgemeiner Teil] at 170. Regarding the civil responsibility of a doctor, see Lisa Blechschmitt, Die straf- und zivilrechtliche Haftung des Arztes beim Einsatz roboterassistierter Chirurgie (The Criminal and Civil Liability of Physicians When Using Robot-Assisted Surgery) (Baden-Baden, Germany: Nomos, 2017).

3 Strafgesetzbuch (Swiss Criminal Code), SR 311.0 (as amended January 23, 2023) [SCC], Art. 12, para. 3, www.fedlex.admin.ch/eli/cc/54/757_781_799/en. Negligence differs from intentional action under Art. 12, para. 2, according to which someone intentionally commits a crime or misdemeanor if they carry out the act with knowledge and will.

4 SCC, note 3 above, Art. 12, para. 3.

5 Regarding the ongoing discussion of an e-personhood for robots, see e.g., Martin Zobl & Michael Lysakowski, “E-Persönlichkeit für Algorithmen?” (E-Personhood for Algorithms?) (2019) 1 Digma 42.

6 See Chapter 15 in this volume.

7 See Section III in this chapter, and Chapter 4 in this volume.

8 Neil Richards & William Smart, “How Should the Law Think about Robots?” in Ryan Calo, A. Michael Froomkin, & Ian Kerr (eds.), Robot Law (Cheltenham, UK: Edward Elgar, 2016) 3 [“Think about Robots”].

9 Melinda Florina Müller, “Roboter und Recht” (Robots and Law) (2014) 5 Aktuelle Juristische Praxis 595; Isabelle Wildhaber & Melinda Florina Lohmann, “Roboterrecht – eine Einleitung” (Robotlaw – An Introduction) (2017) 2 Aktuelle Juristische Praxis 135.

10 Susanne Beck, “Grundlegende Fragen zum Umgang mit der Robotik” (Basic Questions about the Use of Robotics) (2009) 6 Juristische Rundschau 225.

11 Thomas Christaller, Michael Decker, M. Joachim Gilsbach et al., Robotik (Robotics) (Berlin, Germany: Springer, 2001) [Robotik] at 18; Karel Capek, “R.U.R.” (play written in 1920, and premiered in Prague in 1922).

12 See e.g., Isaac Asimov, The Complete Robot (London, UK: Harper Collins, 1983).

13 George Bekey, Autonomous Robots: From Biological Inspiration to Implementation and Control (Cambridge, MA: MIT Press, 2005) 2.

14 See also George A. Bekey, “Current Trends in Robotics” in Patrick Lin, Keith Abney, & George Bekey (eds.), Robot Ethics (Cambridge, MA: MIT Press, 2012) 17; “Think about Robots”, note 8 above, at 6: “… our definition excludes wholly software-based artificial intelligences that exert no agency in the physical world.”

15 Robotik, note 11 above, at 5.

16 IFR-Website (International Federation of Robotics), https://ifr.org/.

17 More often for programs and artificial intelligence, not necessarily only for robots.

18 Using the example of driving, Daimler, “Information on Daimler AG,” www.daimler.com/innovation/case/autonomous/rechtlicher-rahmen.html; Aleks Attanasio, Bruno Scaglioni, Elena De Momi et al., “Autonomy in Surgical Robotics” (2021) 4 Annual Review of Control, Robotics, and Autonomous Systems 651, www.annualreviews.org/doi/abs/10.1146/annurev-control-062420-090543?casa_token=6SiJq_gdMesAAAAA:ykrIDELrN9BO1-Z63N2jcLiZ8ggbiPnLyTp4n65jy5LMz_Ov-Wko-h1yWeBQTAjVVOyHQnqjV94VSg.

19 Examples from different areas: Rolf H. Weber, “Automatisierte Entscheidungen: Perspektive Grundrechte” (Automated Decisions: Fundamental Rights Perspective) (2020) 1 SZW 18, section III; Atlas der Automatisierung, Automatisierte Entscheidungen und Teilhabe in Deutschland (Atlas of Automation, Automated Decisions and Participation in Germany) (AlgorithmWatch, 2019) 26, https://atlas.algorithmwatch.org/wpcontent/uploads/2019/04/Atlas_of_Automation_by_AlgorithmWatch.pdf. For definitions of autonomy in robotic-assisted surgery, see Guang-Zhong Yang, James Cambias, Kevin Cleary et al., “Medical Robotics – Regulatory, Ethical and Legal Considerations for Increasing Levels of Autonomy” (2017) 2:4 Science Robotics 2.

20 SCC, note 3 above, Art. 12, para. 3.

21 See, for a detailed analysis, Nathalia Bautista Pizzaro, Das erlaubte Vertrauen im Strafrecht (The Permissible Trust in Criminal Law), Strafrecht Studien vol. 77 (Zurich, Switzerland and Baden-Baden, Germany: Nomos, 2017).

22 SCC, note 3 above, Art. 12, para. 3.

23 Andreas Donatsch, Stefan Heimgartner, Berhard Isenring et al. (eds.), Kommentar zum Schweizerischen Strafgesetzbuch (Commentary on the Swiss Criminal Code), 20th ed. (Zürich: Orell Fussli, 2018) [Schweizerischen Strafgesetzbuch], at Art. 12 Note 15.

24 Andreas Donatsch, Sorgfaltsbemessung und Erfolg beim Fahrlässigkeitsdelikt (Due Diligence and Success in the Crime of Negligence) (Zürich, Switzerland: Schulthess Verlag, 1987) [Sorgfaltsbemessung] at 117.

25 See Günther Stratenwerth, Schweizerisches Strafrecht (Swiss Criminal Law), Allgemeiner Teil I: Die Straftat, 4th ed. (Bern, Switzerland: Stampli, 2011) [Schweizerisches Strafrecht] at s. 16 N 9.

26 Sorgfaltsbemessung, note 24 above, at 128; Andreas Donatsch & Brigitte Tag, Strafrecht I (Criminal Law I), 9th ed. (Zürich, Switzerland: Schulthess Verlag, 2013) [Strafrecht I] at 343; BGE 90 IV 11, BGE 116 IV 308, BGE 117 IV 61, BGE 118 IV 133, BGE 121 IV 14, BGE 129 IV 121; for the permitted risk in the context of autonomous vehicles, see also Nadine Zurkinden, “Strafrecht und selbstfahrende Autos – ein Beitrag zum erlaubten Risiko” (Criminal Law and Self-driving Cars – A Contribution to the Permitted Risk) (2016) 3 Recht 144 [“Selbstfahrende Autos”].

27 Sorgfaltsbemessung, note 24 above, at 156.

28 Ibid. at 144; Schweizerisches Strafrecht, note 25 above, at s. 16 N 10; BGE 127 IV 44, BGE 130 IV 14.

29 Sorgfaltsbemessung, note 24 above, at 130, 146, and 154; Strafrecht I, note 26 above, at 345.

30 Sorgfaltsbemessung, note 24 above, at 154; Marcel Alexander Niggli & St. Maeder, “Article 12” in Marcel Alexander Niggli & Hans Wiprächtiger (eds.), Basler Kommentar, Strafrecht I (Basel Commentary Criminal Law), 3rd ed. (Basel, Switzerland: Helbing Lichtenhahn Verlag, 2013) at N 102; BGE 73 IV 180, BGE 80 IV 49, BGE 106 IV 264, BGE 106 IV 312, BGE 135 IV 70 et seq.

31 BGE 139 III 252 E. 1.5; BGE 133 III 121 E. 3.1; BGE 115 Ib 175 E. 2b; The general duties of physicians and hospitals are not considered here; for details of the contractual relationships between patient and physician or patient and hospital, see Walter Fellmann, “Arzt und das Rechtsverhältnis zum Patienten” (Doctor and the Legal Relationship with the Patient) in Moritz Kuhn & Thomas Poledna (eds.), Arztrecht in der Praxis, 2nd ed. (Zürich, Switzerland: Schulthess Verlag, 2007) 103 [“Rechtsverhältnis zum Patienten”] at 106.

32 Anna Petrig & Nadine Zurkinden, Swiss Criminal Law (Zürich, Switzerland: Dike Verlag, 2015) [Swiss Criminal Law] at 108.

33 Ibid. “Consciously” means that the person disregards the consequences of his or her behavior through a violation of the duty of care: the person has considered it possible that the result might occur, but hopes that it will not. A person acts unconsciously if he has not considered the possibility of the result occurring at all, although he should have noticed it. Both are treated equally in Swiss law.

34 Swiss Criminal Law, note 32 above, at 108.

35 BGE 133 III 121 E. 3.1; BGE 120 II 248 E.2c.

36 However, successful treatment is not owed (BGE 133 III 121 E.3.1). Generally accepted and valid principles of medical science are: professional treatment and reasonable care. Thomas Gächter & Dania Tremp, “Arzt und seine Grundrecht” (Doctor and His Fundamental Right) in Moritz Kuhn & Thomas Poledna (eds.), Arztrecht in der Praxis, 2nd ed. (Zürich, Switzerland: Schulthess Verlag, 2007) 7; “Rechtsverhältnis zum Patienten”, note 31 above, at 120.

37 Gunther Arzt, “Die Aufklärungspflicht des Arztes aus strafrechtlicher Sicht” (The Physician’s Duty to Inform from a Criminal Law Perspective) in Wolfgang Wiegand (ed.), Arzt und Recht, Berner Tage für die juristische Praxis (Bern, Switzerland: Stampli, 1985) 52 at Diskussion 73. Wiegand stated as late as 1985 that, according to the Swiss Federal Supreme Court, the exercise of the medical profession requires a certain boldness, which lawyers must never restrict. In 1987, however, the Swiss Federal Supreme Court corrected these earlier cited decisions and stated in BGE 113 II 429, 432 E.3a that limiting “… the liability of doctors to severe violations of the duty of care … is not supported by the law.” See also BGE 116 II 519, 521 E. 3: “According to the most recent case law of the Swiss Federal Supreme Court, the liability of physicians is not limited to severe violations of the medical art.”

38 See BGE 134 IV 175, E. 3.2, 177 et seq.; 130 IV 7, E. 3.3, 11 et seq.; 120 Ib 411, E. 4a, 412 et seq.; 113 II 429, E. 3a, 431 et seq.; 66 II 34, 35 et seq.; 64 II 200, E. 4a, 205 f; Antoine Roggo & Daniel Staffelbach, “Offenbarung von Behandlungsfehlern/Verletzung der ärztlichen Sorgfaltspflicht, Plädoyer für konstruktive Kommunikation” (Disclosure of Treatment Errors/Violation of the Medical Duty of Care, Plea for Constructive Communication) (2006) 4 Aktuelle Juristische Praxis/PJA 407; Moritz Kuhn, “Artz und Haftung aus Kunst- bzw. Behandlungsfehlern” (Physician and Liability Arising from Malpractice or Medical Malpractice) in Moritz Kuhn & Thomas Poledna (eds.), Arztrecht in der Praxis, 2nd ed. (Zürich, Switzerland: Schulthess Verlag, 2007) 601 [“Artz und Haftung”] at 601 and 669. Depending on the success of the offense, (negligent) bodily injury offenses are mainly considered after SCC, note 3 above, Arts. 122, 123, 125, or 126; BGE 134 IV 175 et seq.; BGE 130 IV 7 et seq.

39 Ulrich Schroth, “Die strafrechtliche Verantwortlichkeit des Arztes bei Behandlungsfehlern” (The Criminal Liability of the Physician in Cases of Medical Malpractice) in Claus Roxin & Ulrich Schroth (eds.), Handbuch des Medizinstrafrechts, 4th ed. (Stuttgart, Germany: Richard Boorberg Verlag, 2010) 125 [“Strafrechtliche Verantwortlichkeit”]; Brigitte Tag, “Strafrecht im Arztalltag” (Criminal Law in the Everyday Life of a Doctor) in Moritz Kuhn & Thomas Poledna (eds.), Arztrecht in der Praxis, 2nd ed. (Zürich, Switzerland: Schulthess Verlag, 2007) 669 [“Strafrecht im Arztalltag”] at 685.

40 “Rechtsverhältnis zum Patienten”, note 31 above, at 121.

41 Bundesgesetz über die universitären Medizinalberufe (Medical Professions Act), Switzerland, SR 811.11 (with effect from June 23, 2006), www.fedlex.admin.ch/eli/cc/2007/537/de.

42 “Rechtsverhältnis zum Patienten”, note 31 above, at 124.

43 “Strafrecht im Arztalltag”, note 39 above, at 669.

44 Schweizerischen Strafgesetzbuch, note 23 above, at s. 12 N 20.

45 BGE 130 IV 7, E. 3.3, 11 et seq. It is stated in the “Botschaft zum MedBG (Medizinalberufegesetz)” that the code of conduct of the FMH can be used for the interpretation of the open law.

46 Swiss Academy of Medical Sciences (SAMW/ASSM), www.samw.ch/en.html; for the Project on Artificial Intelligence, see www.samw.ch/de/Projekte/Uebersicht-der-Projekte/Kuenstliche-Intelligenz.html.

47 FMH Homepage, https://fmh.ch/.

48 BGE 130 IV 7, E. 3.3, 11 et seq.; Strafrecht Allgemeiner Teil, note 2 above, at 160.

49 Olga Lechky, “World’s First Surgical Robot in B.C.,” The Medical Post (November 12, 1985), www.brianday.ca/imagez/1051_28738.pdf.

50 See e.g., Alex Nemiroski, Yanina Y. Shevchenko, Adam A. Stokes et al., “Arthrobots” (2017) 4:3 Soft Robotics 183.

51 “Artz und Haftung”, note 38 above, at 601.

52 See also Jan-Philipp Günther, Roboter und rechtliche Verantwortung (Robots and Legal Responsibility) (Munich, Germany: Herbert Utz Verlag, 2016) [Rechtliche Verantwortung].

53 Federal Act on Medicinal Products and Medical Devices, Therapeutic Products Act, TPA, Switzerland, SR 812.21 (as amended January 1, 2022), www.fedlex.admin.ch/eli/cc/2001/422/en [TPA]; and the Medical Devices Ordinance, Switzerland, SR 812.213 (as amended August 1, 2020), www.fedlex.admin.ch/eli/cc/2001/520/en [MedDO] specify the classification as a medical device. According to Swiss law, the classification as a medical device does not depend on whether or not it acts directly on the human body: only the purpose is relevant (judgment of the Swiss Federal Administrative Court C-669/2016 of September 17, 2018, E.5.1.2; judgment of the Swiss Federal Court 2A.504/2000 of February 28, 2001, E.3).

54 MedDO, note 53 above, Art. 1.

55 TPA, note 53 above, Art. 49; MedDO, note 53 above, Art. 19, para. 1 and Art. 20, para. 1.

56 Monika Gattiker, “Arzt und Medizinprodukte” (Physician and Medical Devices) in Moritz Kuhn & Thomas Poledna (eds.), Arztrecht in der Praxis, 2nd ed. (Zürich, Switzerland: Schulthess Verlag, 2007) 495.

58 Iris Herzog-Zwitter, “Die Aufklärungspflichtverletzung und ihre Folgen” (The Breach of the Duty of Disclosure and its Consequences) (2010) HAVE 316 at 318. On the duty of information, see in general, Walter Fellmann, “Aufklärung von Patienten und Haftung des Arztes” (Information of Patients and Liability of the Physician) in Bernhard Rütsche (ed.), Medizinprodukte: Regulierung und Haftung (Bern, Switzerland: Stampfli, 2013) 171; BGE 119 II 456 = Pra 1995 Nr. 72 E.2c.

59 BGE 141 III 363 E.5.1.

60 Azad Shademan, Ryan S. Decker, Justin D. Opfermann et al., “Supervised Autonomous Robotic Soft Tissue Surgery” (2016) 8:337 Science Translational Medicine 1 [“Soft Tissue Surgery”].

62 Rechtliche Verantwortung, note 52 above, at 255f.

63 See Jonela Hoxhaj, Quo vadis Medizintechnikhaftung?: Arzt-, Krankenhaus- und Herstellerhaftung für den Einsatz von Medizinprodukten (Quo vadis Medical Technology Liability? Physician, Hospital, and Manufacturer Liability for the Use of Medical Devices) (Frankfurt, Germany: Peter Lang Verlag, 2000) at 85.

64 Hirslanden, Profile of Dr. med. Stephan Bauer, www.hirslanden.ch/de/corporate/aerzte/1/dr-med-stephan-bauer.html; Martina Bortolani, “Dr. Robotnik, übernehmen Sie!” (Dr. Robotnik, Take Over!) Blick (July 3, 2016), www.blick.ch/life/gesundheit/medizin/wenn-die-maschine-operiert-dr-robotnik-uebernehmen-sie-id5213024.html.

65 Decision of the Swiss Federal Court on telemedicine: BGE 116 II 519, E.3. Although this is a civil law decision, there is no apparent reason why its principles should not also apply to the assessment under criminal law.

66 Sabine Gless, “Strafrechtliche Produkthaftung” (Criminal Product Liability) (2013) 2 Recht 54 [“Strafrechtliche Produkthaftung”] at 56: A manufacturer must place on the market only products that are free from defects according to the state of the art in science and technology. See also Chapter 2 in this volume.

67 “Strafrechtliche Produkthaftung”, note 66 above, at 54: Infringement of the duty to inspect and monitor.

68 Star Automation, “Cartesian Robots – Es-II Series” (Smart Tissue Autonomous Robot), www.star-europe.com/en/prodotti/robot-cartesiani-serie-es-ii-4.

69 “Soft Tissue Surgery”, note 60 above.

70 Star Automation, “Robot cartesiani serie Es-II,” www.star-europe.com/es-ii/; Nicola von Lutterotti, “Der Roboter übernimmt” (The Robot Takes Over), Neue Zürcher Zeitung (May 16, 2016), www.nzz.ch/wissenschaft/medizin/intelligente-medizinaltechnik-der-roboter-uebernimmt-ld.82237?reduced=true.

71 Werner Pluta, “Operationsroboter übertrifft menschliche Kollegen” (Surgical Robot Outperforms Human Colleagues), Golem.de (May 9, 2016), www.golem.de/news/robotik-operationsroboter-uebertrifft-menschliche-kollegen-1605-120779.html.

72 See AOT, “CARLO,” https://aot.swiss/carlo/ [“CARLO”].

73 Santina Russo & Noemi Lea Landolt, “Der überflüssige Chirurg: Schon bald sägen Roboter unsere Schädel auf” (The Superfluous Surgeon: Robots Will Soon Be Sawing Open Our Skulls), Aargauer Zeitung (April 23, 2016), www.aargauerzeitung.ch/leben/der-ueberfluessige-chirurg-schon-bald-saegen-roboter-unsere-schaedel-auf-ld.1550792.

74 “CARLO”, note 72 above.

76 “Rechtsverhältnis zum Patienten”, note 31 above, at 103.

77 See also Rechtliche Verantwortung, note 52 above, at 255f.

78 “CARLO”, note 72 above.

79 Sabine Gless & Thomas Weigend, “Intelligente Agenten und das Strafrecht” (Intelligent Agents and Criminal Law) (2014) 126:3 ZStW 561; Nora Markwalder & Monika Simmler, “Roboterstrafrecht, zur strafrechtlichen Verantwortlichkeit von Robotern und künstlicher Intelligenz” (Robot Criminal Law: On the Criminal Liability of Robots and Artificial Intelligence) (2017) 2 Aktuelle Juristische Praxis 177. In the context of autonomous cars, see “Selbstfahrende Autos”, note 26 above; Alexander Schorro, “Autonomes Fahren – erweiterte strafrechtliche Verantwortlichkeit des Fahrzeughalters?” (Autonomous Driving – Extended Criminal Liability of the Vehicle Owner?) (2017) 1 ZStrR 81; regarding self-driving cars, see also Chapters 2 and 4 in this volume.

80 See also Rechtliche Verantwortung, note 52 above, at 255f.

81 Regarding robot testimony, see Chapters 6 and 8 in this volume.

82 A more detailed description can be found under Section III.A.

83 “Strafrecht im Arztalltag”, note 39 above, at 692.

84 For an overview, see Matthias Richard Heierli & Jörg Rehberg, Die Bedeutung des Vertrauensprinzips im Strassenverkehr und für das Fahrlässigkeitsdelikt (The Significance of the Principle of Trust in Road Traffic and for the Crime of Negligence) (Zürich, Switzerland: Schulthess Juristische Medien, 1996); from road traffic case law: BGE 129 IV 282, 286; BGE 115 IV 239, 240; René Schaffhauser, Grundriss des schweizerischen Strassenverkehrsrechts (Outline of Swiss Road Traffic Law), Band I: Grundlagen, Verkehrszulassung und Verkehrsregeln, 2nd ed. (Bern, Switzerland: Stampfli, 2002) at N 441.

85 See “Strafrechtliche Verantwortlichkeit”, note 39 above, at 135; “Strafrecht im Arztalltag”, note 39 above, at 692; on the principle of trust in general, BGE 125 IV 83, E. 2, 87 et seq.; BGE 120 IV 300, E.3; BGE 118 IV 277, E.4.

86 A more detailed description can be found under Section III.C.1.

87 See “Strafrechtliche Verantwortlichkeit”, note 39 above, at 135; “Strafrecht im Arztalltag”, note 39 above, at 692; Hans Wiprächtiger, “‘Kriminalisierung’ der ärztlichen Tätigkeit? Die Strafbarkeit des Arztfehlers in der bundesgerichtlichen Rechtsprechung” (“Criminalization” of Medical Practice? The Criminal Liability of Medical Malpractice in Federal Court Jurisprudence) in Andreas Donatsch, Felix Blocher, & Annemarie Hubschmid Volz (eds.), Strafrecht und Medizin: Tagungsband des Instruktionskurses der Schweizerischen Kriminalistischen Gesellschaft vom 26./27. Oktober 2006 in Flims (Bern, Switzerland: Stampfli, 2007) 61 at 82; on the principle of trust in general, see BGE 125 IV 83, E. 2, 87 et seq.; BGE 120 IV 300, E.3; BGE 118 IV 277, E.4.

88 See Hanspeter Kuhn, Gian Andrea Rusca, & Simon Stettler, “Rechtsfragen der Arztpraxis” (Legal Issues of the Medical Practice) in Moritz Kuhn & Thomas Poledna (eds.), Arztrecht in der Praxis, 2nd ed. (Zürich, Switzerland: Schulthess Verlag, 2007) 265 at 287.

89 See “Strafrecht im Arztalltag”, note 39 above, at 693.

90 See also “Strafrechtliche Verantwortlichkeit”, note 39 above, at 139; “Strafrecht im Arztalltag”, note 39 above, at 694.

91 “Strafrecht im Arztalltag”, note 39 above, at 669.

92 For more on the topic, see e.g., Michael Isler, “Off Label Use von Medizinprodukten” (Off Label Use of Medical Devices) (2018) 2 LSR 79.

93 The theory of “de facto control” is used primarily to identify indirect perpetrators and accomplices; see e.g., Schweizerisches Strafrecht, note 25 above, at s. 13 N 11.

94 Olaf Dössel, “Vertrauen in die Technikwissenschaften, Vertrauen in die Medizintechnik?!” (Trust in Engineering Sciences, Trust in Medical Technology?!) (2013) Berlin-Brandenburgische Akademie der Wissenschaften 75, https://edoc.bbaw.de/files/2207/13_Debatte13_Doessel.pdf [“Vertrauen in die Technikwissenschaften”].

95 TPA, note 53 above.

96 MedDO, note 53 above.

97 See European Union, The European Parliament, & The Council of the European Union Regulation, Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on Medical Devices, Amending Directive 2001/83/EC, Regulation (EC) No 178/2002 and Regulation (EC) No 1223/2009 and Repealing Council Directives 90/385/EEC and 93/42/EEC, OJ 2017 L 117 (EU: Official Journal of the European Union, 2017).

98 The relevant standards are ISO 13485:2016 and ISO IEC 80601-2-78:2019-07.

99 “Strafrechtliche Produkthaftung”, note 66 above, at 56.

100 See Abkommen zwischen der Schweizerischen Eidgenossenschaft und der Europäischen Gemeinschaft über gegenseitige Anerkennung von Konformitätsbewertungen (Agreement between Switzerland and the European Union on mutual recognition in relation to conformity assessment, June 21, 1999), SR 0.946.526.81, www.fedlex.admin.ch/eli/cc/2002/276/de.

102 See MedDO, note 53 above, Arts. 8, 9, and 10; Swissmedic, “Aktuell,” www.swissmedic.ch/md.

103 Unlike medicinal products, medical devices are not subject to official approval. Swissmedic’s focus in the area of medical devices is therefore on efficient market surveillance: Swissmedic, “Medizinprodukte,” www.swissmedic.ch/swissmedic/de/home/medizinprodukte.html. For CE certification in Switzerland, the various conformity assessment bodies are monitored by Swissmedic.

104 “Strafrechtliche Produkthaftung”, note 66 above, at 59; see Chapter 4 in this volume.

105 “Vertrauen in die Technikwissenschaften”, note 94 above.

106 On consent to the procedure, see Philippe Weissenberger, Die Einwilligung des Verletzten bei den Delikten gegen Leib und Leben (The Consent of the Injured Person in the Case of Offenses against Life and Limb) (Bern, Switzerland: Stampfli, 1996) at 145. Concerning the obligation to monitor the product after market entry, see “Strafrechtliche Produkthaftung”, note 66 above, at 60. Concerning the responsibility of the manufacturer and the operator in the field of autonomous cars, see Sabine Gless & Ruth Janal, “Hochautomatisiertes und autonomes Autofahren – Risiko und rechtliche Verantwortung” (Highly Automated and Autonomous Driving – Risk and Legal Responsibility) (2016) 10 Juristische Rundschau 561.

107 See e.g., Cade Metz, “The Robot Surgeon Will See You Now,” The New York Times (April 30, 2021), www.nytimes.com/2021/04/30/technology/robot-surgery-surgeon.html; James Martin, Bruno Scaglioni, Joseph C. Norton et al., “Enabling the Future of Colonoscopy with Intelligent and Autonomous Magnetic Manipulation” (2020) 2:10 Nature Machine Intelligence 595.

108 See Andreas Matthias, Automaten als Träger von Rechten (Automatic Machines as Bearers of Rights), Dissertation, 2nd ed. (Berlin, Germany: Logos Verlag Berlin, 2010) at 25.

109 Susanne Beck, “Roboter und Cyborgs” (Robots and Cyborgs) in Susanne Beck (ed.), Jenseits von Mensch und Maschine (Baden-Baden, Germany: Nomos, 2012) 9.
