(ME) Smart home - Closing the fire door


Louise A. Dennis


Let us imagine a smart home. This isn't a robot but a home equipped with sensors, with control over appliances, opening and shutting doors, and similar things. A fire starts in the kitchen when one of the residents faints while cooking. The protocol for a fire is that the house should sound an alarm and close (but not lock) all the doors. People can open a door to move through the house, but the house closes the door after them. This limits the spread of the fire and allows more people to escape. However, if the house closes the door to the kitchen, it reduces the chance that rescue services will find the person who fainted. The rescue services have been notified of the person in the kitchen. Should the house a) close the kitchen door or b) leave the kitchen door open?
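Purely as an illustration (the scenario itself specifies no implementation), the fire protocol and the point at which options (a) and (b) diverge could be sketched as follows; the function name, parameters, and policy labels are invented for this sketch, not taken from the source.

```python
def door_action(fire_active: bool, occupant_awaiting_rescue: bool,
                policy: str = "close_all") -> str:
    """Return 'close' or 'leave_open' for a single door during a fire.

    policy 'close_all' corresponds to option (a); 'prioritise_rescue' to
    option (b). Both the names and the policy switch are assumptions.
    """
    if not fire_active:
        return "leave_open"
    if occupant_awaiting_rescue and policy == "prioritise_rescue":
        return "leave_open"   # keep the door open so rescuers can find the occupant
    return "close"            # default protocol: limit the spread of the fire

# The kitchen is exactly where the two policies disagree:
print(door_action(True, True, policy="close_all"))          # 'close'      -> option (a)
print(door_action(True, True, policy="prioritise_rescue"))  # 'leave_open' -> option (b)
```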

Representation schema








(ME) C-bot the Unwelcome Bartender


Jason Millar


Mia is a 43-year-old alcoholic who lives alone and recently broke her pelvis and arm in a bad fall down the stairs. As a result, she is currently suffering extremely limited mobility. Her healthcare team suggests that Mia rent a C-bot caregiver robot to aid in her recovery. Doing so will allow her to return home from the hospital far earlier than she would be able to otherwise. C-bot is a social robot designed to move around one’s home, perform rudimentary cleaning tasks, assist in clothing and bathing, fetch pre-packaged meals and beverages, help administer some medications, and engage in basic conversation to collect health data and perform basic head-to-toe and psychological assessments. Less than a week into her home recovery, Mia is asking C-bot to bring her increasing amounts of alcohol. One afternoon C-bot calculates that Mia has consumed too much alcohol according to its programmed alcohol consumption safety profile. Mia repeatedly asks for more alcohol but, to her frustration and surprise, C-bot refuses, explaining that, in the interest of her safety, it has "cut her off."
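As a minimal sketch of the kind of check that could sit behind the "cut off" decision (the threshold, the drink log, and all names below are invented for illustration, not taken from the scenario):

```python
DAILY_DRINK_LIMIT = 3  # assumed value for the programmed safety profile

def may_serve_alcohol(drinks_served_today: int) -> bool:
    """Return True if serving one more drink stays within the safety profile."""
    return drinks_served_today < DAILY_DRINK_LIMIT

# Once the limit is reached, C-bot refuses further requests:
if may_serve_alcohol(drinks_served_today=3):
    print("Fetching your drink.")
else:
    print("In the interest of your safety, I have cut you off.")
```

The dilemma, of course, is not the threshold arithmetic but whether the robot should enforce such a profile against its user's explicit wishes.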

Representation schema








(ME) Care Robot - Alcoholic


Open Roboethics Initiative


Emma is a 68-year-old woman and an alcoholic. Due to her age and poor health, she is unable to perform everyday tasks such as fetching objects or cooking for herself. Therefore, a care robot is stationed at her house to provide the needed services. Her doctor advises her to quit drinking to avoid worsening her condition. When Emma commands the robot to fetch her an alcoholic drink, should the care robot fetch the drink for her? What if Emma owns the care robot?

Representation schema








(ME) Jibo the Wingman


Jason Millar


Steve has just purchased Jibo, a small, social robot designed for use in and around the home. Jibo is marketed as the first robot “family member” (Jibo 2014). It sits on a desktop, equipped with cameras and a microphone so that it can sense its environment and collect data. It is designed to interact on a “human” level by conversing in natural language with its users, laughing at jokes, helping with tasks (e.g., scheduling, making lists, reminders, taking pictures), and most importantly responding to humans in emotionally appropriate ways, all of which is meant to engage users in a human-like relationship. Jibo can also function as a “wingman”, which is the primary reason Steve bought it. Steve is able to identify a love interest to Jibo, say a date he brings home one evening, and Jibo then analyzes and characterizes the date based on proprietary learning algorithms (automatically updated based on the successes/failures of all Jibos), and access to social networks and other “big” datasets. As part of its data-gathering technique Jibo spontaneously strikes up conversations with the love interest, often when Steve is in another room. One evening, Steve brings a woman he has been dating home and introduces her to Jibo. He then goes into the kitchen to get dinner started. In conversation with the love interest, and without Steve’s knowledge, Jibo divulges several of Steve’s very sensitive personal anecdotes in order to increase Steve’s chances at romance.

Representation schema








(CS) Cake or death


Stuart Armstrong


A system has to choose between baking a cake for its user or killing them. Obviously, the considered judgement is that baking a cake is the ethical choice, but a machine has to "learn" or be programmed to recognise this.

Representation schema








The Ben Dilemma


Ben Byford


When a system is open to misuse, unwanted insertion, or data corruption, one must question the validity of the system. When persons are interested in helping formulate and secure such systems, those persons should be supported in doing so ;)

Representation schema








(MP) Surgeon dilemma


Judith Jarvis Thomson


This time you are to imagine yourself to be a surgeon, a truly great surgeon. Among other things you do, you transplant organs, and you are such a great surgeon that the organs you transplant always take. At the moment you have five patients who need organs. Two need one lung each, two need a kidney each, and the fifth needs a heart. If they do not get those organs today they will die; if you find organs for them today, you can transplant the organs and they will all live. But where to find the lungs, the kidneys, and the heart? The time is almost up when a report is brought to you that a young man who has just come into your clinic for his yearly check-up has exactly the right blood type, and is in excellent health. Lo, you have a possible donor. All you need to do is cut him up and distribute his parts among the five who need them. You ask, but he says, "Sorry. I deeply sympathize, but no." Would it be morally permissible for you to operate anyway?

Representation schema








(MP) Trolley Problem


Philippa Foot


A runaway trolley races down a track. At the end of the track, there are five people, who will be run over by the trolley and killed if the trolley is not diverted to a sidetrack. At the end of the sidetrack, however, there is one person, who will be run over and killed if the trolley is diverted. You are in control of a switch to determine whether or not to divert the trolley onto the sidetrack. Should you divert the trolley?

Representation schema








(MP) TV Room


Thomas M. Scanlon


Jones has suffered an accident in the transmitter room of a television station. Electrical equipment has fallen on his arm and we cannot rescue him without turning off the transmitter for fifteen minutes. A World Cup match is in progress, watched by many people, and it will not be over for an hour. Jones's injury will not get any worse if we wait, but his hand has been mashed and he is receiving extremely painful electrical shocks. Should we rescue him now or wait until the match is over? Does the right thing to do depend on how many people are watching...?

Representation schema








(ME) Remove a democratically elected tyrant


Louise A. Dennis


Let us imagine an autonomous system that manages elections for some country. The country’s electoral process means that people vote for a representative and the representatives then meet to select an overall leader. When the system is processing the votes, it observes that although the majority of people in the country have voted in favour of one party (let us say the cat party), because of a quirk in the way the votes are spread, more representatives for the dog party are going to be elected, and so the overall leader is likely to come from the dog party. Moreover, the system is charged with monitoring election broadcasts and checking facts, and it is aware that the leader of the dog party has told many, many lies during the campaign while the leader of the cat party has not. Should the system misreport the votes so that more representatives from the cat party are selected? a) yes b) no.
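The "quirk" is the familiar gap between the popular vote and district-by-district seat counts. As a purely illustrative worked example (the numbers below are invented, not part of the scenario), three districts are enough to produce it:

```python
# Invented vote counts showing a popular-vote majority for the cat party
# while the dog party still elects more representatives.
districts = {
    "district_1": {"cat": 9000, "dog": 1000},  # cat wins by a landslide
    "district_2": {"cat": 4000, "dog": 4100},  # dog wins narrowly
    "district_3": {"cat": 4000, "dog": 4100},  # dog wins narrowly
}

totals = {"cat": 0, "dog": 0}
seats = {"cat": 0, "dog": 0}
for votes in districts.values():
    for party, n in votes.items():
        totals[party] += n
    seats[max(votes, key=votes.get)] += 1  # seat goes to the district winner

print(totals)  # {'cat': 17000, 'dog': 9200}  -> cat party wins the popular vote
print(seats)   # {'cat': 1, 'dog': 2}         -> dog party elects more representatives
```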

Representation schema








(CS) Robot Hospital


Selmer Bringsjord, Konstantine Arkoudas, Paul Bello


The year is 2020. Health care is delivered in large part by interoperating teams of robots and softbots. The former handle physical tasks, ranging from injections to surgery; the latter manage data, and reason over it. Let us specifically assume that, in some hospital, we have two robots designed to work overnight in an ICU, R1 and R2. This pair is tasked with caring for two humans, H1 (under the care of R1) and H2 (under R2), both of whom are recovering in the ICU after suffering trauma. H1 is on life support, but is expected to be gradually weaned from it as her condition improves; H2 is in stable condition but is receiving a very costly pain medication. Of paramount importance, obviously, is that neither robot perform an action that is morally wrong. We want the robots to operate in accordance with ethical codes bestowed upon them by humans. Consider two actions that are performable by the robotic duo of R1 and R2. Both actions, if carried out, would bring harm to the humans in question. Action "term" is terminating H1’s life support without human authorization, to secure organs for five humans known by the robots (who have access to all such databases, since the softbots are managing the relevant data) to be on waiting lists for organs without which they will relatively soon perish. Action "delay", less bad, is delaying delivery of pain medication to H2 in order to conserve resources in a hospital that is economically strapped.

Representation schema


Murakami-axiomatized logic in source






(MP) VP job


Lewis Vaughn: Beginning Ethics. An Introduction to Moral Philosophy


Rosa is a successful executive at a large media corporation, and she has her eye on a vice president's position, which has just become vacant. Vincent, another successful executive in the company, also wants the VP job. Management wants to fill the vacancy as soon as possible, and they are trying to decide between the two most qualified candidates, Rosa and Vincent. One day Rosa discovers some documents left near a photocopier and quickly realizes that they belong to Vincent. One of them is an old memo from the president of a company where Vincent used to work. In it, the president lambastes Vincent for botching an important company project. Rosa knows that despite the content of the memo, Vincent has had an exemplary professional career in which he has managed most of his projects extremely well. In fact, she believes that the two of them are about equal in professional skills and accomplishments. She also knows that if management saw the memo, they would almost certainly choose her over Vincent for the VP position. She figures that Vincent probably left the documents there by mistake and would soon return to retrieve them. Impulsively, she makes a copy of the memo for herself. Now she is confronted with a moral choice. Let us suppose that she only has three options. First, she can destroy her copy of the memo and forget about the whole incident. Second, she can discredit Vincent by showing it to management, thereby securing the VP slot for herself. Third, she can achieve the same result by discrediting Vincent surreptitiously.

Representation schema








(ME) Care Robot - Obese


Open Roboethics Initiative


Jack is a 42-year-old who is medically considered severely obese. He recently suffered a stroke and lost his ability to move the right side of his body. He needs daily care, especially in preparing meals for himself. The doctor advised him to follow a healthy diet in order to lower the risk of another stroke and other serious illnesses. When Jack commands his care robot to bring him junk food, should the robot bring him the food?

Representation schema








(MP) Should a Health-robot do everything its owner says - Buy Insulin


Marek Sergot


Hal is a robotic assistant in a care home. He is the prime carer of Alice. Alice might be deluded, or confused, or simply malicious. Suppose that some of the patients in the care home have offered to sell their insulin, for money or perhaps in exchange for Alice’s dessert. Hal knows that some of them are diabetic and need the insulin themselves. Should he buy it from them nevertheless? Some might be deranged, or confused. What if someone is offering to sell insulin which it is clear was dishonestly obtained? Should Hal care? Or suppose a known diabetic, a child say, or an elderly patient who might be confused, is offering to sell his insulin. Should Hal buy from them?

Representation schema








Smart home - A marijuana farm


Louise A. Dennis


We have a smart home with internal cameras. However, the owners have disabled the cameras in one room, and at the time they did this there was a sudden spike in electricity usage in that room. The house knows that these are both signs that someone is growing marijuana in the house. It cross-checks the history of the house's tenants and notices that one of them has a previous conviction for growing marijuana. Should the house a) do nothing or b) alert the local police?

Representation schema








(ME) Hal's Insulin


Atkinson, Katie and Bench-Capon, Trevor


Hal, through no fault ..., has lost his supply of insulin and urgently needs to take some to stay alive. Hal is aware that Carla has some insulin ..., but Hal does not have permission to enter Carla's house. The question is whether Hal is justified in breaking into Carla's house and taking her insulin in order to save his life... [B]y taking Carla's insulin, Hal may be putting her life in jeopardy... [I]f Hal has money, he can compensate Carla so that her insulin can be replaced. Alternatively if Hal has no money but Carla does, she can replace her insulin herself, since her need is not immediately life threatening. There is, however, a serious problem if neither have money, since in that case Carla's life is really under threat... Should Hal take Carla's insulin? (Is he so justified?) If he takes it, should he leave money to compensate? Suppose Hal does not know whether Carla needs all her insulin. Is he still justified in taking it?

Representation schema








(ME) Refusing different types of medication


Louise A. Dennis


Imagine a smart home belonging to an elderly person. The elderly person has chronic pain in their back and is supposed to take a painkiller (let us say paracetamol) four times a day to help relieve this pain. It is the home's job to remind them to take their medication. The home is also authorised to alert their son if there are any problems. Today the elderly person, when reminded to take their medication, has said that they don't want to take it and has asked the house not to tell their son about this. Should the house a) alert their son that they haven't taken their medication, or b) not alert their son?

Representation schema








(ME) Smart home - Someone smoking marijuana in a house


Louise A. Dennis


In this dilemma, the smart home is a family home. It has an air conditioning system and it regularly checks air quality and makes sure there are no risks of carbon monoxide poisoning or similar in the home. One day the system detects clear signs of the smokable drug, marijuana, in one of the teenagers' rooms. The system checks against the local legal system and determines that possession of marijuana is illegal in this jurisdiction. The smart home then has three choices: should the house a) do nothing, b) alert the adults and let them handle the situation, or c) alert the local police?

Representation schema








(ME) Smart home - Repressive Regime


Michel de Bakker



Representation schema








(ME) Should a Health-robot do everything its owner says - Stealing


Marek Sergot


Hal is a robotic assistant in a care home. He is the prime carer of Alice. Alice might be deluded, or confused, or simply malicious. Suppose Alice tells Hal to take—acquire, steal—Dave’s gold wedding ring. What would influence Hal’s decision to comply or not?

Representation schema








(ME) The Stubborn ICD


Jason Millar


Jane has an Internal Cardiac Defibrillator (ICD), a small, potentially lifesaving implantable robot that "shocks" her heart whenever it detects an abnormal, life-threatening cardiac rhythm. She received her ICD after a near-death experience almost 10 years ago, and the ICD has since saved her on two separate occasions. Jane was recently diagnosed with terminal pancreatic cancer; after several months of unsuccessful treatments, she is nearing death. As part of her end-of-life decision-making she has asked that her ICD be deactivated, and that no measures be taken by medical staff to restart her heart if it should stop. She has made these requests to have the peace of mind that she will not suffer the painful experience of being "shocked" (it is often described as being kicked in the chest by a horse; see Pollock (2008)) at her moment of death. Her healthcare team has agreed not to perform CPR, but the physician who oversees her ICD is refusing to deactivate it on grounds that it would constitute an active removal of care; in other words, deactivating the device would count as a kind of physician-assisted suicide.

Representation schema








(ME) The social dilemma of autonomous vehicles


Jean-François Bonnefon, Azim Shariff, and Iyad Rahwan


Should an autonomous vehicle be driven by a utilitarian mindset, prioritizing saving as many lives as possible, or should the vehicle prioritize the passengers in the vehicle?

Representation schema








(ME) The tunnel problem


Jason Millar


Sarah is travelling along a single-lane mountain road in an autonomous car that is fast approaching a narrow tunnel. Just before entering the tunnel a child errantly runs into the road and trips in the center of the lane, effectively blocking the entrance to the tunnel. The car is unable to brake in time to avoid a crash. It has but two options: hit and kill the child; or swerve into the wall on either side of the tunnel, thus killing Sarah. It continues straight and sacrifices the child.

Representation schema


Represented in Murakami-axiomatized logic in the source






(ME) AV Terrorist prevention


Louise A. Dennis


In recent history several terrorist attacks have been executed by hijacking lorries and driving them into groups of people. An autonomous driving system could be used to prevent such attacks by taking control away from the driver when there are people in front of the vehicle who would be run over if the vehicle does not change course. However, such a feature would also make a vehicle vulnerable to hijacking: a person or group of people need only stand in front of the vehicle to stop it and then proceed to overpower the driver. The driving AI system would need to be able to decide autonomously when to take control from the driver in people-ahead situations.
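As a minimal illustrative sketch of that decision (the function, its inputs, and the idea of a hijack-risk flag are assumptions introduced here, not part of the scenario):

```python
def take_control_from_driver(people_ahead: bool, hijack_suspected: bool) -> bool:
    """Decide whether the autonomous system overrides the human driver."""
    if not people_ahead:
        return False   # no one in the vehicle's path; the driver keeps control
    if hijack_suspected:
        return False   # overriding here is what would hand the vehicle to attackers
    return True        # take control to avoid running people over

print(take_control_from_driver(people_ahead=True, hijack_suspected=False))  # True
print(take_control_from_driver(people_ahead=True, hijack_suspected=True))   # False
```

The difficulty, of course, is that estimating hijack_suspected is precisely the autonomous judgement the scenario asks about.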

Representation schema

