The #1 ethical issue in autonomous vehicles is not the infamous Trolley Problem. It is the question of who gets to decide when it is OK to deploy a vehicle without a safety driver on public roads.
Consider a thought experiment which, if you follow AV industry news, you might recognize as not entirely hypothetical. You, the reader, are in charge of a company that needs to do a public road demonstration with no driver in an AV. You know that safety is not where you would like it to be. In fact, you have no safety case at all. You might not even have any real safety engineers on staff. But you have a smart, super-capable team. You have done a lot of test driving and it is going pretty well.
You intuitively figure it is more likely than not that you can pull off a one-time demo without a crash, and that a fatal crash is even less likely still. You figure you have something like 5 chances out of 6 of pulling off the demo with nobody getting hurt, and because it is low-speed urban driving, you are in your own mind nearly certain that any crash that does happen would not be fatal. For good measure, maybe you plan to station employees near the demo site to shoo away any pedestrians and light mobility users who would be at increased risk of harm, and to do the demo very late at night, when roads are usually empty of other road users. Regulators are not in a position to influence your decision.
Your investors have told you they will pull the plug on your entire company if you do not demo by December 31st. Right now, it is the first week of December, and it is time to decide what to do. If the investors pull the plug at the end of the month, you lose perhaps $1B in personal equity you hope to net in next year’s public offering. All your employees will lose their equity as well as their jobs. And it will end the journey you have spent your life on: building and deploying a truly self-driving car.
Further negotiations with the investors are not possible. It is time to decide. That leaves you three main options:
• Case 1: The AV company does not do the demo because it cannot assure a PRB (positive risk balance) level of safety. The company runs out of money and folds. This option kills the company.
• Case 2: The AV company does the demo and harms a road user. This might or might not result in termination of funding, depending on the optics of the crash (perhaps a pedestrian victim can be blamed for jaywalking, being impaired, or having low societal status; maybe all three). You think minor harm is more likely than a fatality, and you will have lots of money available to pay off a potential victim to keep quiet. You will not pre-announce the demo, so you feel able to control the narrative if something goes wrong. The company and the mission go on unless there is a truly unlucky break during that one demo/test session that cannot be cleaned up. Even Uber ATG kept going for a while after a really bad crash, and you know in your heart that your team is better.
• Case 3: The AV company does the demo and gets lucky, not harming any other road users. The company meets its milestone and gets more funding. This is the most likely case, and it would be a perfect victory.
Given this setup, doing the demo is clearly the best financial bet for the company. Probably you will get lucky with a positive outcome: no harm will be done, and the demo can be said to be safe under the culturally dominant no harm/no foul principle. But if the demo is skipped over safety concerns, the company is sure to die, and the decision maker is out a billion dollars.
Even if you get unlucky, the cost of a few-million-dollar settlement pales in comparison with the billions of dollars on the table. Really, you might think, a payout is just the cost of doing business. And even if the crash optics spiral out of control and the startup company folds, the investors have hedged their bets and the team can simply move to another company and try again. Pretty much everyone will do fine. Except for the victim, if there is one.
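To make the incentive concrete, here is a rough back-of-the-envelope sketch of the expected-value arithmetic. The numbers are the scenario's illustrative assumptions (a 5-in-6 chance of a clean demo, roughly $1B of equity at stake, a few-million-dollar settlement, and a guessed-at chance that a fatal crash actually ends the company); they are not real data, and different guesses do not change the shape of the result.

```python
# Back-of-envelope expected value of the demo decision.
# All figures below are the scenario's illustrative assumptions, not real data.

P_NO_CRASH = 5 / 6                    # "5 chances out of 6" of a clean demo
P_CRASH = 1 - P_NO_CRASH              # chance of some crash during the demo
P_FATAL_GIVEN_CRASH = 0.05            # assumed: "near certainty" a crash is non-fatal

EQUITY_AT_STAKE = 1_000_000_000       # decision maker's hoped-for IPO payout
SETTLEMENT_COST = 5_000_000           # assumed "few million dollar" settlement
P_FOLD_AFTER_FATALITY = 0.5           # assumed: bad optics may still end the company

# Option A: skip the demo -> investors pull the plug, equity is lost.
value_skip = 0.0

# Option B: do the demo, averaged over the three cases.
value_clean = EQUITY_AT_STAKE
value_minor_crash = EQUITY_AT_STAKE - SETTLEMENT_COST
value_fatal_crash = (1 - P_FOLD_AFTER_FATALITY) * (EQUITY_AT_STAKE - SETTLEMENT_COST)

value_demo = (
    P_NO_CRASH * value_clean
    + P_CRASH * (1 - P_FATAL_GIVEN_CRASH) * value_minor_crash
    + P_CRASH * P_FATAL_GIVEN_CRASH * value_fatal_crash
)

print(f"Expected value of skipping the demo: ${value_skip:,.0f}")
print(f"Expected value of doing the demo:    ${value_demo:,.0f}")
# The demo dominates financially under essentially any plausible assumptions,
# because the downside of a crash is borne mostly by someone other than the
# decision maker.
```

The point of the sketch is not the specific numbers; it is that the decision maker's expected payoff from doing the demo swamps the payoff from skipping it, because the cost of harm falls largely on someone else.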
This is how demo milestones incentivize deploying systems when the calendar and the funding flow say it is time for a demo, rather than when the demo is known to be acceptably safe. After all, it is someone else who is injured or dies, not the decision maker. And a billion dollars is a ton of money. And probably it will be fine. After all, you think, only other companies kill pedestrians.
This is an adapted excerpt (Section 10.3.1) from my book: How Safe is Safe Enough? Measuring and Predicting Autonomous Vehicle Safety