Last month, two events occurred that stood in stark contrast to each other. First, in early March, the U.S. House of Representatives rejected the “right-to-try” bill, which essentially would have removed the Food and Drug Administration from any decision regarding the use of unapproved medications by terminally ill patients, leaving such decisions to patients, their doctors and the drug companies.

There were impassioned speeches from both sides. Eventually, later in the month, the bill passed the House on its second try, and it now heads to the Senate. For our purposes here, however, what matters isn’t whether the bill passed, but that the safety and the rights of patients and the public at large were given real weight in the discussion.

That brings us to the other event. Later in the month, an autonomous vehicle—a car meant to navigate the roads by itself without the need for a human operator—struck and killed a pedestrian at night in Tempe, Arizona. Though there was an operator behind the wheel, it appears the car was in autonomous mode at the time of the crash. Some pundits have speculated that the vehicle’s advanced radar and other sensors should have detected the woman as she walked her bike across the wide, flat road,1 while others have said, preliminarily at least, that the experimental car’s owner, Uber, isn’t at fault for the fatal collision.2 The real issue, however, isn’t whether the accident could have been avoided; it’s why it was allowed to happen at all.

The woman who was killed didn’t realize she was taking part in a potentially dangerous product test, and neither she nor the other residents of the state signed any informed-consent documents. Had she instead undergone medical experimentation carrying a risk of death, she’d have been inundated with reams of informed-consent papers, hours of discussions and months of intense monitoring. But all an autonomous vehicle company has to do to use people as beta testers is obtain a state permit.

The FDA gets a lot of flak for its approval process, and is often accused of stifling innovation or limiting patients’ access to drugs. On the other end of the spectrum, however, is the practice of simply going out and involving the general public in your testing without their consent. No doubt there’s a middle ground between the two extremes, and eventually a very wise person will hit upon the right combination of safety and expediency that satisfies both sides. For now, however, I prefer the “flawed” FDA approach.

—Walt Bethke, Editor in Chief

1. Kerr D. Was Uber’s driverless car crash avoidable? Experts say yes. Accessed 23 March 2018.
2. Maki S, Sage A. Self-driving Uber car kills Arizona woman crossing street. Accessed 23 March 2018.