Google's $75K Self-Driving Car: Too Few Urban Miles for Reliable Safety Statistics; Is It Moral?
Technology has introduced many marvelous innovations, and self-driving cars are among the most impressive to date. These vehicles carry a human driver who can take over in an emergency or whenever manual driving is desirable, but they otherwise guide themselves using built-in sensors and robotic controls. Google has famously been testing them on U.S. roads since 2010. As the technology moves forward, however, one question lingers: how safe are they?
The answer depends on whom you ask. Google's position is that the technology is "safer than human drivers," according to a study of its driverless-car testing released last December. The study found that many indicators of safe driving, such as following distance, are better for the automated cars than for the same cars when driven by humans. That said, it is not clear how far the data generalizes. Driverless cars simply have not logged enough miles to yield reliable safety statistics, especially for the complexities of urban driving; most of Google's tests to date have been on highways. Noah Goodall, a scientist at the University of Virginia, raised this objection in a recent study, calculating that at least four times the data collected so far would be needed to demonstrate convincingly that these vehicles are truly safer.
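Goodall's point is ultimately statistical: rare events demand enormous sample sizes. As a rough illustration only (not Goodall's actual method, and using an assumed human-driver baseline of roughly 1.1 fatalities per 100 million U.S. vehicle miles), the classic "rule of three" for rare, independent events shows why test-fleet mileage runs out so quickly:

```python
import math

# Illustrative assumption: human-driver fatality rate of about
# 1.1 fatalities per 100 million vehicle miles (rough U.S. figure).
HUMAN_FATALITY_RATE = 1.1 / 100_000_000  # fatalities per mile

def miles_needed(confidence=0.95, baseline=HUMAN_FATALITY_RATE):
    """Miles a fleet must drive *fatality-free* before we can claim,
    at the given confidence, that its rate is below the human baseline.

    If crashes are rare, independent events (a Poisson process), the
    chance of seeing zero in n miles at rate r is exp(-r * n).
    Solving exp(-r * n) <= 1 - confidence for n gives the answer.
    """
    return -math.log(1 - confidence) / baseline

print(f"{miles_needed():,.0f} fatality-free miles needed")
# -> roughly 270 million miles, orders of magnitude beyond what
#    any test fleet had logged at the time of the study.
```

The calculation is a sketch under stated assumptions, but it captures the shape of the problem: even a flawless driving record over a few hundred thousand miles says little about rates measured per hundred million miles.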
Goodall is not only concerned with measuring accident rates; he is also interested in how to teach these robotic cars to act in ways their drivers would find morally acceptable. He pointed out that a person driving a car might react to a dangerous situation differently depending on the circumstances. For example, imagine a school bus veering suddenly into oncoming traffic. The approaching car's driver might swerve in a way that is more dangerous to herself and less dangerous to the bus's occupants if she can see dozens of schoolchildren aboard rather than a lone driver. This, he said, poses a problem just as important as simply avoiding accidents. Encoding the full range of human morals into software is not feasible, said Goodall, but approximating it is vital for ensuring that accidents, when they happen, are handled the way the driver would want them handled.
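To make the school-bus example concrete, here is a purely hypothetical toy sketch (not anything Google or Goodall has published) of how such a trade-off might be approximated in software as weighted harm minimization. The maneuver names, risk numbers, and the self_weight parameter are all invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    risk_to_occupant: float  # estimated chance of serious harm, 0..1
    risk_to_others: float    # same, per person outside the vehicle
    people_outside: int      # how many people that outside risk applies to

def expected_harm(m: Maneuver, self_weight: float = 1.0) -> float:
    """Toy score: weighted expected number of people seriously hurt.

    self_weight is a hypothetical policy knob: how much the car values
    its own occupant relative to each person outside. This is exactly
    the kind of value judgment that gets frozen into software.
    """
    return self_weight * m.risk_to_occupant + m.people_outside * m.risk_to_others

# The school-bus scenario: swerving is riskier for the driver but far
# safer for the dozens of children on the bus.
options = [
    Maneuver("brake in lane",  risk_to_occupant=0.1, risk_to_others=0.05,  people_outside=30),
    Maneuver("swerve off road", risk_to_occupant=0.4, risk_to_others=0.001, people_outside=30),
]

print(min(options, key=expected_harm).name)
# -> "swerve off road" (expected harm 0.43 vs 1.6 for braking)
```

Change the passenger count to one, or raise self_weight, and the "right" answer flips, which is Goodall's point: someone has to choose those weights in advance, on the driver's behalf.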
Goodall explained that a human driver processes a huge amount of data in milliseconds during an accident. A self-driving car does the same, but far faster, because its decisions were effectively made months or years earlier, when its software was written. In other words, a robotic vehicle does not think; it only processes data according to rules it already has.
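A minimal sketch of what "decided in advance" means in practice, assuming a simple rule-table architecture (this is not Google's actual control code; the sensor fields and actions are invented for illustration):

```python
# The "decision" at crash time is just a lookup over rules that
# engineers committed to long before the car ever saw the road.
REACTION_RULES = [
    # (predicate over a sensor snapshot, action) -- checked in order
    (lambda s: s["obstacle_distance_m"] < 5 and s["left_lane_clear"], "swerve_left"),
    (lambda s: s["obstacle_distance_m"] < 5, "emergency_brake"),
    (lambda s: s["obstacle_distance_m"] < 30, "slow_down"),
]

def react(sensors: dict) -> str:
    """Return the pre-programmed action for the current sensor snapshot.
    No deliberation happens here: the car only matches data to a rule."""
    for predicate, action in REACTION_RULES:
        if predicate(sensors):
            return action
    return "maintain_speed"

print(react({"obstacle_distance_m": 4, "left_lane_clear": False}))
# -> "emergency_brake"
```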
No self-driving car is expected to reach the market any time soon, but when Google's does, it is expected to cost $75,000.