Ever since companies like Google and Tesla first started pitching, and experimenting with, driverless cars, we've been in uncharted territory. For some, these automated autos are a wonder and a source of tremendous opportunity. As the market grows and these vehicles enter widespread use, hours a week (for some commuters, hours a day) can be devoted to more productive tasks, or to leisure. Developers will potentially have a whole new market for apps that can be easily used while in a car.
Even senior citizens have expressed interest in self-driving vehicles as an alternative to giving up driving altogether as their reflexes and vision fade. Indeed, no less than the AARP has called driverless vehicles a "Godsend for older Americans."
Yet while there are clear benefits, the technology has always faced roadblocks. In an April Morning Consult poll, just 26% of respondents expressed interest in buying a self-driving car. There's also skepticism in the trucking industry, where self-driving vehicles are making the most headway. While self-driving trucks could fill some of the nationwide shortage of long-haul drivers, and allow existing drivers to rest more frequently by picking up some of the slack on highway miles, they could also take jobs from truckers.
Ultimately, though, safety is the biggest source of skepticism about self-driving vehicles. Are they safe for their occupants and for others on the road? Last week, the National Highway Traffic Safety Administration opened an investigation to determine the cause of the first autonomous vehicle fatality. On May 7th, in Williston, Florida, Joshua D. Brown died in a crash while using the beta Autopilot feature on a Tesla Model S. In a statement, Tesla said:
"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."
Tesla was at pains to stress the rarity of the tragedy. This was the only fatality in just over 130 million miles of automated driving, while ordinary driving results in a fatality every 94 million miles. And because Autopilot is still in beta, it comes disabled by default; when it is activated, drivers are instructed to keep their hands on the wheel and be prepared to take over in an emergency. The car even slows down and eventually stops if the driver's hands are off the wheel for too long.
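Tesla's defense boils down to simple arithmetic: comparing fatalities per mile driven. A quick sketch of that comparison, using the figures cited above (the variable and function names here are illustrative, not Tesla's):

```python
# Figures cited in Tesla's statement: one fatality in just over
# 130 million Autopilot miles, versus one fatality per 94 million
# miles for ordinary US driving.
AUTOPILOT_MILES_PER_FATALITY = 130e6
US_AVERAGE_MILES_PER_FATALITY = 94e6

def fatalities_per_100m_miles(miles_per_fatality):
    """Convert a miles-per-fatality figure into fatalities per 100 million miles."""
    return 100e6 / miles_per_fatality

autopilot_rate = fatalities_per_100m_miles(AUTOPILOT_MILES_PER_FATALITY)
average_rate = fatalities_per_100m_miles(US_AVERAGE_MILES_PER_FATALITY)

print(f"Autopilot:  {autopilot_rate:.2f} fatalities per 100M miles")
print(f"US average: {average_rate:.2f} fatalities per 100M miles")
```

By this math, Autopilot's rate works out to roughly 0.77 fatalities per 100 million miles against about 1.06 for ordinary driving, though statisticians would caution that a single fatality is a very small sample from which to draw conclusions.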
In short, Tesla's defenders would point to this as an unavoidable tragedy. And maybe it was. It's important to note that this same driver was saved by the same Tesla model in April when it avoided a truck that was merging unsafely. But the accident raises troubling questions about the industry's ability to gain consumer confidence and navigate government regulations.
One company with a huge stake in this market is Google, which has done some interesting work to address these concerns. The June edition of its self-driving car report emphasizes the strides its software is making in distinguishing bicyclists from other road users. It recognizes cyclists as unique users and adjusts to accommodate them. They're not only given more room, but the software can detect events like a car door opening on the side of the road and will adjust to allow cyclists to occupy the center of the lane. Most impressively, it can even recognize, and store in memory, hand signals for turning, allowing cars to predict a rider's likely route.
And because of the risk of odd, one-off situations like the one that killed Joshua Brown, Google has an entire division devoted to what it calls "edge cases": infrequent hazards that require a nimble and appropriate response. These edge-case teams vary their tests extensively, making slight changes in order to anticipate a variety of possibilities on the road. For instance, they might conduct the same test of avoiding a car veering into the wrong lane at different times of the day (and night), with different colors and sizes of vehicles, and so on.
So while there are still kinks to be worked out, there are also talented and devoted problem-solvers doing the unkinking. If you dream of lounging in the back, playing on your smartphone as your car drives you cross-country, you may have a while to wait. But that doesn't mean the self-driving car has stalled; it's just slowing down.