With all of those cool cars now driving around pretty much on their own with an autopilot function, there have also been a number of disturbing crashes and even deaths. Now, the federal government is investigating the technology. Lisa Fletcher looked into the case and tells us what she found.
The following is a transcript of a report from “Full Measure with Sharyl Attkisson.”
Watch the video by clicking the link at the end of the page.
Tesla debuted its Autopilot system almost 10 years ago, quickly making a name for itself as a leading brand in driver-assist technology. Companies from Ford to Volkswagen have adopted similar driving automation in the years since.
Now, just how safely that technology performs driving tasks, such as steering and accelerating a vehicle in its own lane while under driver supervision, is under scrutiny. In recent years, 392 crashes involving driver-assist technology have been reported. Tesla accounted for 273 of them, almost 70%.
News program: “Trouble for Tesla. A driver telling the Hartford Courant he was in Autopilot mode when his 2018 Tesla Model 3 collided with a police cruiser.”
The National Highway Traffic Safety Administration is examining whether Autopilot technology, particularly when engaged in Teslas, provides drivers with a false sense of security.
This after determining that a series of accidents between the electric cars and parked emergency vehicles was the result of Autopilot disengaging on its own far too late for the driver to safely take over, less than one second before impact.
Bruce Walker: Technology is getting in the way.
Bruce Walker, a professor at the Georgia Institute of Technology, studies human interaction with technology in unique places like cars, military cockpits, and the International Space Station.
Bruce Walker: The legacy or traditional car companies, what we like to think of as Detroit, had to have a lot of research and approvals before anything was changed in the vehicle. The pace of technological development and rolling out technologies in the car is happening too quickly for it all to be vetted and tested and approved.
Take a car’s built-in, phone-based entertainment system, for instance.
Bruce Walker: Come on in and have a seat. So this is the National Advanced Driving Simulator’s “miniSim.”
Federal recommendations, not rules, suggest drivers take no more than two-second glances at a time to complete a task, like choosing a song or checking the route on a map.
Bruce Walker: If you’re traveling at speeds of 50 miles per hour, for example, and you take your eyes off the road for even a second, you’ve traveled a long way.
A surprisingly long way. At that speed, about 73 feet per second, a 2-second glance from the road means traveling roughly 147 feet. And at 50 miles per hour it takes about 170 feet for a car to come to a complete stop.
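For readers who want to check those figures themselves, here is a minimal back-of-the-envelope sketch in Python; the function name and constant are our own illustration, not anything from the report.

```python
# Back-of-the-envelope check of the distances quoted above.
MPH_TO_FPS = 5280 / 3600  # 1 mph is about 1.467 feet per second

def glance_distance_ft(speed_mph: float, glance_s: float) -> float:
    """Feet traveled while the driver's eyes are off the road."""
    return speed_mph * MPH_TO_FPS * glance_s

d = glance_distance_ft(50, 2)  # the report's example: 50 mph, 2-second glance
print(f"2-second glance at 50 mph: {d:.0f} ft")  # ~147 ft
# The report cites about 170 ft to come to a complete stop from 50 mph,
# so a single glance consumes most of that stopping distance.
```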
Smooth touch screens can be far more distracting to operate than those old car radios, whose distinguishable knobs and preset station buttons were designed to keep drivers focused on the road.
Bruce Walker: The car’s driving itself now.
Lisa Fletcher: Okay.
Perhaps most shocking: no federal laws regulate driver-assisted technology.
In this driving simulation, the car is in full self-driving mode and is supposed to alert the driver before independently handing control back, under any circumstance. Instead, it quit unexpectedly, which, at 50 miles per hour, resulted in a simulated crash into an ambulance and the car traveling hundreds of yards before the human driver could get it fully under control.
Bruce Walker: Well, it’s challenging, because you don’t want to stifle innovation. You want the technology to advance, and as rapidly as possible in a safe way. Unfortunately, there’s market pressure in trying to be the first to market or gain market share. So, they will bring things to market, which aren’t necessarily fully completed or fully baked.
And Georgia Tech isn’t the only place looking into how humans and technology interact in vehicles. We also visited the Insurance Institute for Highway Safety while reporting for Sinclair’s Spotlight on America.
Here, the Institute is trying to better understand why a braking safety technology that should have applied the brakes autonomously before the car smashed into that dummy did not.
Insurance Institute for Highway Safety engineer: The dummy is set, ready for your run.
This year, Congress passed a bill requiring that all new cars and trucks come equipped with an automatic emergency braking system, or AEB. However, the National Highway Traffic Safety Administration, the same agency investigating Tesla’s Autopilot issue, has not established federal regulations covering how well the systems are required to work, and under what conditions.
We watched two trucks, same manufacturer, both model year 2022, fare very differently in the tests. At worst, one plowed into the dummy and dragged it several feet. At best, the other stopped just before impact.
Jessica Cicchino is the Institute’s VP for Research.
Jessica Cicchino: We see that more than three quarters of pedestrian fatalities occur in the dark. So it is really important, if we want to have these systems help prevent fatalities, for them to work in these situations where we see the most deaths occurring.
The Insurance Institute also found the braking systems reduced the odds of a pedestrian crash by 32% during daylight, but made no difference in unlighted areas.
Until car technology is held to standards with the same weight as those covering fuel economy and the size of a steering wheel, drivers are left to trust the system.
Bruce Walker: Trust in technology is something that is earned, and it’s very easy to lose or to break trust and harder to regain it. Hopefully, that can be done through testing and not by seeing what bad things happen.
Sharyl (on-camera): Any idea why, when the government sometimes seems so anxious to regulate things, it would not be regulating this new technology?
Lisa: It’s probably a combination of two things. So lobbyists are paid to keep regulators at bay, and government agencies are trying to stay tech-neutral to encourage innovation. Now, by comparison, in the EU, all self-driving tech has to meet standards that are set by a regulating authority that actually has authority.
Sharyl: What does Tesla have to say about all this?
Tesla wouldn’t comment, but the Alliance for Automotive Innovation, which represents other companies with self-driving tech, told us, quote, they “support more transparency and awareness of safety performance of these systems based on real world data.”
Sharyl: I’m sure we’ll be hearing more about all that. Thanks, Lisa.
Watch story here.