Chevy Bolt EV Forum

1 - 20 of 41 Posts

Registered · 770 Posts · Discussion Starter #1
Most recall the tragedy of a Tesla driver dying in an accident with Autopilot engaged last year. The NTSB final report, the way I read it, effectively puts most of the blame on the way Tesla markets Autopilot:
On Tuesday, the National Transportation Safety Board announced that “operational limitations” in Tesla’s Autopilot system played a “major role” in a fatal collision in May 2016 between a Tesla Model S and a semitrailer truck on a Florida highway. The regulator initially concluded that Tesla’s self-driving software was not responsible for the accident. But the NTSB said on Tuesday that the driver had relied too much on the Autopilot system, touching the steering wheel for an estimated 25 seconds during a 37-minute period. The vehicle’s software failed to respond to a semitrailer that crossed an intersection and moved in front of the Model S. The NTSB’s reversal marks a significant setback for Tesla, which has championed the development of autonomous vehicles based on claims that such technology could ultimately lead to a decline in car accidents.​
 

Registered · 107 Posts
There are certain situations that have to be extremely difficult to automate, and cross traffic is probably one of them. Let's face it, even we humans make mistakes in those situations, failing to see someone crossing in front of us when we didn't expect it. Computers will not be any less fallible, especially if the intersection is partially obscured and you can't see far enough to the right or left to react in time to a fast crossing vehicle. The camera has to see the vehicle, determine its speed and direction, conclude that it is on a collision course, and react, all in the blink of an eye. On a ship, we take bearings on other ships we think are on a collision course; if the bearing doesn't change, we know we are on a collision course and must take action. Hopefully this will result in improvements to their software so it can better recognize this type of dangerous situation, like someone running a red light in front of us.
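For the curious, that bearing rule translates almost directly into code. A minimal sketch, assuming flat 2-D coordinates and an illustrative 1° tolerance; none of this comes from any actual autopilot:

```python
import math

def bearing_deg(own, target):
    """Absolute bearing from own ship to target, in degrees clockwise from north."""
    dx, dy = target[0] - own[0], target[1] - own[1]   # east, north offsets
    return math.degrees(math.atan2(dx, dy)) % 360

def on_collision_course(own_t0, tgt_t0, own_t1, tgt_t1, tol_deg=1.0):
    """CBDR rule: constant bearing plus decreasing range means the tracks meet."""
    db = (bearing_deg(own_t1, tgt_t1) - bearing_deg(own_t0, tgt_t0) + 180) % 360 - 180
    closing = math.dist(own_t1, tgt_t1) < math.dist(own_t0, tgt_t0)
    return abs(db) < tol_deg and closing

# Own ship heading north, target crossing from the east:
# the bearing holds near 45 degrees while the range closes -> collision course.
print(on_collision_course((0, 0), (10, 10), (0, 1), (9, 10)))   # True
```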
 

Registered · 3,791 Posts
So the fact that a truck pulled out into oncoming traffic wasn't listed among the causes of the accident?

Computer will not be any less fallible... Hopefully, this will result in improvements to their software to be able to better recognize this type of dangerous situation, like someone running the red light in front of us.
A computer will be less fallible in the future. It may already be at that point. Computers don't get distracted, they can begin responding in fractions of a second, and they never grow weary. Their only limitations are their sensors and their algorithms. Already they have a larger field of view than humans.

Tesla has updated the software following that incident so that it should not occur again.
 

Registered · 4,799 Posts
So the fact that a truck pulled out into oncoming traffic wasn't listed among the causes of the accident?
There were apparently 10 seconds between the time the truck became an obstacle and the collision. So while the truck caused the accident, it would have been avoidable if the driver had been paying attention.
 

Registered · 770 Posts · Discussion Starter #5
There were apparently 10 seconds between the time the truck became an obstacle and the collision. So while the truck caused the accident, it would have been avoidable if the driver had been paying attention.
The language used in the report was somewhat vague regarding this most important piece of evidence. At first thought, it could mean there was a stationary 60' tractor trailer straddling the highway, and at 74 MPH, 10 seconds would mean the Tesla Autopilot had about a fifth of a mile (roughly 1,100 feet) to react. However, both vehicles were in motion, so the Tesla driver still had close to a fifth of a mile to take action.
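A quick back-of-the-envelope check of those figures (the 0.8 g braking assumption below is mine, not the report's):

```python
MPH_TO_FPS = 5280 / 3600              # 1 mph = ~1.467 ft/s

speed_fps = 74 * MPH_TO_FPS           # ~108.5 ft/s
gap_ft = speed_fps * 10               # ground covered in the ~10 s window
print(f"{gap_ft:.0f} ft = {gap_ft / 5280:.2f} mi")   # ~1085 ft, ~0.21 mi

# Assumed hard braking at 0.8 g: stopping distance = v^2 / (2a)
a = 0.8 * 32.2                        # deceleration in ft/s^2
print(f"stopping distance ~ {speed_fps**2 / (2 * a):.0f} ft")   # ~228 ft
```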

This visualization may help:

https://www.youtube.com/watch?v=uIV6sGHZo1U
Autopilot is a feature that is ostensibly marketed as what its name implies: Auto-Pilot. The driver need not concern themselves with the drudgery of actually driving, as the autonomous [by definition: acting independently] system drives the vehicle. In this event, NTSB's report indicates that the driver cavalierly ignored continuous warnings from his Tesla to keep his hands on the wheel. If the Autopilot caveat (stipulations, conditions, or limitations) is that the driver must keep their hands on the wheel, feet near the pedals, and eyes attuned to the road, then it is not autonomous at all. That's glorified cruise control, or "assisted-pilot" at best. But "assisted-pilot" just doesn't have the cutting-edge techno-dynamism that "Autopilot" does with respect to marketing.
 

Registered · 3,791 Posts
The language used in the report was somewhat vague regarding this most important piece of evidence. At first thought, it could mean there was a stationary 60' tractor trailer straddling the highway, and at 74 MPH, 10 seconds would mean the Tesla Autopilot had about a fifth of a mile (roughly 1,100 feet) to react. However, both vehicles were in motion, so the Tesla driver still had close to a fifth of a mile to take action.

This visualization may help:

https://www.youtube.com/watch?v=uIV6sGHZo1U
The visualization shows the tractor trailer moving perpendicular to the Tesla, which neither lengthens nor shortens the available stopping distance.

Autopilot is a feature that is ostensibly marketed as what its name implies: Auto-Pilot. The driver need not concern themselves with the drudgery of actually driving, as the autonomous [by definition: acting independently] system drives the vehicle. In this event, NTSB's report indicates that the driver cavalierly ignored continuous warnings from his Tesla to keep his hands on the wheel. If the Autopilot caveat (stipulations, conditions, or limitations) is that the driver must keep their hands on the wheel, feet near the pedals, and eyes attuned to the road, then it is not autonomous at all. That's glorified cruise control, or "assisted-pilot" at best.
It doesn't sound misleading to me. The term comes from the aviation industry, where pilots are still expected to fly the plane and retain responsibility for its safe operation regardless of the shortcomings of the automated systems. Sometimes pilots have crashed planes due to over-reliance on automation, or by ignoring warnings, and the crash is blamed on pilot error, not the autopilot.

One person killed by over-reliance on autonomous driving aids, while tragic, is not something that keeps me up at night. If the system can be shown to be safer than people ostensibly giving their full attention to driving a non-automated car, then what is there to discuss?
 

Registered · 1,486 Posts
People just need to realize that autopilot systems aren't foolproof, especially on roads with intersections where other drivers may not be aware of their surroundings. I see it as more of a very advanced cruise control system for highway driving only, and I wouldn't dare use it on normal roads.
 

Registered · 4,799 Posts
One person killed by over-reliance on autonomous driving aids, while tragic, is not something that keeps me up at night. If the system can be shown to be safer than people ostensibly giving their full attention to driving a non-automated car, then what is there to discuss?
Well, of course the issue in the Tesla incident is that the system was obviously far less safe than a driver giving his full attention. There are no production cars that can fully substitute for the driver yet.

As technology progresses toward full autonomy, we will be entering a very dangerous time when the car can supplement the driver but not yet replace them. As the Tesla accident shows, a partially autonomous car that still relies on the driver runs a very serious risk of lulling the driver into complacency. Some auto companies have said that they simply won't produce partially autonomous cars for precisely that reason.

We've had cruise control for a long time, but the red line seems to be auto-steering. If it's possible to drive the car without putting your hands on the steering wheel then for a certain percentage of drivers it's probably just too much of a temptation to zone out. That may well be why GM's "Lane Keep Assist" function is engineered to just ping-pong the car from one side of the lane to the other rather than drive right down the middle.
 

Registered · 770 Posts · Discussion Starter #9
It doesn't sound misleading to me. The term comes from the aviation industry, where pilots are still expected to fly the plane and retain responsibility for its safe operation regardless of the shortcomings of the automated systems. Sometimes pilots have crashed planes due to over-reliance on automation, or by ignoring warnings, and the crash is blamed on pilot error, not the autopilot.
I concur with your opinion that the Autopilot system is not at fault. It's not, as there is no such "thing" as Autopilot on any Tesla, other than the name of an optional package of components. This package group is actually called "Traffic-Aware Cruise Control", as detailed beginning on page 66 of the 2016 Tesla owner's manual. Pages 66-89, which explain the functionality of the various components, contain warnings on every page. It's like a pharmaceutical commercial where the first 10 seconds say it's a wonder drug, and the remaining 50 seconds disclaim the 345 fatal side effects of using it:

Warning: Traffic-Aware Cruise Control is designed for your driving comfort and convenience and is not a collision warning or avoidance system. It is your responsibility to stay alert, drive safely, and be in control of the vehicle at all times. Never depend on Traffic-Aware Cruise Control to adequately slow down Model S. Always watch the road in front of you and be prepared to take corrective action at all times. Failure to do so can result in serious injury or death.​

Yet, at this very moment on the Tesla website is this:
Full Self-Driving Hardware on your Model S

Most Tesla owners understand this is nothing more than a slick gimmick to differentiate Tesla from other brands: a feature to brag about, but to use only under the very narrow conditions outlined in the manual. My fear is the few people, with more money than brains, who buy into the marketing hype. The driver was the only victim of this event, but what if it had been a bus full of Sunday Schoolers, and not a tractor trailer?

Sure, the term Autopilot comes from the aviation industry, the safest mode of transportation. That safety is due to a sophisticated network in which every aircraft knows where every other aircraft is in relation to it, plane-to-plane communication, ground-to-plane communication, a rigid set of training, procedures, fail-safes, and redundancies, as well as air traffic control orchestrating it all. True automobile automation will only happen when a similar environment is in place in all cars, in all traffic management systems, and embedded in all roads. I won't hold my breath.
 

Registered · 782 Posts
I drive a P85D Tesla to work every day using Autopilot. It requires attention and routinely makes mistakes. That being said, it is still a wonderful system as a driving assistant, but you do need to pay full attention and be ready to override its decisions.

It's a godsend in 0-25 mph stop-and-go traffic and works very well, with very little opportunity to screw up. Given the industry-standard front-facing radar and the low speeds, there is little if any room for an actual screw-up, and the low speeds work to its advantage in that lane keeping is easy, with very little chance for error in the steering inputs. It's way better than driving yourself in these conditions and, based on my own personal experience, way safer than a human driver in stop-and-go traffic, in that it never becomes distracted and hits the car in front of you due to a momentary lapse of attention.

Flying up HWY 85 north at 65 mph or greater, however, with other cars dodging in and out of lanes and accelerating and decelerating to jockey for some mythical lane advantage, requires one to pay more attention to what Autopilot is doing and seeing, and to anticipate when it's about to make a mistake. On Interstate 5 at 70 mph, with virtually no curves, merges, or difficult situations, the system is more trustworthy. As a driver you have to evaluate the environment Tesla's AP is operating in, and with experience you learn to anticipate what it handles really well and when to disengage it and drive the car yourself. (HWY 17 north or south in the Santa Cruz mountains is _NEVER_ a good idea, and you simply do not use Autopilot on this road. Even adaptive cruise control is iffy there, in that all these systems use line-of-sight sensors, and the curves are bendy enough that the system will not detect a car 30-40 degrees off center as you come flying around a corner at 60 mph.)

In twisty-road situations, my personal experience with both AP1 and AP2 is that neither system can accurately keep the car in lane; both routinely cross the center line (or push wide onto the shoulder) if allowed to do so. I can personally demonstrate over 50 places in the Bay Area where AP1/AP2 simply do not correctly keep the car in lane, and all of them are medium-speed single-lane turns that are sharper than one encounters on faster highways. Couple this with the lack of "vision" around the corner to detect a stopped or slower vehicle once the turn is finished, and using Tesla's AP on a twisty road is simply not a good idea in its current production state.

Also, at the moment it will run a red light: unless there is a car in front of you for the adaptive cruise control to detect stopping, you will fly through the intersection even though the light was clearly RED.

Autopilot version 1.0 is better than Autopilot 2.0 (pre-accident vs. post-accident system; AP2 is the new software based on the full self-driving sensor suite Elon introduced about a year ago). I routinely go back and forth between AP1 and AP2 cars, and AP2 just isn't as good as AP1: it is far less smooth and predictable, and often makes wild "dives" around off-ramps and during low lane-marking visibility.

I am highly dubious of Elon's claims regarding Full Self Driving, but I look forward to future enhancements in driving assistance. AP2 continues to get better (it was even worse 6 months ago) but still has much room for improvement vs. AP1, and has yet to fulfill Tesla's full set of promises regarding "Enhanced Autopilot" (which is a separate set of feature claims from Full Self Driving).

Since the accident, Tesla has "improved/changed/made worse" the driver nagging to pay attention. Tesla attempts to determine that you are paying attention by requiring driver input, detecting light force on the steering wheel. Prior to the accident, Tesla would notify you that it was time to touch the wheel via dashboard notices and an audible sound. Post-accident, the driver-attention nags are often only visual (not audible), i.e. you will NOT notice the nag unless you are actually watching the dashboard; you can't have your head turned away watching a Harry Potter DVD on a portable player in the passenger seat. Also post-accident, the Autopilot nags are less deterministic in their frequency: sometimes you get them back to back, and other times you can go 10-20 minutes without a single nag. It seems to be based on the "complexity" of the driving environment; more complex environments seem to elicit more nags, while simply going straight at 65 mph with no cars detected produces very few nag notifications. This isn't based on internal knowledge of the system, but rather on keen observation.

_IF_ you do not respond to the driver-attention nags, Tesla will disengage Autopilot until the car is effectively turned off and back on (you have to pull to the side of the road and sit in park for at least 15 seconds) and then shift back into drive....
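As a loop, the nag-and-lockout behavior described above might look roughly like this. Every threshold here is an observational guess on my part, not a known Tesla value:

```python
import time

NAG_INTERVAL_S = 60    # guessed time without wheel torque before a nag appears
ESCALATE_S = 15        # guessed delay before a visual nag turns audible
LOCKOUT_S = 10         # guessed grace period before Autopilot disengages

def attention_loop(wheel_torque_detected, autopilot_engaged):
    """Nag the driver, escalate, then lock Autopilot out until the next
    park/drive cycle if the driver never responds."""
    last_input = time.monotonic()
    while autopilot_engaged():
        if wheel_torque_detected():
            last_input = time.monotonic()        # driver responded; reset the clock
        idle = time.monotonic() - last_input
        if idle > NAG_INTERVAL_S + ESCALATE_S + LOCKOUT_S:
            return "locked out until re-park"    # AP unavailable for this drive
        elif idle > NAG_INTERVAL_S + ESCALATE_S:
            print("audible nag")                 # chime plus dashboard warning
        elif idle > NAG_INTERVAL_S:
            print("visual nag")                  # dashboard-only warning
        time.sleep(0.1)
    return "driver disengaged"
```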

The best description I've come across regarding this horrible accident is the following, with its words chosen carefully:

"Tesla's autopilot certainly did not cause the accident, rather it simply did not prevent it."

I agree with that assessment. There is no question in my mind that an alert driver would have avoided the accident, and that the driving aid failed to properly recover from the driver's inattention.

however let me repeat:

I am highly dubious of Elon's claims regarding Full Self Driving

I also think Elon is going to miss, and miss badly, his claim of FSD by the end of the year, and I also predict (much to the Tesla fanboys' chagrin) that Elon will eventually have to refund the money of anyone who prepaid for Full Self Driving when, in another 12-18 months, it becomes clear that:

1. his sensor suite isn't up to the task
2. his computational platform isn't up to the task
3. software learning isn't knocking down the hurdles fast enough
4. there are so many real-world situations that simply can't be anticipated
5. there is no end date that can be reliably estimated, and Tesla has already held on to its customers' money for too long given that they haven't delivered the promised feature set.

We will have FSD vehicles in the next 5-10 years, but I don't believe they will be universally FSD. Rather, they will be automated systems that are tuned/optimized/hardened for constrained areas and well-defined operating conditions (i.e. bus routes), and over time you will "add" areas where FSD works. I don't foresee FSD working well in "random" places in "random" conditions, but it can be made to work reliably in known circumstances and known conditions.
 

Registered · 782 Posts
NTSB's report indicates that the driver cavalierly ignored continuous warnings from his Tesla to keep his hands on the wheel
Having owned a Tesla for over 3 years (4??), with a P85D w/Autopilot as my daily driver, it's worth adding some more context here. The driver did not ignore Tesla's warnings; the system is designed to disengage if you "ignore it" (you cannot ignore the system for long). Rather, the driver responded to Tesla's continuous warnings by providing inputs indicating to the system that he was paying attention when in fact he was not. In the approximate time frame of this tragic accident, Tesla's "driver attention" system was both visual and audible, and the audible cues were frequent enough and predictable enough that a driver could look away from the road for an extended period and simply rely on the audible cues to touch the steering wheel, never actually having to turn his head to respond to the system. Picture watching a DVD on a player in the passenger seat or your lap and simply touching the steering wheel when you "heard" the sound, without ever looking up from your viewing…

Since the accident, Tesla's driver-attention cues are now a random mix of visual-only cues (presented on the dash) and sometimes audible ones; you can no longer simply look away for long periods of time and have Autopilot remain engaged.

In the time frame of this accident, the driver was not ignoring the system (and could not have ignored it), but you could provide input to the system without actually paying attention.
 

Registered · 782 Posts
There has also been a recent change in the AP2 hardware. With AP2.5 making its appearance in recent Model S/X and Model 3, the differences so far seem to be (Tesla is keeping quiet):

1. beefier computer platform
2. at least one new camera facing the driver (previous systems had no camera trained on the driver)

I'm betting the camera trained on the driver is there to beef up the AI so it no longer relies solely on steering-wheel input as an attention "test"; the AI will also attempt to determine for itself whether you're paying attention based on what it sees through its camera…
 

Registered · 1,218 Posts
I concur with your opinion that the Autopilot system is not at fault. It's not, as there is no such "thing" as Autopilot on any Tesla, other than the name of an optional package of components. This package group is actually called "Traffic-Aware Cruise Control", as detailed beginning on page 66 of the 2016 Tesla owner's manual. Pages 66-89, which explain the functionality of the various components, contain warnings on every page. It's like a pharmaceutical commercial where the first 10 seconds say it's a wonder drug, and the remaining 50 seconds disclaim the 345 fatal side effects of using it.
Bingo. I am sure that Musk wouldn't embark on such a tech adventure without having his legal behind covered by very thoroughly explaining, in writing, the exact function and limitations of the TACC.

So, it's up to every Teslan to sort out in his mind the difference between (A) the concept of a market-ready, fully self-driving car (which doesn't exist and may still be years away), and (B) the reality of a car with advanced cruise control that makes driving easier but still requires 100% of the driver's attention.

Which some enthusiasts may find difficult, because who wants sobering reality to disturb the smooth flow of a beautiful dream?
 

Registered · 4,799 Posts
I'm betting the camera trained on the driver is there to beef up the AI so it no longer relies solely on steering-wheel input as an attention "test"; the AI will also attempt to determine for itself whether you're paying attention based on what it sees through its camera…
Of course. And this is where the existing Tesla Autopilot falls short: it can't guarantee that the driver is paying attention, and in the absence of that guarantee some drivers are bound to get complacent. Sure, it's the driver's fault for not paying attention, but in a world where car manufacturers are legally required to prevent the car from being put into drive when the driver isn't pressing the brake pedal, Tesla's hands-off approach is, in my opinion, even more egregious.
 

Registered · 770 Posts · Discussion Starter #16
Having owned a Tesla for over 3 years (4??), with a P85D w/Autopilot as my daily driver, it's worth adding some more context here. ...
David, thanks for your candid assessment of the incident and your personal experiences.

Without getting too far off in the weeds, my issue isn't with the technology at all. Give me as many driving aids as possible. But indicate that these are aids, and not some futuristic pipe dream of fully autonomous driving.

I am conflicted with respect to Tesla; Tony Stark...um, er...I mean Elon Musk is the current Steve Jobs with respect to totally understanding the base desires of the consuming public. He has single-handedly made EVs cool, covetable, and sexy. The Tesla product is unmatched in terms of overall functionality and environmental kindness. Tesla is the most American-made and American-sourced manufacturer in the U.S. The TM3 mass-scale production ramp is a challenge of epic proportions that no company other than Tesla would dare.

It is my position that the unintended consequence of living by the techno-sword is that one can die by the techno-sword (literally and figuratively). The Autopilot tragedy, and the spin around it, needlessly scare the common consumer away. Early adopters love cutting-edge technology, but in order to cross the chasm, the masses need to be comfortable with all aspects of EV ownership. It is now important to frame the EV solution in such a way as not to intimidate potential consumers. Tesla, GM, and Nissan (and to a degree Ford) are the "Big 3" in this market, representing 90% of all EVs on the road. I think it's time to move from the whiz-bang tech approach to the real economics of EVs.

We inherited a 2013 Volt a few years back. My wife drives 52 miles round trip every day. She initially HATED the Volt: too small, weird to drive, the tree-hugger perception, etc. But compared to the old Mercedes turbo-diesel, and being allowed to recharge at her workplace, we found that the Volt was basically a FREE car in contrast to the $300/month she had paid BigOil for the privilege of driving to work. Like someone had given her an extra $3,600 a year.

It's so frustrating for me that, to a person, people think our 11 kW rooftop solar cost $70K, when in reality it was $21K up front, and about $13K after tax credits. Similarly, people wonder why I am so interested in a Bolt when a Honda Fit is half the price. Trying to explain that an entry-level Tesla is actually less costly over 10 years than a fully loaded Honda Fit, and that a Bolt is even more of a deal, strikes them as an incredulous claim.
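The 10-year math being gestured at looks something like this; every figure below is a hypothetical placeholder chosen for illustration, not an actual price, rate, or credit:

```python
# All figures are illustrative assumptions, not real prices or rates.
YEARS, MILES_PER_YEAR = 10, 13_000

def ten_year_cost(sticker, energy_per_mile, maintenance_per_year):
    """Purchase price plus a decade of fuel (or electricity) and upkeep."""
    return sticker + YEARS * (MILES_PER_YEAR * energy_per_mile + maintenance_per_year)

gas_car = ten_year_cost(22_000, 0.10, 900)        # hypothetical gas car
ev = ten_year_cost(37_500 - 7_500, 0.04, 400)     # hypothetical EV, after tax credit

print(f"gas car: ${gas_car:,.0f}   EV: ${ev:,.0f}")   # the EV comes out ahead
```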

Time to move away from positioning EVs as techno-wonders, and toward positioning EVs as the personal economic miracles they are.
 

Registered · 782 Posts
As I've said, I'm dubious of Elon's claims and always have been. I agree Tesla has oversold "Autopilot", but I wouldn't want it removed, and it does need to evolve into a more responsible implementation.

There were similar tragic incidents of disastrous rear-ends when cruise control was initially rolled out…but over time those systems have become pretty safe and well understood. Elon's hype doesn't help, but we are getting there in terms of understanding our assistants' limitations.

One thing I will note is that we don't know how many accidents Tesla's Autopilot has prevented, because we neither hear about nor tabulate "near misses" where the system's intervention prevented physical damage. I'm willing to bet things are actually safer with Autopilot deployed, but not yet safe enough.
 

Registered · 544 Posts
As I've said, I'm dubious of Elon's claims and always have been. I agree Tesla has oversold "Autopilot", but I wouldn't want it removed, and it does need to evolve into a more responsible implementation.

There were similar tragic incidents of disastrous rear-ends when cruise control was initially rolled out…but over time those systems have become pretty safe and well understood. Elon's hype doesn't help, but we are getting there in terms of understanding our assistants' limitations.

One thing I will note is that we don't know how many accidents Tesla's Autopilot has prevented, because we neither hear about nor tabulate "near misses" where the system's intervention prevented physical damage. I'm willing to bet things are actually safer with Autopilot deployed, but not yet safe enough.
I'd be pretty confident saying that Autopilot has saved more people than it has harmed. NHTSA seems to agree. https://techcrunch.com/2017/01/19/nhtsas-full-final-investigation-into-teslas-autopilot-shows-40-crash-rate-reduction/

Now, it is of course nothing close to what Elon is selling it as; it is most certainly not true autonomous driving. To even claim that without Lidar as part of the sensor array is, to me, absurd. Camera arrays and radar can do a lot, but they can't give the car a true 3-D representation of its environment like Lidar can. The reality is that you need all three systems working together to give the computer the full, true picture of its environment. On top of that, even with machine learning, without true AI the computer will never be completely autonomous, because it will be unable to react to things it hasn't been taught. Granted, with our current technology we can get to a point where cars will be more reliable and safer than a human and will handle 99.9% of everything we see on a daily basis. So not having true AI isn't a deal breaker.
 

Registered · 204 Posts
Interesting about Lidar. I don't really understand its capabilities, but the video simulation above brought up a question in my mind:

The narrator stated that because the semi-trailer was light colored, neither the driver nor the Tesla detected it against a light sky. Aside from the absurdity of a human not seeing a truck because of the color, why wouldn't the Tesla's radar have detected it and reacted accordingly? Would LIDAR have worked better?

In watching the video, I was also struck by how inattentive the driver must have been not to react at all to the semi-truck as it first began to negotiate its left turn: he apparently didn't slow down from 74 mph despite a fast-evolving, enormous, and imminent threat. As a motorcycle rider (where I can't afford to daydream at all!), as soon as that truck turned into the crossover, I would have backed off the throttle, done a head check, and changed into the right lane, preparing for either full-on braking or a further evasive maneuver.

My closing thought: every day I see hundreds of drivers not paying attention in their cars, whether it's talking on the phone, texting, eating, putting on makeup, shaving, reading newspapers spread out across the steering wheel (yes!).

And this is when they are SUPPOSED to be personally responsible for operating a lethal object. Giving any of these drivers aids like those in the Tesla would almost certainly result in further removal of their attention from that crucial task. As David wrote above, "As a driver you have to evaluate the environment Tesla's AP is operating in." That sounds WAY too demanding for the drivers I encounter daily. Not a good formula for success.
 

Registered · 782 Posts
The narrator stated that because the semi-trailer was light colored, neither the driver nor the Tesla detected it against a light sky. Aside from the absurdity of a human not seeing a truck because of the color, why wouldn't the Tesla's radar have detected it and reacted accordingly? Would LIDAR have worked better?
The truck trailer was crossing the road. The car's front-facing radar doesn't look "high" enough, and the forward-facing camera didn't distinguish the object properly. And there is a big space between the road and the bottom of a semi trailer: the car was effectively driving between the wheels, and the radar never detected an obstacle.
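A toy geometry check of that "doesn't look high enough" point. The mounting height, beam angle, and trailer clearance below are illustrative guesses, not actual Model S or trailer specs:

```python
import math

RADAR_HEIGHT_M = 0.5        # guess: radar mounted low in the front bumper
BEAM_HALF_ANGLE_DEG = 5.0   # guess: vertical half-angle of the radar beam
TRAILER_BOTTOM_M = 1.2      # guess: clearance under the trailer bed

# Range inside which even the top of the beam passes under the trailer:
# solve radar_height + r * tan(half_angle) = trailer_bottom for r.
r_blind = (TRAILER_BOTTOM_M - RADAR_HEIGHT_M) / math.tan(math.radians(BEAM_HALF_ANGLE_DEG))
print(f"beam passes under the trailer inside ~{r_blind:.0f} m")   # ~8 m

# At 74 mph (~33 m/s) that last stretch is covered in about a quarter
# second -- far too late to brake even if a return finally appeared.
print(f"~{r_blind / 33:.2f} s of warning at highway speed")
```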
 