Chevy Bolt EV Forum
21 - 40 of 47 Posts
I can see that becoming a first for these systems. Audi is a good example: a few years ago they rolled out a prototype that could do exactly this nearly flawlessly. You can imagine where they are by now and what more they can do.

Communicating with other cars on the road will be the real issue.
 
Unfortunately, the first fatality while using autopilot has been reported. As a result, I expect to see much closer scrutiny of all such systems.

Tesla's position is that the driver is still responsible for keeping alert and responding to situations. This is common sense, as it is not a completely autonomous system. One of the issues is that use of autopilot, by its very nature, increases reaction time. Not only does the driver need to recognize a potentially hazardous situation, they also then need to decide whether autopilot has recognized the hazard and is acting appropriately. Ideally, a driver wouldn't wait to see if the car is reacting, but reality dictates that the very purpose of the system is to handle mundane tasks. If you take control every time a vehicle merges onto the freeway, you're in a truck's blind spot, or traffic slows ahead, you'll never have autopilot on.

This is a system that has very little room for middle ground. It should be either:
1) An assist system (auto braking, lane departure warning, etc.)
or
2) Fully autonomous. Asking the driver to remain at least as alert as when not using the system is unrealistic and potentially dangerous.

There is a valid (perhaps) argument that autopilot is safer than human drivers. But how good is good enough? 99% is not even close (1 out of 100 situations would result in a collision). In order for autopilot to be viable, it can't be just "safer than human drivers", it must be very, very nearly perfect.
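To put rough numbers on "how good is good enough," here's a back-of-envelope sketch. The hazardous-situation rate is a made-up figure purely for illustration, not from any study:

```python
# Assumption (invented for illustration): a driver encounters about
# 20 hazardous situations per 1,000 miles driven.
SITUATIONS_PER_MILE = 20 / 1000

def collisions_per_100k_miles(success_rate):
    """Expected collisions over 100,000 miles if the system handles each
    hazardous situation independently with the given success rate."""
    failure_rate = 1 - success_rate
    return failure_rate * SITUATIONS_PER_MILE * 100_000

# 99% reliable sounds good but would mean ~20 collisions per 100k miles;
# "very, very nearly perfect" (99.999%) gets that down near zero.
for rate in (0.99, 0.999, 0.99999):
    print(f"{rate:.5f} success rate -> "
          f"{collisions_per_100k_miles(rate):.2f} collisions per 100k miles")
```

The exact numbers depend entirely on the assumed rate of hazardous situations, but the shape of the argument holds: each extra "nine" of reliability cuts expected collisions by 10x, which is why 99% isn't close to good enough.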

The fatality while using autopilot is still under investigation, and the left-turning vehicle will almost certainly be found at fault. It's unlikely we'll ever know for sure whether, had autopilot not been engaged, the accident would have been avoided, or at least whether some braking or steering would have lessened its severity.

As a motorcyclist, I'm hyper-sensitive to left-turning vehicles, and this carries over to operating a car as well. If I were using an autopilot system, there is no way I would maintain the same level of awareness; it goes against human nature.
 
Very well thought out post. I especially agree with your point 2. It will be interesting to see what comes out of the NHTSA investigation. This accident will increase the discussions about autonomous driving exponentially.
 
I still think Tesla has a good autopilot system, based on all of the reviews and videos I've seen. However, as has been stated many times, the system is not perfect; you have to exercise caution when using it. The latest information I've seen from the fatal accident states that the driver may have been traveling in excess of 90 mph and may also have been distracted by a video - not a good idea in either case. The accident may have been unavoidable even if the driver or vehicle had been able to brake before impact.

The Tesla autopilot system will get better with more advanced hardware and software. The system is constantly recording more training data as drivers continue to use it. My guess is that the system is at least partially based on AI models, so new training data can probably be applied to those models to further improve the system. Building a system in this way - using live data from all over the world in many different environments - is a good way to build a robust system that is adaptable to the many situations an autonomous driving system will encounter. The system will also gain experience faster, since you have thousands of drivers using it instead of just a few engineers and/or test drivers.

The accident was tragic, but I think the media is drawing unfair conclusions about Tesla's vehicles - as was the case with the fires that occurred several years back.
 
I'm not pointing the finger at Tesla's Autopilot, but at autopilot systems in general. People are way too easily distracted even when they actually have to pay attention in order to keep a car in its lane and on the road. Remove that requirement and expecting them to pay the same amount of attention (or actually more) is the real "ludicrous mode".

We don't yet know enough about the details to draw any meaningful conclusion as to fault, but there is certainly plenty to go around.

If Tesla's autopilot allows the driver to set a speed of 90+ in a 65 zone, that needs to be addressed (whether or not that was true in this accident). I was under the impression that autopilot recognized speed limit signs and adjusted accordingly. If it indeed lets you set a value 30+ mph over the speed limit, that's likely to be an issue and may be ruled a contributing factor.
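As a sketch of what "adjusting accordingly" could mean, here's a hypothetical speed-cap check. This is not Tesla's actual logic; the margin value and function are invented for illustration:

```python
# Hypothetical sketch (not Tesla's implementation): cap the
# driver-requested cruise speed at the recognized posted limit
# plus a small fixed margin.

def clamp_set_speed(requested_mph, posted_limit_mph, margin_mph=5):
    """Return the speed the system would actually use: never more
    than the recognized speed limit plus the allowed margin."""
    return min(requested_mph, posted_limit_mph + margin_mph)

# A 90 mph request in a 65 zone would be held to 70 mph.
print(clamp_set_speed(90, 65))
```

With a rule like this in place, setting 90+ in a 65 zone simply wouldn't be possible, which is the kind of safeguard the post above is asking about.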

It's ironic that the person involved posted one of the most viewed videos of how well autopilot CAN work, and is now likely (from current information) the first example of how badly things can go wrong when the system is abused (which I consider inevitable). Tesla's own videos promoting Autopilot show testers blatantly ignoring all the warnings about how to use the system, so it's no real surprise that the same type of behavior occurs in real-world situations.
 
Apparently, the driver may have been watching Harry Potter when this was happening. The fatality is a combination of system failure and driver inattention. I don't think cars will be 100% accurate anytime soon; improvements will be made, but it's hard to account for obstructions like weather or debris.

The Tesla sensors couldn't differentiate the truck's white body from the sky. Maybe if the driver had been paying attention and noticed that the car wasn't slowing down, he could have stomped on the brakes and come out with injuries only. This may serve as a warning to manufacturers and buyers alike: companies need to emphasize that this is like an advanced cruise control designed to be used when conditions are clear, and drivers need to pay attention to the road.

I just hope that this won't set back Tesla's, or any other company's, autopilot development.
 
People seem to be confusing Autonomous Driving with Tesla's Autopilot (actually Autosteer).

It is a beginning implementation that still requires monitoring and input from the driver - true Autonomous Driving requires neither.

It seems that people are trying to use autosteer like a true autonomous system with predictable results. Part of the blame lies with Tesla and the way they market it (The Autopilot name for one).
 
Companies should just call the system advanced cruise control or something similar.

I assume when people hear "Autonomous", they think of a system similar to the one in Asimo the robot, something that can think for itself and make decisions based on its surroundings.
 
They are from Oregon but GM doesn't have a production plant there and the Bolt is produced in the Orion plant. These Bolt EVs we are seeing could be in the midst of a long distance road test.
Yes, GM actually pays people to drive pre-production models thousands of miles to log data. It's not just the Bolt; it's part of bringing any new/refreshed model to market (Silverado, Corvette, etc.).

Tesla skips this step, and it's my opinion that quality suffers because of it. On the relatively low volume S and X, they've been able to handle it (although the service centers have been booked months out for routine S service while dealing with all the X problems). I hope they do some serious testing on the 3, and don't start shoving them out the door as soon as the first unit comes off the line. I doubt they'll change their ways, but maybe they'll hire someone to oversee production that will convince them of the wisdom of thorough testing before shipping products.
 
Tesla as a new automaker is in a difficult spot. Their warranty work appears to be nearly double that of other mainstream manufacturers, yet they receive very high marks for customer service. The challenge will be whether, when they have hundreds of thousands of Model 3s being delivered, they can keep up their high grades for service. I know one potential Tesla customer who said he is not interested in a Tesla because his neighbor's Model S is in the repair shop so frequently. I don't know if that is an early model known to have more defects; Tesla has made progress since then. It remains to be seen how reliable the early Model 3s will be. I certainly hope GM does well with the Bolt, but it is always buyer beware in the first-year production run.
 
How does one get those test driving jobs? Not that I'm interested in applying or anything. :D

I assumed long-distance test drives were standard in the automotive industry; I didn't realize that Tesla skipped this all-important step and uses its buyers as guinea pigs. How many test miles will the Bolt log before we get them?
 