Chevy Bolt EV Forum

Autonomous Chevy Bolts Already In Testing Phase

17K views · 46 replies · 14 participants · last post by ElderGeek
#1 ·
[Spy photo: sensor-equipped Chevy Bolt test vehicle]


General Motors recently acquired a self-driving car start-up called Cruise Automation, and they've already mounted sensor arrays onto the roofs of a few Chevy Bolts.


The above specimen was caught testing by SpiedBilde on the streets of San Francisco and uploaded by The Verge, and interestingly enough, one of the drivers in that Bolt is Cruise Automation co-founder Kyle Vogt. He's obviously not taking a back seat on this project, and their site states as much: "we are testing our autonomous technology on the all new Chevrolet Bolt EV in San Francisco."

GM has been pretty open about the fact that they see autonomous cars as an integral part of their future; that's why they put down a $500 million investment in Lyft. They could actually be testing the passenger-ferrying abilities of the autonomous Bolt EV, since all four seats seem to be occupied in the spy shot.

This shot of their self-driving Bolt is certainly putting pressure on the rest of the autonomous fleet race: soon after the spy photo was uploaded, Uber responded with a shot of their own Ford Fusion Hybrid equipped with radar, laser scanners, and high-resolution cameras.

Who do you think will win the autonomous fleet arms race?
 
#2 ·
The technology will likely be the easiest hurdle.

California's draft regulations on autonomous driving would prohibit commercial use (Lyft, Uber, etc.) and would require a driver and a steering wheel. If other states follow California's lead, it could be a long while before we see "fleets".

“We’re gravely disappointed that California is already writing a ceiling on the potential for fully self-driving cars,” Google said in a statement. Chris Urmson, the director of the company’s self-driving car project, says Google will continue to work with the DMV, but that the proposed regulation “maintains the same old status quo and falls short on allowing this technology to reach its full potential.” The rules would bar another appealing use case for autonomous cars: replacing Uber drivers with robots. Any company looking to use autonomous technology for a commercial purpose in the state, like trucking or operating buses, is SOL.
 
#3 ·
I'd assume legislators will cave when manufacturing giants like GM and Ford are pushing for the commercial use of self-driving vehicles. Once the self-driving systems have been perfected, or as close to perfection as they can get, we'll see them push for the complete removal of drivers in their fleets.

But, this won't be for a long while yet. Hopefully self driving cars will be more affordable and common when I'm too old to drive.
 
#14 ·
Not quite correct. The test vehicle must meet its goals under bad weather conditions, too. Would you drive (or ride in) a vehicle that hasn't passed rain or snow certification? If I were in charge, I would throw all possible conditions at the test vehicles and observe their reactions. One would be an encounter with a skidding vehicle on a wet and slippery road, knowing that any quick change of direction (to avoid a collision) can cause a new skid, a loss of control, and a new collision, which has happened in real life to many actual drivers. :eek:
 
#20 ·
2020? Maybe, but don't hold your breath.
Even if the technology is ready, not sure regulators, insurance companies, etc. will be. Automakers are perceived as having "deep pockets" and will be sued for even the slightest damage caused by a car in autonomous mode.

I expect we'll see some applications phased in.
Automatic valet at airports (hotels, parking garages, large employee lots) would be cool. Have the car drop you (and your luggage if applicable) at the curb, then park itself at a wireless charging station. It could move when topped off, leaving room for the next EV to charge. It could then pick you up when you're ready. This would allow testing in a much more controlled environment than open roads, with lower speeds and fewer variables.
 
#23 ·
I can see that becoming a first application for these systems. Audi is a good example: a few years ago they rolled out a prototype that could do exactly this nearly flawlessly. By now you can imagine where they're at and what more they can do.

Communicating with other cars on the road will be the real issue.
 
#24 ·
Unfortunately, the first fatality while using autopilot has been reported. As a result, I expect to see much closer scrutiny of all such systems.

Tesla's position is that the driver is still responsible for keeping alert and responding to situations. This is common sense, as it is not a completely autonomous system. One of the issues is that use of autopilot, by its very nature, increases reaction time. Not only does the driver need to recognize a potentially hazardous situation, they also then need to decide whether autopilot has recognized the hazard and is acting appropriately. Ideally, a driver wouldn't wait to see if the car is reacting, but reality dictates that the very purpose of the system is to handle mundane tasks. If you take control every time a vehicle is merging onto the freeway, you are in a truck's blind spot, or traffic is slowing ahead, you'll never have autopilot on.

This is a system that has very little room for middle ground. It should be either:
1) An assist system (auto braking, lane departure warning, etc.)
or
2) Fully autonomous. Asking the driver to remain at least as alert as when not using the system is unrealistic and potentially dangerous.

There is a valid (perhaps) argument that autopilot is safer than human drivers. But how good is good enough? 99% is not even close (1 out of 100 situations would result in a collision). In order for autopilot to be viable, it can't be just "safer than human drivers", it must be very, very nearly perfect.
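To put some rough numbers behind the "99% is not even close" point, here's a back-of-the-envelope sketch. The situation counts below are made-up illustrative assumptions, not measured data:

```python
# Back-of-the-envelope: why "99% reliable" is nowhere near good enough
# for an autonomous system. The numbers here are illustrative assumptions.

def expected_failures(situations: int, per_situation_success: float) -> float:
    """Expected number of mishandled situations, assuming each
    situation is handled independently with the given success rate."""
    return situations * (1.0 - per_situation_success)

# Suppose a daily commute exposes the system to 200 distinct traffic
# "situations" (merges, left-turners, lane changes) -- an assumed figure.
daily_situations = 200

# At 99% per-situation reliability, that's about 2 mishandled
# situations per day -- clearly unacceptable.
print(expected_failures(daily_situations, 0.99))

# Even at "five nines" the system still mishandles roughly one
# situation every few thousand commutes.
print(expected_failures(daily_situations, 0.99999))
```

The point of the arithmetic is just that per-situation reliability compounds over exposure: small failure rates multiplied by enormous numbers of encounters still produce frequent failures.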

The fatality while using autopilot is still under investigation, and the left-turning vehicle will almost certainly be found at fault. It's unlikely we'll ever know for sure whether the accident would have been avoided, or at least mitigated by some braking or steering, if autopilot had not been engaged.

As a motorcyclist, I'm hyper-sensitive to left-turning vehicles, and this carries over to operating a car as well. If I were using an autopilot system, there is no way I would maintain the same level of awareness; it goes against human nature.
 
#26 ·
I still think Tesla has a good autopilot system, based on all of the reviews and videos I've seen. However, as has been stated many times, the system is not perfect; you have to exercise caution when using it. The latest information I've seen from the fatal accident states that the driver may have been traveling in excess of 90 mph and may also have been distracted by a video, not a good idea in either case. The accident may have been unavoidable even if the driver or vehicle had been able to brake before impact.

The Tesla autopilot system will get better with more advanced hardware and software. The system is constantly recording more training data as drivers continue to use it. My guess is that the system is at least partially based on AI models, so new training data can probably be applied to the models in order to further improve the system. Building a system in this way, using live data from all over the world in many different environments, is a good way to build a robust system that is adaptable to the many situations an autonomous driving system will encounter. The system will also gain experience faster, since you have thousands of drivers using it instead of just a few engineers and/or test drivers.

The accident was tragic, but I think the media is drawing unfair conclusions about Tesla's vehicles - as was the case with the fires that occurred several years back.
 
#27 ·
I'm not pointing the finger at Tesla's Autopilot, but at autopilot systems in general. People are way too easily distracted even when they actually have to pay attention in order to keep a car in its lane and on the road. Remove that requirement, and expecting them to pay the same amount of attention (or actually more) is the real "ludicrous mode".

We don't yet know enough about the details to draw any meaningful conclusion as to fault, but there is certainly plenty to go around.

If Tesla's autopilot allows the driver to set a speed of 90+ in a 65 zone, that needs to be addressed (whether or not that was true in this accident). I was under the impression that autopilot recognized speed limit signs and adjusted accordingly. If it indeed lets you set a value 30+ mph over the speed limit, that's likely to be an issue and may be ruled a contributing factor.

It's ironic that the person involved posted one of the most-viewed videos of how well Autopilot CAN work, and is likely (from current information) the first example of how badly things can go wrong when the system is abused (which I consider inevitable). Tesla's own videos promoting Autopilot show testers blatantly ignoring all the warnings about how to use the system, so it's no real surprise that the same type of behavior occurs in real-world situations.
 
#28 ·
Apparently, the driver may have been watching Harry Potter when this happened. The fatality was a combination of system failure and driver inattention, and I don't think cars will be 100% accurate anytime soon. Improvements will be made, but it's hard to account for obstructions like weather or debris.

The Tesla sensors couldn't differentiate the truck's white body from the sky. If the driver had been paying attention and noticed that the car wasn't slowing down, he could have stomped on the brakes and maybe come out with injuries only. This may serve as a warning to manufacturers and buyers alike: companies need to emphasize that this is like an advanced cruise control, designed to be used when conditions are clear, and that drivers need to pay attention to the road.

I just hope this won't set back Tesla's, or any other company's, autopilot development.
 
#31 ·
People seem to be confusing Autonomous Driving with Tesla's Autopilot (actually Autosteer).

It is a beginning implementation that still requires monitoring and input from the driver - true Autonomous Driving requires neither.

It seems that people are trying to use Autosteer like a true autonomous system, with predictable results. Part of the blame lies with Tesla and the way they market it (the Autopilot name, for one).