quote:Tesla is under investigation because its cars keep hitting emergency vehicles
Federal safety regulators are investigating at least 11 accidents involving Tesla cars using Autopilot or other self-driving features that crashed into emergency vehicles when coming upon the scene of an earlier crash.
The National Highway Traffic Safety Administration said seven of these accidents resulted in 17 injuries and one death. All of the Teslas in question had the self-driving Autopilot feature or the traffic-aware cruise control engaged as they approached the crashes, the NHTSA said.
Curious to see what comes from the investigation but I suspect it boils down to “people not paying attention …”
Even when driving with Autopilot, my hands are on the wheel, my feet are over the pedals and I’m ready to take over at any time. Anything else is flat out negligent.
Just like in a plane - the automation keeps the speed constant and the wings level. Not flying into a mountain is the pilot’s job. Same for the car … it keeps my speed up and keeps me in the lane. Not smashing into something is still the driver’s job.
A new NY Times article: "Inside a Fatal Tesla Autopilot Accident," where a Tesla ran a stop sign with a flashing red light and crashed into a parked Tahoe, killing one person and injuring a second.
The article also states that, "Tesla’s critics contend that Autopilot has several weaknesses, including the ability for drivers like Mr. McGee to use it on local roads. With the help of GPS and software, G.M., Ford Motor and other automakers restrict their systems to divided highways where there are no stop signs, traffic lights or pedestrians."*
* Regarding the use of Autopilot on local roads, there's a location in Yosemite where at least 5 Teslas have crashed at the same spot. - »insideevs.com/news/52544 ··· t-crash/
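Just to make the geofencing point concrete, here's a toy sketch in Python of the kind of road-class gate the article attributes to GM and Ford. The `RoadInfo` fields and the rules are my own illustrative assumptions, not any automaker's actual logic or API:

```python
# Hypothetical sketch of a GPS/map-data gate that only lets a
# driver-assist feature engage on divided highways. Field names and
# rules are made up for illustration.
from dataclasses import dataclass

@dataclass
class RoadInfo:
    road_class: str            # e.g. "divided_highway", "local", "residential"
    has_stop_signs: bool
    has_traffic_lights: bool
    has_pedestrian_access: bool

def may_engage(road: RoadInfo) -> bool:
    """Allow engagement only on divided highways with no stop signs,
    traffic lights, or pedestrian access."""
    return (road.road_class == "divided_highway"
            and not road.has_stop_signs
            and not road.has_traffic_lights
            and not road.has_pedestrian_access)

print(may_engage(RoadInfo("divided_highway", False, False, False)))  # True
print(may_engage(RoadInfo("local", True, True, True)))               # False
```

The real systems presumably cross-check GPS position against map data; the point is simply that the restriction is enforced in software rather than left to the driver.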
I've been saying this since Tesla announced it: AI as we humans imagine it simply does not exist. There's currently no way to do this except to program the vehicle to behave in a certain manner.
They've been talking about AI since the early 90s, and it's nowhere near ready to be used on the roads. Heck, probably not even in the next 10 years. The CPU processing required is several orders of magnitude beyond what's available right now, and then there's still the buggy AI software to be sorted out.
I'm not at all hopeful I'll see it by the time I retire. (Hint: over two decades.)
quote:A new NY Times article: "Inside a Fatal Tesla Autopilot Accident," where a Tesla ran a stop sign with a flashing red light and crashed into a parked Tahoe, killing one person and injuring a second.
Autopilot is designed for use on divided highways only. You have to agree to that to enable the feature in the car.
Point blank: unless they’re doing the beta test, nobody should be driving on Autopilot anywhere there’s a stop sign. Just because you CAN doesn’t mean you SHOULD. The responsibility to use the feature appropriately lies with the driver and the driver alone. Agreeing to opt into the beta puts the onus completely on the driver, IMO.
It’s like cruise control - you could enable it at 70mph in your residential neighborhood. Nobody blames Ford if you do that — it’s the driver.
Basically all Autopilot accidents are driver error, no different from enabling cruise control, not monitoring it, and rear-ending the car in front of you. From what I've heard, most of the accidents in the NHTSA investigation are in the same vein: DUIs, driving at speeds well beyond what AP can even be enabled at. I suspect most of these will be thrown out for that reason. Any that aren't, or that result in some "better driver monitoring" type request... well, Tesla already addressed that with driver-facing camera eye monitoring.
As far as FSD Beta (in very limited use at the moment, by maybe 2,000 cars), I'm not sure I've heard of any accidents with it yet. If anything, it's too cautious. That is the "march of the 9s" they are working on. It's easy to pick on the delays, but what they are attempting is pretty ambitious. Time will tell if they can pull it off with vision alone, but I wouldn't bet against them. And if you don't believe them, that's a $10k option you leave off the purchase or the monthly rental. I don't see how it's a problem if you don't buy it.
When the CEO says things, it's important. "Elon Musk Says Self-Driving Tesla Cars Will Be in the U.S. by Summer" - March 19, 2015 - »www.nytimes.com/2015/03/ ··· mer.html
"I’m extremely confident that level 5 [self-driving cars] or essentially complete autonomy will happen, and I think it will happen very quickly... I remain confident that we will have the basic functionality for level 5 autonomy complete this year... I think there are no fundamental challenges remaining for level 5 autonomy." - July, 2020 - »bdtechtalks.com/2020/07/ ··· earning/ (This is a very interesting article.)
Eventually they'll discover that they can't put all the liability on the user, just as the drug companies have discovered. How many lawsuits have already been filed against Tesla?
Regarding cruise control, my Honda will automatically stop if the car in front slams on the brakes. My car also nags me if I haven't moved the steering wheel in ~30 seconds, so I don't know why Teslas can't nag the driver to pay attention. (My car has a basic lane following system, plus it tracks so straight, I don't have to turn the wheel.)
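The auto-stop behavior described above boils down to a time-to-collision check. Here's a toy sketch in Python; the 2.0-second threshold and the function names are my own illustrative assumptions, not Honda's actual algorithm:

```python
# Hypothetical sketch of a forward-collision auto-brake check: brake
# when time-to-collision to the car ahead drops below a threshold.
# The 2.0 s threshold is an assumption for illustration only.
def should_auto_brake(gap_m: float, closing_speed_mps: float,
                      ttc_threshold_s: float = 2.0) -> bool:
    """Brake if we are closing on the car ahead and would reach it
    within the time-to-collision threshold."""
    if closing_speed_mps <= 0:      # not closing on the car ahead
        return False
    time_to_collision = gap_m / closing_speed_mps
    return time_to_collision < ttc_threshold_s

print(should_auto_brake(gap_m=40, closing_speed_mps=25))  # True  (1.6 s)
print(should_auto_brake(gap_m=40, closing_speed_mps=10))  # False (4.0 s)
```

Production systems fuse radar and camera data and modulate braking force, but the core trigger is this kind of gap-over-closing-speed calculation.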
quote:Regarding cruise control, my Honda will automatically stop if the car in front slams on the brakes. My car also nags me if I haven't moved the steering wheel in ~30 seconds, so I don't know why Teslas can't nag the driver to pay attention. (My car has a basic lane following system, plus it tracks so straight, I don't have to turn the wheel.)
Except that it does. If the steering wheel doesn’t sense any torque, it sets off alerts. If you ignore them, you get locked out of the system for the rest of your drive.
The thing is, people do silly things like hanging weights off the steering wheel to defeat the torque check. I don’t see why or how the manufacturer should be held responsible for that.
If you’re interested, I’d be happy to do an in-person demo for you. The system is fairly robust; if people do stupid things to get around it, that should be on them.
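The escalation described a couple of posts up (no torque sensed, then alerts, then lockout for the rest of the drive) can be sketched as a simple state check. Note the thresholds and state names here are made-up illustration values, not Tesla's actual timings:

```python
# Hypothetical sketch of a torque-based driver-attention watchdog:
# no steering torque -> visual nag -> audible alert -> disengage and
# lock out for the rest of the drive. All thresholds are assumptions.
def monitor_attention(seconds_since_torque: float, locked_out: bool) -> str:
    """Return the action the system would take at this moment."""
    if locked_out:
        return "locked_out"           # unavailable until the drive ends
    if seconds_since_torque < 30:
        return "ok"
    if seconds_since_torque < 45:
        return "visual_nag"           # flashing message on the display
    if seconds_since_torque < 60:
        return "audible_alert"        # chime; system about to disengage
    return "disengage_and_lockout"    # feature off for the rest of the drive

print(monitor_attention(10, False))   # ok
print(monitor_attention(50, False))   # audible_alert
print(monitor_attention(70, False))   # disengage_and_lockout
```

A weight hung on the wheel defeats exactly the first input here (it fakes `seconds_since_torque` staying low), which is why camera-based eye monitoring was added as a second signal.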
(And yes, Musk is a loudmouth who should keep his trap shut …. But I don’t hold that against the product.)
Yup and you can defeat the other systems as well. If you’re determined to drive unsupervised, nothing can really stop you. »www.teslarati.com/tesla- ··· t-video/
quote:Just like in a plane - the automation keeps the speed constant and the wings level. Not flying into a mountain is the pilot’s job. Same for the car … it keeps my speed up and keeps me in the lane. Not smashing into something is still the driver’s job.
Some are better than others. In certain (GA) situations, letting "George" fly will get you killed.
quote:Just like in a plane - the automation keeps the speed constant and the wings level. Not flying into a mountain is the pilot’s job. Same for the car … it keeps my speed up and keeps me in the lane. Not smashing into something is still the driver’s job.
quote:Some are better than others. In certain (GA) situations, letting "George" fly will get you killed.
Absolutely! … putting the responsibility in the pilot’s hands to decide when it can be used safely.