The issue here remains the tendency of people (even those we would all consider rational) to "OK" rapidly through the "Terms of Use" screens when first enabling the Autopilot feature. It pretty much warns you that it will have the capabilities of an overly cautious 16-year-old driver, that you need to be ready to take over at a moment's notice, and that it will be quick to hand the driving back to you if it gets remotely confused. It has been a great, very rarely used feature for me as a backup of sorts when crossing southern NJ late at night to go from Long Beach Island to the other side of the state, when an overly cautious kid driver is likely better than I might be, fully alert in my own opinion but tired.

As the article basically says, I wouldn't single out Tesla for having the most Autopilot crashes, since they have many more miles travelled than the other manufacturers. But these crashes just reinforce, for me, the stupidity of relying on software to operate cars absent V2V and V2I.
I hate the idea that there are cars out on the road with me using these semi-automated systems. I hope states start banning them.
In the end, human choices are as much a part (if not more) of the problem with this technology's deployment, and that weak-link factor leads me to agree with you more and more with each Autopilot-related accident. (...and they need to change the name from "Autopilot" to "driver assist"; the name is part of the human problem)