iSteve commenter Jack D writes:
One of the weaknesses of automated systems (they are finding this also with self-driving cars) is that they are designed so that if something goes wrong that is beyond the design parameters of the system, they suddenly snap themselves off and return control to you (or in some cases, you snap them off because they are acting funky). The thing says, "I give up. Something is wrong but I'm not sure what – human, you figure it out, and if it crashes it will be your fault and not mine." Either way, you are suddenly back in control of the plane/car, something is already wrong (that's what made the automated system flake out in the first place), but you have been kind of half-dozing and are not mentally prepared to figure out what is wrong AND, in the meantime, pull the plane/car out of a precarious attitude.
In the case of Air France [Flight 447, which crashed in 2009] it would have been better for the pilot to have done nothing. The stuff that he did made the situation much worse. But that's supposed to be the pilot's job – it's like being a fireman where 99% of the time you sit around and eat pizza (actually, nowadays a lot of paid firemen double as EMTs) and 1% of the time you save people (including yourself) from certain death.
I'd like to have an automated system on my car for low-speed parallel parking. But for full-speed driving? What does it gain me if I have to alertly manage my automated car all the time so that I can suddenly take over and go all Captain Sullenberger?