In traffic incidents, human drivers are rarely held accountable. It is notoriously difficult to secure a conviction for vehicular manslaughter; crashes are almost always ruled accidents, and insurance pays out rather than the human at fault facing consequences.
Traffic fatalities often kill others, not just the car occupants. Thus, if a self-driving system causes half as many fatalities as a human, shouldn't the moral imperative be to increase self-driving and eventually ban human driving?