> I still think that true level 5 (i.e. ability to drive autonomously everywhere with zero human oversight with a safety record equivalent to the median human driver) requires AGI.
This might be true. Most of the time (95%) I am on complete human brain autopilot when I’m driving, but those other 5% need my full focus and attention. I shut off the radio and tell other passengers to be quiet (if I have the time for it).
This assumes that the challenges that are hard for a human are the same challenges that are hard for a self driving car - that might be the case, but self driving cars may have some theoretical advantages such as 360 cameras/lidar and an ability to follow satellite navigation without having to take its eyes off the road.
Put another way, the 5% of the time I need to focus is usually when I am somewhere new and don’t necessarily understand the road layout - which something like Waymo may avoid through detailed mapping, for instance.
It might be true, but plenty of problems once thought to require true AGI have turned out not to after sufficient research - for example, not long ago we thought good image recognition was entirely out of reach.
Anybody who rides with me on a regular basis has learned to recognize the sudden stop halfway through a word when I switch from autopilot brain to active driving. There are times when you need more focus than others.