
In Australia, one of the big reasons has been invasive surveillance. They have multiple cameras set up inside the cab, tracking how many seconds you spend looking here or there. If it detects your eyes doing something the AI deems unacceptable, it pings your manager up in his executive chair and he gets to yell at you.

It basically takes your attention away from driving the truck and doing your job, and redirects it toward deliberately acting so as to stay within the AI's allowed parameters.



That sounds like a horrible situation to be in, but I can also understand the safety benefit. From high school, I remember a friend telling me about how they crashed. They glanced at the radio to switch channels (or change the volume or something), which was just enough time for the situation on the road to change, and they hit the car in front of them. Clearly, that wasn't a good time to look away for even a split second, and a mature driver should know that. An empty highway has a lot more "safe" opportunities for glances away from the road, but I'm sure there's friction because the AI isn't lenient in this way. By the time the AI is aware of that, it's probably going to be doing the driving anyway!

I recently used lane keeping assistance and automatic cruise control for a long distance road trip, and it's SO liberating. You actually feel safe taking glances at the landscape. I never felt safe doing that in an old car. The new experience is simply not as exhausting as running the speed check, lane check, distance check loop required in old cars.


Typically these things are not about improving safety but about reducing liability for the company and shifting blame to the employee.


> I recently used lane keeping assistance and automatic cruise control for a long distance road trip, and it's SO liberating. You actually feel safe taking glances at the landscape.

And then you read about Teslas crashing into stationary vehicles because their drivers weren't attentive enough.

Current driver assist tech is dangerous because it is good enough to make you trust it to some extent, but not good enough to actually deal with unexpected situations when your attention isn't 100% on the road.


I don't think all driver assist tech is dangerous. Tesla FSD is dangerous because the name suggests no supervision is needed, and it's good enough that drivers fall asleep at the wheel. My driver assist experience is with a 2020 Honda Insight. The LKAS/ACC features make the car a collaborator in the driving process. I can't fall asleep; it requires too much supervision/collaboration. But instead of running the exhausting speed/lane position/follow distance loop as fast as I can, I can run it at some variable fraction of that speed depending on road conditions. I have to monitor for the things I know or suspect cause issues for the automated system. For example, it generally wants to follow exits, so I have to put resistance on the wheel when passing exits to keep it on the highway. On the other hand, it is excellent (better than me) at making fine adjustments to stay between well-marked lines on long straight stretches... that's where I feel comfortable stealing a glance out the side window or resting my foot on the floor instead of hovering over the pedals. It doesn't avoid debris sitting in the middle of the road, so I have to watch for that and wrestle LKAS to steer around it if I see it coming.


Yes, but, as you say, the AI should be looking outside, rather than at the driver.


> You actually feel safe taking glances at the landscape.

Until you discover that the driver aid doesn't work correctly at that time.


I laughed until I realized that I've seen the exact software for this already, so it is unfortunately not an exaggeration.


My father-in-law has been a trucker in the US for a long time, and really enjoys it, or rather used to. Video cameras being installed in the truck cab are the reason he's changed his opinion. In every way he is an upstanding citizen, employee, and driver, and even though he's past retirement age he enjoys driving seasonally. I think this will be his last season due to the cameras.

I know that if someone was watching me during all working hours I would also look for the exit.


That sounds like a very dangerous case of Goodhart's law. Sheer insanity.


Of course, Australia allows road trains: a truck towing 4 or 5 trailers, with a total length of over 50 meters.


That's just... inhuman.


If you want a picture of the future, imagine an AI watching a human — forever.


But honestly, is it much different from these new ultra-fast grocery delivery services? Those drivers aren't given a second to breathe.


It's not inhuman if you can't trust your drivers to do their job honestly. It won't be used everywhere, but if you constantly wonder why some drivers seem to deliver far less than others, and the gap magically disappears once you install monitoring, it "proves its worth".

It doesn't sound nice but when margins are paper-thin, it is not surprising.


If margins are paper thin, they should cut costs elsewhere, not human lives. I know a trucker who said that if they didn't lie on their timesheets or use amphetamines, there's no way they'd get the required driving done; they'd get canned, and another person would either falsify records or use uppers to keep going. This kind of thinking could then be applied to anything. Office workers, for instance: why is the mail not being sorted 0.5% of the time? It's entirely dehumanizing, and it flushes whatever creative process and ingenuity exists in autonomous jobs down the drain.


> timesheets or use amphetamines

If you're not lying on your timesheets, why would you need to resort to amphetamines?


These companies will say it's illegal to lie on time sheets and that they'll fire you if they catch you, but then they don't try very hard to catch anyone, and they set up the work so that there is no way to do it without lying on time sheets. That kind of behavior is very common for lower-level workers: the company gets to have its cake (illegal practices) and eat it too, while protecting itself by saying it's against policy. See all the Amazon warehouse bathroom issues.


Many companies are moving to digital timesheets where the truck figures out whether you are driving or not. You pretty much just need to select 'on/off/sleeper'. Those are harder to lie on.

What the companies should actually be doing is reviewing their routes and what a realistic way to run them looks like. A route planned in 1970 no longer takes the same amount of time: the driving rules have changed, and so has the road layout.


> It's not inhuman if you can't trust your drivers to do their job honestly.

Well, as drivers get paid per mile I’m not sure how they can be “dishonest” and still take home a decent paycheck.

The company I run for will install driver facing cameras for the top (bottom?) 200 drivers, the ones who set off the robotruck unsafe driving algorithm too many times to see what they’re doing wrong. The alternative would be to just fire them so this policy is marginally better I suppose.

It is pretty hard to get the truck to report you AFAICT — only managed to set it off once and that was when a car purposely brake checked me pretty hard, if the truck didn’t have collision avoidance I’d have slammed into the back of them because by the time I realized what they were doing it was already too late. This other time I was about to take everyone out due to another driver unsafely merging and the truck was like “ho hum, nothing to see here, carry on”.


But who's monitoring the monitors? I bet margins are not so thin that management can't be offered cushy non-monitored positions.



