
You just described the problem nicely. The "magic" that turns a 3-lane road into 2 lanes, etc., is a kind of situational awareness that is really, really difficult to impart to a learning system. The big problem is that probabilistic models don't have a notion of a "common sense" solution to an odd situation. They need to have seen the situation, or something very similar to it, often enough to make a reasonable calculation of what to do.



The 3-lane-to-2-lane problem is solved already. These cars follow a centimeter-accurate 3D map in which the lanes are precisely defined (as well as acceptable speeds, locations of stoplights, etc.).

The Google car knows the lane change is approaching long before it shows up on any sensor.
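To make the idea concrete, here's a minimal sketch of that kind of map lookup. The data model and names are purely illustrative, not Google's actual stack: the planner queries a prebuilt route map for lane-count changes within its planning horizon, so an upcoming merge is known long before any sensor could detect it.

    # Hypothetical HD-map lookup: find lane-count changes ahead of the car
    # within a planning horizon, using only the prebuilt map (no sensors).
    from dataclasses import dataclass

    @dataclass
    class MapSegment:
        start_m: float        # distance along route where segment begins (meters)
        end_m: float          # distance along route where segment ends (meters)
        lane_count: int       # number of lanes on this segment
        speed_limit_mps: float

    def upcoming_lane_changes(route, position_m, horizon_m=500.0):
        """Return (distance_ahead, lanes_before, lanes_after) for each
        lane-count change within the planning horizon."""
        changes = []
        for prev, nxt in zip(route, route[1:]):
            if nxt.lane_count != prev.lane_count:
                dist_ahead = nxt.start_m - position_m
                if 0.0 <= dist_ahead <= horizon_m:
                    changes.append((dist_ahead, prev.lane_count, nxt.lane_count))
        return changes

    # Example: a 3-lane road dropping to 2 lanes 300 m ahead of the car.
    route = [
        MapSegment(0.0, 1000.0, lane_count=3, speed_limit_mps=27.0),
        MapSegment(1000.0, 2000.0, lane_count=2, speed_limit_mps=27.0),
    ]
    print(upcoming_lane_changes(route, position_m=700.0))
    # [(300.0, 3, 2)]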


This isn't about a lane change approaching. This is about people disregarding the concept of lanes when there's fresh snow on the ground because they have no idea where the lanes are.


Isn't that last sentence describing common sense?


Not quite. I would define common sense as "a reasonable fallback solution given that the current situation is unfamiliar." This is something AI systems have a LOT of difficulty with, the self-driving car being no exception.
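As a rough sketch of what that fallback could look like (the names and thresholds here are made up, not from any real self-driving stack): when the system's confidence in its reading of the scene drops, say because snow hides the lane markings, it stops trusting the learned plan and switches to a conservative hand-written behavior.

    # Illustrative fallback policy: trust the learned plan only when the
    # scene is familiar enough; otherwise fall back to a conservative default.
    def choose_behavior(scene_confidence, learned_plan, confidence_threshold=0.8):
        if scene_confidence >= confidence_threshold:
            return learned_plan                  # familiar situation: trust the model
        return "slow_down_and_follow_traffic"    # unfamiliar: conservative fallback

    print(choose_behavior(0.95, "merge_left"))   # merge_left
    print(choose_behavior(0.30, "merge_left"))   # slow_down_and_follow_traffic

The hard part, of course, is that picking a genuinely safe fallback for a situation the designers never anticipated is itself the "common sense" problem.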



