- cross-posted to:
- [email protected]
Driverless cars worse at detecting children and darker-skinned pedestrians, say scientists::Researchers call for tighter regulation following major age- and race-based discrepancies in AI autonomous systems.
Okay? It’s not like these systems are actually intelligent. Anything different from the majority of cases is going to be at an inherent disadvantage in being detected, right? At the volume of data used for their models, surely it’s just a matter of statistics.
Maybe I’m wrong (and I’m surely using the wrong terminology), but it seems like that must be the case. It’s not some issue of human racial bias, just a bias based on relative population. Or is my understanding that flawed?
Mind you, I’m not saying it doesn’t need to be remedied posthaste.
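The statistical intuition above can be sketched with a toy model (all numbers are synthetic and have nothing to do with the actual study): if one group is heavily underrepresented in the training data, a classifier tuned for overall accuracy ends up trading away detection of that group.

```python
import math
import random

random.seed(0)

# Synthetic 1-D data: 950 training examples for the majority group,
# only 50 for the minority group. Purely illustrative numbers.
train_majority = [random.gauss(0.0, 1.0) for _ in range(950)]
train_minority = [random.gauss(2.0, 1.0) for _ in range(50)]

mean_maj = sum(train_majority) / len(train_majority)
mean_min = sum(train_minority) / len(train_minority)
prior_maj = 950 / 1000
prior_min = 50 / 1000

# For two equal-variance Gaussian classes, the accuracy-optimal decision
# threshold is the midpoint shifted by log(prior ratio); the skewed priors
# push the threshold toward the minority class.
threshold = (mean_maj + mean_min) / 2 + math.log(prior_maj / prior_min) / (mean_min - mean_maj)

# Evaluate on a balanced test set: overall accuracy looks fine, but
# per-group recall diverges sharply.
test_maj = [random.gauss(0.0, 1.0) for _ in range(1000)]
test_min = [random.gauss(2.0, 1.0) for _ in range(1000)]
recall_maj = sum(x < threshold for x in test_maj) / 1000
recall_min = sum(x >= threshold for x in test_min) / 1000
print(f"majority recall: {recall_maj:.2f}")  # high
print(f"minority recall: {recall_min:.2f}")  # much lower
```

The point of the sketch: the system isn't "choosing" to miss anyone; the underrepresented group simply contributes less to the objective being optimized, so errors concentrate there.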
Yes, the issue is most likely that the data used to teach the systems what people look like is biased towards white men.