Driverless cars worse at detecting children and darker-skinned pedestrians, say scientists: Researchers call for tighter regulations following major age- and race-based discrepancies in AI autonomous systems.

  • @[email protected]
    link
    fedilink
    English
    191 year ago

    This has been the case with pretty much every single piece of computer-vision software to ever exist…

    Darker individuals blend into dark backgrounds better than lighter-skinned individuals. Dark backgrounds are more common than light ones, i.e., the absence of sufficient light is more common than 24/7 well-lit environments.

    Obviously computer vision will struggle more with darker individuals.
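
    A minimal sketch of that contrast point (pure NumPy, invented brightness values, a crude rectangular “person”; none of this is from the article): the edge response a detector sees at the subject/background boundary shrinks as the brightness gap shrinks.

    ```python
    # Toy illustration only: edge strength at a subject/background boundary
    # drops as the brightness gap between them drops.
    import numpy as np

    def edge_strength(foreground: float, background: float) -> float:
        """Mean horizontal gradient magnitude along the subject's left/right edges."""
        img = np.full((64, 64), background, dtype=float)
        img[:, 24:40] = foreground                                # crude "person" strip
        img += np.random.default_rng(0).normal(0, 5, img.shape)   # sensor noise
        gx = np.abs(np.diff(img, axis=1))                         # simple horizontal gradient
        return gx[:, [23, 39]].mean()                             # response at the two boundaries

    # High-contrast daytime scene vs. low-contrast night scene (values are made up)
    print(edge_strength(foreground=180, background=60))   # strong edges, easy to pick out
    print(edge_strength(foreground=40, background=25))    # weak edges, easy to miss
    ```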

    • @[email protected]
      link
      fedilink
      English
      1
      edit-2
      1 year ago

      If the computer vision model can’t detect edges around a human-shaped object, that’s usually a dataset issue or a sensor (data collection) issue… And it sure as hell isn’t a sensor issue because humans do the task just fine.
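
      A toy sketch of the dataset-issue argument, with entirely synthetic 2-D “features” and an off-the-shelf scikit-learn classifier (nothing here comes from the study): when one group is under-represented in training, the learned boundary tends to fit the majority group and misses more of the minority group.

      ```python
      # Synthetic illustration of training-set imbalance, not real pedestrian data.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)

      def pedestrians(n, center):
          # 2-D stand-in features; `center` crudely models group-dependent appearance
          return rng.normal(center, 1.0, size=(n, 2))

      background = rng.normal([0.0, 0.0], 1.0, size=(2000, 2))
      train_a = pedestrians(1000, [2.5, 2.5])   # well-represented group
      train_b = pedestrians(60, [2.5, -2.5])    # under-represented group

      X = np.vstack([background, train_a, train_b])
      y = np.array([0] * 2000 + [1] * 1060)     # 0 = background, 1 = pedestrian
      clf = LogisticRegression().fit(X, y)

      # Recall (fraction of pedestrians actually flagged) on fresh samples per group;
      # the under-represented group tends to come out noticeably lower.
      for name, center in [("group A", [2.5, 2.5]), ("group B", [2.5, -2.5])]:
          print(name, "recall:", clf.predict(pedestrians(5000, center)).mean())
      ```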

      • @[email protected]
        link
        fedilink
        English
        51 year ago

        > And it sure as hell isn’t a sensor issue because humans do the task just fine.

        Sounds like you have never reviewed dash camera video or low light photography.

      • @[email protected]
        link
        fedilink
        English
        61 year ago

        Do they? People driving at night quite often have a hard time seeing pedestrians wearing dark colors.

    • @[email protected]
      link
      fedilink
      English
      171 year ago
      1. No, it’s because they train the AI on pictures of white adults.

      2. It literally wouldn’t matter for lidar, but Tesla uses visual cameras to save money, and that drags down everyone else’s metrics (sketch below).

      Lumping lidar cars in with Tesla makes no sense.
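
      On the lidar point, a back-of-the-envelope sketch (not any vendor’s pipeline): lidar times its own laser pulse, so the measured range to an object doesn’t depend on ambient light or how dark the subject is; reflectivity mostly affects return intensity, not the timing.

      ```python
      # Time-of-flight range: distance = c * round_trip_time / 2.
      C = 299_792_458.0  # speed of light, m/s

      def lidar_range(round_trip_seconds: float) -> float:
          return C * round_trip_seconds / 2

      # A return arriving ~66.7 ns after emission is a target roughly 10 m away,
      # whether it's a dark jacket at night or a white wall at noon; only the
      # strength of the return changes, not when it arrives.
      print(lidar_range(66.7e-9))  # ≈ 10 m
      ```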