an AI resume screener had been trained on CVs of employees already at the firm, giving people extra marks if they listed “baseball” or “basketball” – hobbies that were linked to more successful staff, often men. Those who mentioned “softball” – typically women – were downgraded.

Marginalised groups often “fall through the cracks, because they have different hobbies, they went to different schools”

  • Nollij@sopuli.xyz · 1 year ago

    I suppose it depends on how you define “by mistake.” Your example is an odd bit of narrowing of the dataset, which I would certainly describe as an unintended error in the design. But the original is more pertinent: it wasn’t intended to be sexist (etc.), but since it was designed to mimic us, it also copied our bad decisions.

    • Vanth@reddthat.com · 1 year ago

      Oh man, so many good examples of this.

      See the photo recognition software and smartwatch sensors that don’t work as well for black people because no one thought to make sure black people were adequately represented in the test data.

      Or the decades of medical research based only on male mice, because female mice have different hormone levels that introduce more variability (just like female humans) and that’s like, just too much work to deal with, and it’s easier to assume the female body responds to medications the same way the male body does.