A new book explains how AI assistants can reinforce racial and gender bias

Good morning, Broadsheet readers! More than three-fourths of women aged 40 to 64 report no menopause accommodations at work, the Republican Party’s new policy platform omits a nationwide abortion ban, and the new book Mastering AI explains how the technology can reinforce racial and gender bias. Have a wonderful Wednesday.

- All about AI. If you are eager to understand the technology driving business today—AI—look no further than my colleague Jeremy Kahn's new book. Published this week, Mastering AI: A Survival Guide to Our Superpowered Future is both a resource for those seeking to better understand the transformational technology and a guide to where AI is headed over the next decade-plus.

In a new excerpt from the book, published by Fortune, Jeremy dives into one of AI's most pernicious challenges: bias. "By influencing how we think about what we do, buy, and say...technology is chipping away at our ability to freely make our own decisions," he writes. "Personalized AI assistants will make these problems worse, wrapping us in the ultimate filter bubble, controlling the innumerable decisions that make up our lives."

"Mastering AI" by Jeremy Kahn.
"Mastering AI" by Jeremy Kahn.

From political echo chambers to conspiracy theories, the consequences are vast. Of course, they also extend to racial and gender bias. As Jeremy explains: "Using an AI assistant with a particular hidden viewpoint to help write an essay for or against a particular position subtly shifted the user’s own views on that topic in the direction of the bias...Trained from vast amounts of historical data, many LLMs harbor hidden racial or gender biases that could subtly shape their users’ opinions—for instance, an AI assistant for doctors that falsely believes that Black people have thicker skin or a higher pain threshold than white people."


The solution is to "mandate that tech companies reveal far more about how their AI models have been trained and allow independent auditing and testing," he argues. For more, read the full excerpt on Fortune’s site and order Mastering AI here.

Emma Hinchliffe
emma.hinchliffe@fortune.com

The Broadsheet is Fortune's newsletter for and about the world's most powerful women. Today's edition was curated by Joseph Abrams. Subscribe here.

This story was originally featured on Fortune.com