Microsoft Seeks to Restrict Abuse of its Facial Recognition AI

(Bloomberg) -- Microsoft Corp. is planning to implement self-designed ethical principles for its facial recognition technology by the end of March, as it urges governments to push ahead with matching regulation in the field.

The company in December called for new legislation to govern artificial intelligence software for recognizing faces, advocating for human review and oversight of the technology in some critical cases, as a way to mitigate the risks of biased outcomes, intrusions into privacy and democratic freedoms.

“We do need to lead by example and we’re working to do that,” Microsoft President and Chief Legal Officer Brad Smith said in an interview, adding that some other companies are also putting similar principles into place.

Smith said the company plans by the end of March to “operationalize” its principles, which involves drafting policies, building governance systems and engineering tools and testing to make sure it’s in line with its goals. It also involves setting controls for the company’s global sales and consulting teams to prevent selling the technology in cases where it risks being used for an unwanted purpose.

The use of facial recognition software by law enforcement, border security, the military and other government agencies has stirred concerns about the risks of bias and mass surveillance. Research has shown that some of the most popular products make mistakes and perform worse on people with darker skin. Microsoft, Amazon.com Inc. and Alphabet Inc.’s Google have all faced protests from employees and advocacy groups over the idea of selling AI software to government agencies or the police.

“It would certainly restrict certain scenarios or uses,” Smith said of the principles, adding that Microsoft wouldn’t necessarily reject providing governments with the technology. The company only wants to prevent law enforcement from using the technology for ongoing surveillance of a specific individual without proper safeguards, he said.

The company has turned down contracts for that reason, he said. One was a case that Smith said would have amounted to public surveillance in a national capital “in a country where we were not comfortable that human rights would be protected.” Another was deployment by a law enforcement agency in the U.S. that “we thought would create an undue risk of discrimination.”

Asked whether Microsoft would rule out working with Chinese law enforcement, especially in light of new rules to judge citizens on their social behavior, Smith said “it would definitely raise important questions in China.” He said that in any case it appears that Beijing is more interested in procuring facial-recognition technology from local firms instead of American ones.

Even as it steams ahead with the self-imposed rules, the company said industrywide regulation is necessary.

“You never want to create a market that forces companies to choose between being successful and being responsible and unless we have a regulatory floor there is a danger of that happening,” Smith said.

To contact the reporter on this story: Natalia Drozdiak in Brussels at ndrozdiak1@bloomberg.net

To contact the editors responsible for this story: Giles Turner at gturner35@bloomberg.net, Andrew Pollack, Molly Schuetz

For more articles like this, please visit us at bloomberg.com

©2019 Bloomberg L.P.