Microsoft Calls for AI Face Recognition Software Regulation

Microsoft Corp., which has come under fire for a U.S. government contract that was said to involve facial recognition software, said it will more carefully consider contracts in this area and urged lawmakers to regulate the use of such artificial intelligence to prevent abuse. The company, one of the key makers of software capable of recognizing individual faces, said it will take steps to make those systems less prone to bias; develop new public principles to govern the technology; and move more deliberately to sell its software and expertise in the area. While Microsoft noted that the tech industry bears responsibility for its products, the company argued that government action is also needed.

"The only effective way to manage the use of technology by a government is for the government proactively to manage this use itself," Microsoft President and Chief Legal Officer Brad Smith said Friday in a blog post. "And if there are concerns about how a technology will be deployed more broadly across society, the only way to regulate this broad use is for the government to do so. This in fact is what we believe is needed today: a government initiative to regulate the proper use of facial recognition technology, informed first by a bipartisan and expert commission."

Companies like Microsoft and Alphabet Inc.'s Google have been under fire from civil liberties groups and their own employees for selling AI software, particularly for facial recognition, to the U.S. government and local police. It's somewhat unusual for tech companies to call for their own products to be more heavily regulated, but Smith and Microsoft AI chief Harry Shum earlier this year authored a treatise saying AI advances would require new laws.

"While we appreciate that some people today are calling for tech companies to make these decisions, and we recognize a clear need for our own exercise of responsibility, we believe this is an inadequate substitute for decision making by the public and its representatives in a democratic republic," Smith wrote. Facial recognition systems often have serious shortcomings and are particularly poor at recognizing and differentiating among people with darker skin. A February paper from one of Microsoft's researchers, Timnit Gebru, and Joy Buolamwini of the Massachusetts Institute of Technology Media Lab showed error rates of as much as 35 percent for systems classifying darker-skinned women.

Microsoft's research arm has since worked to correct the issues the researchers identified in its facial-recognition software. As technology evolves rapidly, Microsoft has been advocating for new laws that directly consider emerging areas like cloud computing and AI. The company fought the U.S. government over privacy concerns for what Microsoft said was the government's application of an outdated communications law to cloud computing, then backed compromise legislation that directly addressed data privacy and searches in the era of the cloud. The Microsoft view strikes a somewhat different tone than the response of Amazon to calls for it to stop selling its facial recognition software to police departments and other government agencies.

In a June blog post replying to concerns raised then, Amazon's AI general manager, Matt Wood, wrote: "There have always been and will always be risks with new technology capabilities. Each organization choosing to employ technology must act responsibly or risk legal penalties and public condemnation. AWS takes its responsibilities seriously. But we believe it is the wrong approach to impose a ban on promising new technologies because they might be used by bad actors for nefarious purposes in the future."

Among the issues that government regulation should consider, Smith wrote, is whether police use of facial recognition software needs human oversight and controls, and whether retailers must post visible notice if they use such software in public areas. Other areas for consideration include whether companies should be required to obtain consent before collecting a person's images and what legal rights apply to people who believe they've been misidentified by a facial recognition system.

The technology's use by law enforcement without rules in place raises concerns about racial disparity and privacy, said Barry Friedman, an NYU law professor who runs The Policing Project, which works to set rules and guidelines for law enforcement. The Supreme Court ruled police can't have long-term access to cell phone data without a warrant or probable cause, and similar restrictions should be applied to their use of facial recognition technology, he said. "This technology is just rushing on us really fast," Friedman said. "If we don't get ahead of it, we won't be able to put the toothpaste back in the tube."

Microsoft last month briefly removed references on its web site to a contract it had secured with U.S. Immigration and Customs Enforcement after online complaints about the software maker selling to an agency involved in the controversial immigration practice of separating parents and children crossing the U.S.-Mexican border.

The blog post was later restored. While Microsoft later said its contract with the immigration enforcement agency was for putting older email and worker collaboration systems in the cloud, the blog post mentioned the possibility of the agency using the company's facial recognition tools. Hundreds of Microsoft employees signed a petition demanding Microsoft stop selling to the agency.

Amazon has come under fire for selling its Rekognition facial recognition software to local police departments, with the American Civil Liberties Union demanding the company stop letting governments use the technology. At Google, employees revolted over the company's work on Project Maven, a Defense Department initiative to apply AI tools to drone footage. Last month, Chief Executive Officer Sundar Pichai released a set of principles pledging not to use the company's powerful artificial intelligence for weapons, illegal surveillance or technologies that cause overall harm.

The company said it will still work with the military. Neema Singh Guliani, legislative counsel for the ACLU, agreed with Microsoft that lawmakers need to get involved in analyzing the use of facial-recognition software. "Congress should take immediate action to enact a moratorium on law enforcement use of this technology until its grave threats to communities can be fully debated," she said. "However, in the meantime, companies like Microsoft, Amazon and others should be heeding the calls from the public, employees, and shareholders to stop selling face-surveillance technology to law enforcement."

Microsoft has created an AI ethics board to examine these kinds of issues and has said that it turned down some contracts to sell some software to certain customers, but it has declined to provide details. The new AI board has urged a slower, more deliberate approach to selling the software, Smith said.

With assistance by Spencer Soper
