Google Decides Not to Renew a Military AI Contract

Lovesick Cyborg
By Jeremy Hsu
Published Jun 7, 2018 8:26 AM · Updated Nov 19, 2019 8:25 PM

The U.S. military logo for Project Maven, also known as the Algorithmic Warfare Cross-Functional Team. Credit: U.S. Department of Defense

Google recently bowed to employee protests by deciding to wind down involvement in a U.S. military initiative called Project Maven next year. The Pentagon project focuses on harnessing deep learning algorithms--specialized machine learning technologies often described as "artificial intelligence"--to automatically detect and identify people or objects in military drone surveillance videos.

Company emails and internal documents obtained by the New York Times show Google's attempts to keep its role in the U.S. Department of Defense project under wraps. By early April, more than 3,000 Google employees had already signed an internal letter voicing concerns that Google's involvement with "military surveillance" could "irreparably damage Google's brand and its ability to compete for talent." On June 1, Gizmodo reported that Google's leadership had told employees that the company would not seek renewal of the Project Maven contract after its 2019 expiration.

"I'm glad to see that the Google leadership is listening to the Google employees, who, like me, think it would be a serious mistake for Google to do military contracts," said Yoshua Bengio, a professor of computer science at the University of Montreal in Canada and a pioneer in deep learning research.

Project Maven, also known as the Algorithmic Warfare Cross-Functional Team, seems initially focused on training computer algorithms to automatically spot and classify objects in videos. Such automated surveillance technologies already exist to some degree and could spare the Pentagon's human analysts from spending countless hours eyeballing surveillance footage taken by large military drones in countries such as Syria, Iraq and Afghanistan.

Similar automated surveillance technologies can be used for beneficial purposes beyond military AI on battlefields. For example, Carnegie Mellon University researchers have developed machine learning software that can automatically detect both wildlife and human poachers in drones' thermal camera imagery taken at night.

Nonetheless, the idea of developing automated surveillance technologies for use by the U.S. military touched a nerve among Google employees. The letter signed by several thousand Google employees argues that such surveillance capabilities could easily be used to assist in drone strikes and other missions with "potentially lethal outcomes."

Stuart Russell, a professor of computer science and AI researcher at the University of California, Berkeley, said that he does not personally oppose all uses of AI for military purposes. For example, he suggested that military AI used in reconnaissance, logistical planning and anti-missile defense could fall under ethical uses of such technology.

Many AI researchers, including Bengio and Russell, have publicly opposed development of technologies for lethal autonomous weapons that could actively identify and engage targets without requiring direct orders from humans. So far, Project Maven's goals are not directly tied to development of such autonomous weapons, which are colloquially referred to as "killer robots" by many who oppose them. Researchers recently organized a boycott campaign that led a South Korean university to agree not to develop autonomous weapons as part of a prior partnership with a defense company.
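The kind of frame-by-frame object detection described above can already be approximated with off-the-shelf, open-source tools. The sketch below is purely illustrative and is not based on Project Maven's actual system: it assumes a hypothetical video file and uses a general-purpose pretrained detector from the torchvision library to flag objects in each frame of a clip.

```python
# Illustrative sketch only: generic frame-by-frame object detection with a
# pretrained, general-purpose model. Not Project Maven's system; the video
# filename below is a hypothetical placeholder.
import cv2                      # OpenCV, for reading video frames
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor

# Load a detector pretrained on the COCO dataset (common everyday objects).
model = fasterrcnn_resnet50_fpn(pretrained=True).eval()

cap = cv2.VideoCapture("drone_footage.mp4")  # hypothetical input clip
frame_index = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break  # end of video
    # OpenCV yields BGR frames; the model expects RGB tensors scaled to [0, 1].
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        detections = model([to_tensor(rgb)])[0]
    # Report confident detections: frame number, class id, score, bounding box.
    for box, label, score in zip(detections["boxes"],
                                 detections["labels"],
                                 detections["scores"]):
        if score >= 0.8:
            print(frame_index, int(label), float(score), box.tolist())
    frame_index += 1
cap.release()
```

A general-purpose detector like this only labels familiar object categories one frame at a time; anything resembling the analyst-assisting systems described above would additionally require domain-specific training data, careful evaluation and human review of its output.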
But the dual-use nature of AI technologies that could be repurposed for lethal weapons or missions makes it trickier to regulate their use. The same military AI technology that enables automated reconnaissance could also empower an autonomous weapon if combined with a vision-guided missile, Russell pointed out. Still, he suggested that an international ban on the use of autonomous weapons might have helped nip the issue in the bud for Google and its employees.

"If there were a treaty banning autonomous weapons, then Google researchers could work on defense-related AI without worrying that the AI would be used to kill people," Russell said. "The threat of misuse goes away."

The recent decision on Project Maven does not mean Google will necessarily withhold all its engineering expertise and technologies from the Pentagon in the future. After all, Eric Schmidt, former executive chairman of Google and current technical advisor to Google parent company Alphabet, remains a member of the Defense Innovation Board that serves as an advisory organization for the U.S. military.

Sundar Pichai, CEO of Google, clarified the company's views by publishing a set of guiding principles on possible uses of AI on June 7. Besides ruling out AI applications for weapons, Google announced it would avoid pursuing "technologies that cause or are likely to cause overall harm," surveillance technologies that violate internationally accepted norms, and technologies whose main purpose goes against "widely accepted principles of international law and human rights."

But Google also left the door open for future work with governments and the military that would "keep service members and civilians safe."

"We want to be clear that while we are not developing AI for use in weapons, we will continue our work with governments and the military in many other areas," Pichai said in the blog post. "These include cybersecurity, training, military recruitment, veterans' healthcare, and search and rescue."

Editor's Note: This blog post was updated to reflect Google's June 7 publication of its principles on AI use.
