Pichai explained that Google would not use its A.I. for "weapons or other technologies whose principal objective or implementation is to cause or directly facilitate injury to people" nor to support "technologies that gather or use information for surveillance violating internationally accepted norms of human rights".
"While we are not developing AI for use in weapons, we will continue our work with governments and the military in many other areas", Pichai wrote.
"We will not be pursuing follow-on contracts for the Maven project, and because of that, we are now working with our customer to responsibly fulfill our obligations in a way that works long-term for them and is also consistent with our AI principles", she added, confirming Gizmodo's reporting last week that Google would not seek to renew its Maven contract after it expires in 2019. Google lists seven core values for its AI research and names several applications that are off-limits.
The document, which also enshrines "relevant explanations" of how AI systems work, lays the groundwork for the rollout of Duplex, a human-sounding digital concierge that was shown off booking appointments with human receptionists at a Google developers conference in May. Thousands of Google employees had signed a petition against the Maven contract, and some quit in protest.
However, the search company will take on government contracts that it believes won't be used to hurt people (or at least will be beneficial enough to justify the harm).
The principles might bring to mind sci-fi legend Isaac Asimov's "Three Laws of Robotics", which boil down to the idea that robots shouldn't harm humans and should protect them. The document also pledges: "And we will continue to thoughtfully evaluate when to make our technologies available on a non-commercial basis".
Google has said Duplex will identify itself as automated, so people wouldn't be misled into thinking they were speaking with a human.
Google said Thursday that it would not let its artificial intelligence (A.I.) tools be used for deadly weapons or surveillance.
AI made by Google, Pichai states, should uphold standards of scientific excellence, be accountable to people, be tested for safety, and avoid reinforcing bias. Google is widely seen as a potential contender for a massive contract to move Defense Department systems to cloud servers. "Google is already battling with privacy issues when it comes to AI and data; I don't know what would happen if the media starts picking up a theme that Google is secretly building AI weapons or AI technologies to enable weapons for the Defense industry".
"Ultimately, how the company enacts these principles is what will matter more than statements such as this", Asaro said. "In the absence of positive actions, such as publicly supporting a global ban on autonomous weapons, Google will have to offer more public transparency as to the systems they build".