Really important! This is exactly how to evaluate a company's real values, if not its moral position toward the technologies it develops.
By Kate Conger and Cade Metz, NYTimes
Oct. 7, 2018
SAN FRANCISCO — Jack Poulson, a Google research scientist, recently became alarmed by reports that the company was developing a search engine for China that would censor content on behalf of the government.
While Dr. Poulson worked on search technologies, he had no knowledge of the product, which was code-named Dragonfly. So in a meeting last month with Jeff Dean, the company’s head of artificial intelligence, Dr. Poulson asked if Google planned to move ahead with the product and if his work would contribute to censorship and surveillance in China.
According to Dr. Poulson, Mr. Dean said that Google complied with surveillance requests from the federal government and asked rhetorically if the company should leave the United States market in protest. Mr. Dean also shared a draft of a company email that read, “We won’t and shouldn’t provide 100 percent transparency to every Googler, to respect our commitments to customer confidentiality and giving our product teams the freedom to innovate.”
The next day, Dr. Poulson quit the company. Mr. Dean did not respond to a request for comment, and Google declined to comment.
Across the technology industry, rank-and-file employees are demanding greater insight into how their companies are deploying the technology that they built. At Google, Amazon, Microsoft and Salesforce, as well as at tech start-ups, engineers and technologists are increasingly asking whether the products they are working on are being used for surveillance in places like China or for military projects in the United States or elsewhere.
That’s a change from the past, when Silicon Valley workers typically developed products with little questioning about the social costs. It is also a sign of how some tech companies, which grew by serving consumers and businesses, are expanding more into government work. And the shift coincides with concerns in Silicon Valley about the Trump administration’s policies and the larger role of technology in government.
“You can think you’re building technology for one purpose, and then you find out it’s really twisted,” said Laura Nolan, 38, a senior software engineer who resigned from Google in June over the company’s involvement in Project Maven, an effort to build artificial intelligence for the Department of Defense that could be used to target drone strikes.
All of this has led to growing tensions between tech employees and managers. In recent months, workers at Google, Microsoft and Amazon have signed petitions and protested to executives over how some of the technology they helped create is being used. At smaller companies, engineers have begun asking more questions about ethics.
And the change is likely to last: Some engineering students say they are asking similar questions and demanding answers even before they enter the work force.
“What people are looking for — not just employees — they are looking for some clarity,” said Frank Shaw, a Microsoft spokesman. “Are there principles that get applied? Even if you don’t agree with the decision that gets made, if you understand the thinking behind it, it helps a lot.”
Amazon did not respond to a request for comment.
How little tech employees sometimes know about what they are working on was recently evident at Clarifai, an artificial intelligence start-up in New York City.
Last year, a small team of Clarifai engineers began working on a project inside a private room at its downtown New York office, said three people with knowledge of the matter, who spoke on the condition that they not be identified for fear of retaliation. Paper covered the windows, and employees called the room “The Chamber of Secrets,” in a sly reference to the second Harry Potter novel. Even the eight engineers and researchers working inside the room did not entirely realize the nature of the project, the people said.
When employees asked about the project in meetings, Clarifai’s chief executive, Matt Zeiler, said it was a government project related to “analytics” or “surveillance” and would “save lives,” according to the people.
After employees read documents posted to Clarifai’s internal systems, it became clear that the company had won a contract for Project Maven and that workers were creating something for the Defense Department, the people said. One engineer quit the project immediately after a meeting with the Defense Department where killing was discussed in frank terms, they said.
A Clarifai spokesman said that at the very beginning of the project, the company sat down with those chosen for it to brief them on the nature of the work, and one employee quit the project then. “Every member of Clarifai’s Project Maven team agreed to work on the project, and the two people who chose not to participate were assigned to different efforts across the company,” the spokesman said.
Dr. Poulson, whose work involved incorporating a variety of languages into Google search, said he did not initially think his research could be involved in Dragonfly — until he noticed Chinese had been added to a list of languages for his project.
“Most people don’t know the holistic scope of what they’re building,” said Dr. Poulson, 32, who worked at Google for over two years. “You don’t have knowledge of where it’s going unless you’re sufficiently senior.”
The difficulty of knowing what companies are doing with their technologies is compounded because engineers at large tech companies often build infrastructure — like algorithms, databases and even hardware — that underpins almost every product a company offers. At Google, for example, a storage system called Colossus is used by Google search, Google Maps and Gmail.
“It would be very difficult for most engineers in Google to be sure that their work wouldn’t contribute to these projects in some way,” said Ms. Nolan, who helped keep Google’s systems running smoothly. “My personal feeling was that if the organization is doing something I find ethically unacceptable, then I was contributing to it.”
Yet executives at tech companies have claimed that complete transparency is not possible.
“We’ve always had confidential projects as a company. I think what happened when the company was smaller, you had a higher chance of knowing about it,” Sundar Pichai, Google’s chief executive, said at a staff meeting in August, according to a transcript provided to The New York Times. “I think there are a lot of times when people are in exploratory stages where teams are debating and doing things, so sometimes being fully transparent at that stage can cause issues.”
Such policies have rippled beyond tech companies. In June, more than 100 students at Stanford, M.I.T. and other top colleges signed a pledge saying they would turn down job interviews with Google unless the company dropped its Project Maven contract. (Google said that month that it would not renew the contract once it expired.)
“We are students opposed to the weaponization of technology by companies like Google and Microsoft,” the pledge stated. “Our dream is to be a positive force in the world. We refuse to be complicit in this gross misuse of power.”
Alex Ahmed, a doctoral candidate in computer science at Northeastern University in Boston, said she organized a student discussion on campus this month to debate whether they should work for tech companies that made decisions they believed to be unethical.
“We’re not given an ethics course. We’re not given a political education,” Ms. Ahmed, 29, said. “It’s impossible for us to do this unless we create the conversations for ourselves.”
Over the summer, she said, students at Northeastern also protested the school’s multimillion-dollar research contract with Immigration and Customs Enforcement, under which it would provide research on technology exports to the agency. A Northeastern spokeswoman did not respond to a request for comment.
Bridget Frey, the chief technology officer at the online real estate company Redfin, said job candidates had increasingly raised ethical questions in interviews. This summer, interns questioned Redfin’s chief executive, Glenn Kelman, about whether the way the site displays school information and test scores could contribute to socio-economic divides in neighborhoods. In response, the company said it planned to add more context about the test score information early next year.

Employees are now frequently asking, “If you don’t share the information with me, how can I make sure this isn’t happening here?” Ms. Frey said.