“For the last 15 or 20 years, anything Silicon Valley companies did was seemingly in the public interest, and society has encouraged that view until quite recently,” Mozilla Corp’s Chairwoman Mitchell Baker noted last week at the RSA Conference in San Francisco. “But those fantasy days are over.”
Panel at RSA Conference 2019
The numerous public outcries against biased algorithms and social media's negative influences, along with instances of tech workers protesting their companies' ethical choices, are just some of the indicators that she is right: the public has finally recognized that technology can lead to disastrous societal consequences and unwelcome changes.
What can tech companies do about it?
Baker, along with Electronic Frontier Foundation Director Cindy Cohn and Shannon Vallor, AI ethicist and visiting researcher at Google, offered their suggestions during a panel at RSA Conference’s first Public Interest Technologist Track.
Vallor, who is part of Google's newly formed responsible innovation team, wants tech companies to develop a more mature form of humility and a willingness to engage the public from that place of humility.
On occasion, they admit to mistakes and promise they will do better. But Vallor says she would be much more reassured if the companies admitted that it’s hard to get everything right, that there aren’t obvious fixes, and asked public-interest technologists and ethicists for help.
Also, she would like them to engage in practices used widely in the cybersecurity sector: red teaming, adversarial thinking exercises and especially post-mortems.
“It takes a special kind of humility to [do a post-mortem] when [the failure] is perceived as an ethical or social mistake. People fear digging into those kinds of failures and I think we, as an industry, have to develop that kind of humility but also that strength to confront those failures and learn from them,” she noted.
Baker, for her part, also believes that it is practically impossible to make an ethical decision at the beginning of a tech project or to figure out all the consequences it may precipitate. But now that it has become crystal clear that all new technologies can have unintended effects on humanity (or parts of it), the tech industry must accept that it will make mistakes, that some of them will be social mistakes, and that part of serving the public interest is correcting them.
That needs to happen inside existing organizations, she notes, but she would also like to see more organizations that are largely focused on creating public-interest tech.
“One thing that I learned at Mozilla is that people are interested in working in an organization that has a different approach to things,” she says.
“But Mozilla is not enough. Wouldn’t it be great if there were 3 or 5 or 20 different organizations that are actually effective in the market and dedicated to building public-interest technology?”
She mentioned Code for America as an example of an organization that is on that path, but pointed out that there are still not enough similar initiatives.
What can technologists do to help?
There is also a dearth of technologists who are willing, capable, and available to fight for technology that serves the public interest.
Cohn unabashedly used the panel to recruit technologists who could help the EFF.
“I spend most of my time convincing technologists to show up in court cases and policy fights, and we need people in both of those things,” she says. “If that’s the kind of thing you want to do – help make some legal or policy decision better – the EFF needs your help.”
She pointed out that technologists don’t need deep technical knowledge to be able to help: just credentials, thoughtfulness and a willingness to do it. Or they can pick a particular technology and learn all about it and how it’s being used, so that when the EFF wants to talk about it in the halls of Congress and elsewhere, it can put them forward to share good information.
The EFF is currently in need of technologists who know enough about things like license plate readers, smart meters, airport scanners, digital forensics tools and government tracking devices (to name just a few).
From a technologist’s perspective, learning about these technologies does not constitute heavy lifting, she notes, and those who do can be tremendously important resources for non-technologists engaged in policy and legal debates.
Awareness raising can also serve public interest and, she promises, talking about and explaining technology is a skill that anyone can learn. But for those that are not interested in “talking,” there’s always the option of “doing”: one can work on developing open source tools that will serve the interests of the wider public.
Ideally, though, aspiring helpers and organizations working towards promoting public interest in tech should not limit themselves to one area of activity (e.g., courts, norms, laws, building technologies).
“The people who are trying to build the panopticon and the surveillance state, they don’t just try to make changes in one area. For those of us who want to build a world that we actually want to live in, it’s a fool’s game to try to pick which one of these we should do,” she pointed out. “The answer is not either/or but both and more, my friends. And there’s room in all areas to be someone whose voice is heard and who helps make things better.”
Cohn also noted that she would like to see Silicon Valley and tech companies adopt a pro bono mentality and have their staff work a certain number of hours on public-interest technology, whether on the advocacy or the building side.
Positive changes for everybody
For a long time Mozilla concentrated just on building things and didn’t get involved in policy making, Baker noted, but things have changed. For them, expanding in the policy and advocacy space has been a breath of fresh air.
The move has changed the tenor of the organization (even though only a minority of engineers got engaged in the efforts) and they realized that addressing these broader questions can be positive for overall productivity, that interdisciplinary thinking is a boon for creativity and that different perspectives sometimes can provide wholly new answers.
Vallor says that tech companies can gain much by embracing people like social scientists and ethicists and making them part of the conversation. They can broaden and deepen the concept of public interest beyond the all-too-easy tendency to make statements like “We developed this technology for the benefit of humanity” or “We all benefit from innovation,” which oversimplify what the public interest is.
When the people creating and talking about particular products and features really begin to think about the public interest in a deeper and more complex way, better decisions get made and things are seen that might otherwise be missed.
“One of the things we have yet to normalize in this industry is the recognition that those sorts of insights are part of being a good technologist and being a good technology company,” she says.
Even at big companies, which employ a considerable number of ethicists and social scientists, there is a tendency for these individuals to be doing research on their own rather than working day-to-day with the teams that are actually building things.
The latter needs to become a normal thing, she notes. That is when institutional learning happens, the company culture changes, and it is no longer so difficult to take a concept like fairness and help someone who sees it as a purely mathematical concept recognize it as a broader social and political notion that requires a more complex approach. You don’t have to have that conversation over and over again once it becomes a standard practice of good design and building, she points out.
But even before that, we need changes in the educational system, she believes. We need to stop telling students that they must choose between being interested in tech and being interested in the “soft” sciences (psychology, sociology, political history).
“We have to make people understand that you can’t be a good technologist if you don’t have an understanding of the context that technology lives in, and that it’s very difficult to have humane expertise in the 21st century and not understand how technology works, and not understand how science and technology shape the very foundations of our society and how it functions today,” she concluded.