The Wrong Culprits

Hannah Fry, an associate professor in the mathematics of cities at University College London, said an equivalent of the doctor’s oath was crucial given that mathematicians and computer engineers were building the tech that would shape society’s future.

Maths and tech specialists need Hippocratic oath, says academic — The Guardian

Ms. Fry seems to have left one bubble only to get stuck in another. It's not just the mathematicians and software developers who should be thinking about the ethical and long-term concerns of any given technology, but also the people who lead the organizations that build these digital tools. How often do we hear someone say something along the lines of "I don't like this, but it's my job and I have bills to pay, so I'll do it anyway"? It's all well and good to lay the blame for the adverse effects of social networks, facial recognition, and machine learning at the feet of the early pioneers of those fields, but it's a little too convenient as well.

Mark Zuckerberg didn't write every line of code that powers all of Facebook's tools. Larry Page and Sergey Brin didn't write every line of code that powers all of Google's tools. The same can be said for Jeff Bezos, Bill Gates, and every other person who has led a group of people that created something the world once considered impossible outside the realm of fiction. It is not just the people who make the tools who should consider the ramifications of their labour.

Looking at this a little more broadly, everybody should pledge an oath equivalent to the Hippocratic oath for every job they do. We rely on so many people from so many industries to do things we're either unwilling or unable to do. We must trust that the labours of others will not harm us or the people we care about. Should meat processors pledge not to mix stale bread or rodents into their beef in an effort to reduce costs while conning the consumer out of money?1 Yes. Should a person who develops medicines do their darnedest to ensure that the pills they make are not addictive? Yes. Should a taxi driver strive to take the most direct route between points A and B to save the passenger a little bit of money?2 Yes.

And so should the managers of these professionals. And so should the middle-managers of the organization. And so should the C-level executives. Every person has a responsibility to "do no evil", including the people using the products and services (willingly or otherwise).

I'm not at all happy with the fact that every time I go to a JR train station in Japan my face is recorded and that data is instantly sent to a Fujitsu-run data centre in Tokyo, where it's processed, analyzed, and stored for who knows how long. There is no GDPR in Japan, and there is no way to even know how much data Fujitsu has on me. I wrote to JR about it, and I wrote to the national politician who represents this area. Neither even took the time to respond, because there aren't enough people raising their voices over this issue.

I despise the fact that my non-smart TV and even stupider DVR want to send viewing habits back to Sharp and Panasonic, respectively. It's not their bloody business what TV shows the family is watching and when. I've blocked these devices from accessing the Internet while maintaining a connection to the media server, but people shouldn't have to do this. I've written to both companies. Sharp responded with a generic "please read our revised privacy policy on our website"3, and Panasonic, having already taken my money, didn't care enough to even receive the message; their web contact form returned an error.

Every couple of weeks my microwave wants to be paired to an Android phone, despite the "cookbook sync" feature having been disabled for more than five years. My Canon printer recently asked if I wanted to order more ink because the PGBK4 cartridge was low. A couple of months back, when I walked into a mall in Nagoya, my work-supplied phone received an SMS offering a 50-yen coupon for a restaurant in the food court5. Were all of these the fault of developers or mathematicians? No. They were the fault of management.

“We need a Hippocratic oath in the same way it exists for medicine,” Fry said. “In medicine, you learn about ethics from day one. In mathematics, it’s a bolt-on at best. It has to be there from day one and at the forefront of your mind in every step you take.”

People should be learning about ethics from day one on Earth, regardless of what career choices they make in life. Placing the blame for these technologies at the feet of the people charged with creating and implementing them is just a lazy cop-out. A lot of the technologies and tools we create aren't inherently bad until an ill-defined line is crossed. It's the job of management to ask "Is this too much?" before directing their people to make it happen.


  1. This was a real problem in Japan with a company a few years back, until a whistleblower had enough and ratted them out to the press … after three years of keeping quiet.

  2. This was another problem that was rampant in Japan for a little while.

  3. My eyes rolled twice when I saw that it was specifically referred to as a revised privacy policy.

  4. Pigment Black. Why do printers have two black cartridges, and why can't they use them more intelligently?

  5. This was freaky and very unwelcome. I've since learned that the mall has "ended their limited trial" and won't be sending SMS messages to phone numbers tied to faces anymore.