In late November of last year, Microsoft won a contract with the US military that will result in combat units using more of the company's offerings on the battlefield, including the HoloLens augmented reality headset. A number of employees at the company have expressed concern, saying in an open letter that they do not want to develop weapons, which is a completely reasonable position to take. Satya Nadella, Microsoft's CEO, does not see the contract as an issue and believes it to be a "principled decision", which is also a reasonable position to take. As someone who has been asked in the past to write software that went against my morals[1], the question I would face in this position is more complex than it sounds: where is the line?
At what point does a person draw the line and say "this far and no farther"? At the moment, given that I work in education, the answer is pretty clear for me. I will not write or provide software that will be used to kill, injure, or spy on people. My work cannot be used to infringe on the digital and physical sovereignty of others under any circumstance, regardless of the promises made by people in positions of power within the organisation. If I am asked to do any of these things, even for something as seemingly benign as putting Google Analytics on a corporate website, I will decline the work and suggest someone else do it. If the infringement is greater, such as collecting the position of a person's mouse, every keystroke, and perhaps even audio while they use the company's resources, then I will protest as high up the chain as I can go until the offending business decision is cancelled. If that doesn't work, I resign and move on[2].
This is all well and good for someone working in education, but how about a company such as Microsoft?
Working at Microsoft — or Apple or Google or Canonical — would make the line a little harder to identify, primarily because of the military's extensive use of operating systems. Most modern, well-equipped military forces around the world use Windows and Linux extensively. Devices can run Android or iOS. Software that runs on any of these operating systems can be used for the sole purpose of death and destruction. If I work for a company that makes operating systems and do not want to participate in any way in the development of weapons or the enabling of murder, do I have that luxury? Probably not. Even if I did not contribute to Windows, people at Microsoft would be working on projects that could be used by a military, be it domestic or foreign. So would the line be somewhere around: so long as my project is not specifically used as a weapon? Would it be a little more specific: so long as my project is not the active development of a weapon? Both of these seem conveniently myopic.
As unpalatable as the idea may be to some people, companies as large and established as Microsoft are going to work for governments and military organisations. While Satya Nadella has not said that they'll be officially entering the weapons market, he doesn't have to. A lot of Microsoft's products are already in heavy use by powerful entities we may not necessarily agree with. By supplying the US military with HoloLens hardware and some custom software, they'll be doing all the things one would expect from a big company. If employees are morally opposed to the idea and are assigned to the project, they can request to be transferred[3]. Is this perfect? Not in the least. It is, however, a potentially viable alternative to being a part of active weapons development.
For me the line is pretty clear cut and my positions are generally understood by the management at my employer. For larger open projects that I contribute to, such as Linux, I understand that my code[4] could be used within a system designed to harm people, from Claymore mines to ICBMs, and there's literally nothing I can do to prevent this. What I can do, however, is refuse to participate in the active development of weapons … and the people protesting the decision at Microsoft can do the same. The key is not just protesting a decision, but being willing to walk away when the alternative is untenable.
1. I did not end up writing the software. My bosses were quite upset with me for refusing the demand, as it was coming from the company's investors at the time, but I wasn't going to sacrifice the data privacy of others for my own pay cheque. Instead my immediate manager wrote the code that broke the promise we made to our customers and I left the organisation a few months later.
2. I understand that this is primarily possible due to the privilege that I have earned over years of hard work, which has (only recently) resulted in a well-paying career that has made saving money possible. Not everyone can afford to walk away from a job for moral reasons.
3. With the understanding that any request will likely limit their career opportunities going forward.
4. Which has been mostly limited to Bluetooth drivers and fingerprint readers.