The Logical Conclusion

Some of the recent conversations around employment equality have been quite interesting to observe. Whether people are discussing the topic on YouTube, on podcasts, or in opinion pieces on well-known news sites, the same handful of arguments are trotted out, either as a reason why employers must be forced, through heavy-handed legislation, to use discriminatory practices when hiring new employees, or why the whole effort is a fool's errand that should be abandoned. What's unfortunate is that some of the people debating the issue use the same words to mean different things, which results in unnecessary frustration and a surplus of decibels.

There are generally two meanings for the word "equality" that I see when people are discussing the obvious gaps in population representation within certain fields:

Equality of Opportunity, which is described as a state of fairness in the job market. Everyone is treated the same and not prevented from applying for a job due to artificial barriers, prejudices, or preferences. The objective here is for an employer to hire — or promote — the most talented or qualified based on verifiable and testable metrics. This is generally how meritocracies work, though it is not at all easy to maintain.

Equality of Outcome, which is described as a state where every member of a population has the same material wealth and income, or where everyone literally has the same things. A transfer of wealth is required to make this happen, resulting in a society with no super rich and no super poor. Everybody has food, clothing, shelter, access to medical services and education, and just about everything else a collectivist society can realistically support. It is, in short, the ideal of communism.

Both of these concepts have their pros and cons. Neither is a complete solution to the problem of the obvious inequalities we see in modern life. A lot of large organisations around the world try to present an equality of opportunity. There are missteps and poorly worded job ads from time to time but, for the most part, many of the employment laws found in North America, Europe, and even here in Japan will punish a company found guilty of discriminatory hiring or rewarding practices. This is a heck of a lot better than the openly hostile employment practices that were common more than 25 years ago.

Equality of outcome, however, is something that I do not see as being realistically feasible for any length of time without dropping the pretence that citizens of a nation have free will. To achieve a viable equality of outcome, ensuring that every field of employment quickly reflects the same ratio of the various groups found within the general population, people would need to have a static career path assigned to them at some point during high school, with no option to appeal barring a major catastrophe or war.

The "career chip" would need to go from being a gag in Futurama to a real thing that controls what a person can do in their life, as this is the only way to honestly ensure there is an equality of outcome without drastically destroying the economies and infrastructures that underpin the success of a nation.

You Gotta Do What You Gotta Do

This idea is not at all new, and it was something a high school friend and I were chatting about a while back as he explained the challenges in adding "visible diversity" to his team of spot welders, as it's become a corporate requirement. Every one of the 46 tradesmen who report to him is male. Based on the published demographics of Ontario, 23 of these skilled workers should be women. His employer, a well-known steel mill in Hamilton, has been looking for more diversity on the factory floor for years, but there is simply no interest from anyone aside from young men. This is despite the very healthy salary1, several weeks of vacation per year, and a generous retirement package. How else can this gap be closed other than by forcing people into careers based on the very criteria that an equality of opportunity state decries as discriminatory?

A post from 2005 outlining some of the occupations dominated by gender in The Netherlands2 shows some career paths that are generally not very diverse, and these professions have certainly been dominated traditionally by men or women for a number of generations if not millennia. If an equality of outcome supersedes everything else, then there really is no other option.

This wouldn't be all bad, though. Young people concerned with their post-education futures would be given one less thing to worry about. Greater diversity in the professions would mean that jobs would be required to become much safer to accommodate all levels of skill and motivation. And organisations could more easily ensure that management positions were also filled by people who reflect the general population. If there were also a redistribution of wealth and every job was paid equally well based on age rather than seniority or ability, then someone assigned to drive a bus could earn the same as someone assigned to perform open heart surgery. Nobody would go hungry or miss a car payment3 ever again! These are all positives, and I don't say this as a joke at all.

Would the consequences4 be worth it, though?

I am a firm believer in equality of opportunity. Everyone should be judged on their skills and merits. Competence should be encouraged and rewarded. While we're not quite at a place where we can honestly say that everybody will be treated the same or compensated in the same way as a colleague, we're much closer to the ideal today than ever before. As for the equality of outcome movement, I can understand some of the reasoning and even agree on a couple of points5, but history has shown time and again just how untenable such a system is.


  1. Heck, I don't even make as much as the median wage without putting in more than 60 hours per week.

  2. I couldn't find anything similar for occupations in Canada, but the data must surely exist.

  3. Assuming, of course, that everyone treated money the same way.

  4. There are a whole lot of consequences that I've decided to not include here, because the post isn't to completely deride the idea, but posit the most logical means of making an equality of outcome possible … though I would never want to live in such a society.

  5. The massive gap between the crazy wealthy and the dirt poor is absurd. Wealth redistribution is not a viable solution, though.

Essays

Despite not being particularly good at the skill, writing is something that has been near and dear to me for as long as I can remember. There is always something that needs to be written down, be it something as trivial as a note or as complex as an argument. Over this past week I've had the opportunity to get a lot more reading done than usual and, as a result, there are a number of topics that I would really like to write about. The problem is that these are complex situations that will require a good amount of research before I can even think about penning an essay on the subject. Where in the world do prolific writers find the time?

Writing With Style

Essay writing is not something that I've done too often on this site given the lack of focus on any set of topics, and I'm not about to start. That said, I have been kicking around the idea of writing essays on current events with a different site, as this would allow a clear separation of content.

One of the things that I like about writing longer pieces, particularly those that require a bit of research, is the opportunity to better formulate thoughts around a subject. Sometimes I'll begin writing a piece with one idea, then discover halfway through that the original position or understanding was incomplete or incorrect. The act of slowing down and really thinking about the subject made it possible to better examine the situation and draw a different set of conclusions. Being able to come away from a piece of writing a little more informed than before is a wonderful thing, after all. So it's with this in mind that I've created a new folder in the notebook and have started making notes and planning arguments on various topics, from reneging on historical treaties to imposing belief systems on others.

What I plan on doing is writing three or four essays to start with, working out the tone and style of the pieces, then aiming for a post a week. My goal with this additional writing project is to develop a more complete understanding of the complex decisions that need to be made to address current social and cultural situations. If anyone else finds value in reading the words that wind up getting published, then I'll consider that a nice bonus.

"Half"

Earlier today the family and I paid a visit to Reiko's grandmother, who currently lives in a retirement home. Her mental state has deteriorated as a result of Alzheimer's over the last decade, and it was decided a few years ago that she should be in a place with round-the-clock medical support. Most of the time she believes it's some time around 1930 or 1940. She's forgotten that her father passed away some fifty years ago. And she can no longer recall the names or faces of her children. Her grandchildren are another story. This post isn't really about her, though. Instead it's about a word that I often hear associated with my kid that I have grown to detest: "half".

In Japan, and perhaps other countries where a population is mostly homogenous, children who have one foreign parent are called ハーフ (ha-fu), meaning "half Japanese". Growing up in Canada, people were also classified as "half", such as half-latino or half-black. This wasn't seen as a derogatory term as far as I know, but something about hearing people say this about my kid grates on my nerves. I want to say "He isn't half anything. He's 100% just like you." … but maybe this is an over-reaction.

Despite being an immigrant, I'm fortunate enough to rarely face a situation where I do not feel welcome as a result of my genetic background. I don't want my kid to ever feel he's unwelcome because he's "half". Growing up is hard enough. He shouldn't have to deal with prejudice (or preference) as a result of his whiter-than-normal skin or lighter-than-normal hair.

Perhaps I'm just over-sensitive ….

Twenty Eight Years

Time flies when you're having fun, but I find it hard to believe that 28 years have passed since the inception of what became known as the World Wide Web. To mark the occasion, the creator of the web wrote an open letter talking about three challenges that face us today.

Today marks 28 years since I submitted my original proposal for the world wide web. I imagined the web as an open platform that would allow everyone, everywhere to share information, access opportunities and collaborate across geographic and cultural boundaries. In many ways, the web has lived up to this vision, though it has been a recurring battle to keep it open. But over the past 12 months, I’ve become increasingly worried about three new trends, which I believe we must tackle in order for the web to fulfill its true potential as a tool which serves all of humanity.

— Sir Tim Berners-Lee


His three points, that we've lost control of our personal data, that it's too easy for misinformation to spread on the web, and that political advertising online needs transparency and understanding, are all valid challenges that we face today on the various commercial sites where people tend to congregate, and I plan on doing something very difficult to address the first one shortly. That said, I had the distinct feeling that there should have been a fourth challenge identified as well: we need to treat others as we wish to be treated.

It seems every time I take a look at a popular social network or forum there are people being compared to some of history's most notorious villains due to differing opinions. People are attacked for not fitting a certain stereotype or other asinine criteria. Good debates quickly devolve into straw man arguments or worse. It's as though we have lost the ability to listen to the other side and present our own arguments in a respectful fashion in order to convince or, at the very least, learn about the other side.

There are people who would condemn me for having a son with someone who does not share a recently similar genealogical history. There are people who would openly mock me for preferring hand-written notes to digitally recorded ones. In both cases I would try to engage these people and, if rational discussion was possible, hear what they have to say in the hopes that I could present my case and be seen as an equal with differing opinions. Chances are the vast majority of people online would enjoy the same.

Billions of people around the world face unjust bias and prejudice every single day of their lives for things that are often beyond their control. At the end of the day, the shouting and name calling gets us nowhere. Does society benefit by comparing someone on Twitter to Hitler? Has anybody in the last 20 years seriously reflected on their life choices after being called Hitler in order to determine if they may have an incomplete or incorrect understanding of a topic? Somehow I doubt it.

At the end of the day what we need as an ever-connected society is the understanding and appreciation that with a billion voices comes a billion personal histories and a billion perspectives. While some will more closely align with others, there is no reason why the greatest communications tool in the history of the human race should find itself a vehicle for excessive hate and rage. There's plenty to be angry about, of course. Humanity's problems cannot be solved in 140 characters, though. And this brings me to my next point …

Enough With the Arbitrary Character Limits

It's really time social tools dispensed with arbitrarily tiny character limits. When Twitter first came out, 140 characters was chosen due to the limitations of SMS, not the Internet. Even the soon-to-be-decommissioned App.Net with its 256-character limit is unnecessarily restrictive. If people are to have true discourse, heated or otherwise, the barriers to communication need to be reduced as much as possible. Forcing someone to present a cohesive and logical argument on a complicated topic within a hard character limit makes nuanced communication all but impossible. More than this, with "threaded responses", individual quotes can more easily be taken out of context and used against a person as a straw man argument, distracting everyone from the ultimate goal of enunciating an idea.

Our lives seem to be moving ever faster as our tools allow us to communicate with more people across greater distances with ease. Being constantly "on" takes its toll on people, fraying nerves and reducing our ability to step back and think before responding. Perhaps with a slower, more natural approach to communication we can begin to understand the people we disagree with. We don't have to always agree, nor should we. But we should at least try to understand the other side. Unfortunately, this is all but impossible unless we can respect one another and supply better written, more considered responses.

Believe It Or Not, We're Getting Better

As it did for many people around the world, Star Trek played a huge role in how I perceive the world and my role in it. While it's a work of fiction, the truths explored in many of the better stories show us both how far we've come as a species and how far we have yet to travel. We are capable of incredibly generous acts of kindness in one moment, and incredibly heinous acts the next. What's fascinating about this is that a person can directly contradict themselves without even realizing it.

Patrick Stewart — Better

The world has undergone an incredible amount of change over the last forty years. More people are educated today than ever before. More people are fed than ever before. More people are connected across cities, borders, and continents than ever before. More people are financially secure than ever before.

And yet …

And yet people are angry. There are good reasons to be angry, too. A lot of what we see in the news is not what we want to see. But there's a lot of good, too. We just need to know where to find it or — if we're proactive — know how to create it. We can all be a little bit better. Looking at how far we've come in just my lifetime, I'd say we're generally making some positive progress.

It's Not a Software Problem

Is the difference between information and misinformation something we want to leave in the hands of yet another algorithm?

This is the question I asked after reading this Guardian article where Tim Cook, Apple's CEO, laments that fake news is "killing people's minds". He goes on to say that technology companies should be doing more to tackle this problem and stem the spread of falsehoods without affecting people's rights to free speech. While I am just as frustrated as anybody else over the excessive quantity of misinformation online, a technological solution to the problem would be the ultimate precursor to a form of censorship the likes of which authoritarian governments salivate over.

I appreciate what technology can do and has done for us, but I must question why this particular human flaw needs to be addressed by software and social engineers. Humans have been lying to one another for as long as we've had language, and probably longer still. Children tell us they didn't eat the last cookie in the jar despite the crumbs around their mouths. Companies tell us they appreciate our business on receipts but never through actions. Politicians tell us they can be trusted. Would an algorithm catch these fibs? If so, how would any algorithm know what is accurate versus inaccurate? Somebody would need to be the gatekeeper of "The Truth", yet the definition of that very concept varies so wildly from person to person that censoring any article aligned with a person's current beliefs would instantly render the entire system suspect and untrustworthy.

The issues surrounding "fake news", "alternative facts", and outright lies are not recent creations that have caught a populace by surprise. Growing up before the Internet, I remember my father and I laughing at tabloid headlines at the grocery store. Stories so over-the-top that it's a wonder anybody took them seriously. Some of the headlines that stand out the most in memory include:

  • Horse born with human head. Farmer ashamed.
  • Woman abducted by aliens from Saturn
  • City of Atlantis discovered under Brooklyn

If any of these were even remotely accurate, there would be a lot more press coverage and a whole lot of academic papers1. What my father tried to teach me by reading these at the grocer was that you cannot trust everything you read. Some of the best teachers I had while growing up would say the same: it's okay to read, but verify.

This is what people need to do if they are to separate spin from fact. It's not easy, but critical thinking is the only way we can overcome the mountain of misinformation that exists both online and off.


  1. any one of these would result in a lot of follow-up studies

Eschewing Today

A few days ago The Guardian published a short little piece from Mark Boyle, a man who has decided to turn his back on modern technology and live completely off the grid. He's living in a cabin built in a very traditional manner without phone, electricity, or even running water. In his own words, he uses nothing "requiring the copper-mining, oil-rigging, plastics-manufacturing essential to the production of a single toaster or solar photovoltaic system." If Mr. Boyle can live the life he chooses without any of today's modern conveniences, then I hope he finds the happiness he seeks. He is not the first person to have an "I Quit Technology" piece published in the paper, and he certainly won't be the last. What I often wonder, though, is why so many of the people who decide to walk away from modern conveniences think of today's tools in such a negative manner, and why so many do so on their own rather than join a community of like-minded individuals.

I decided to eschew complex technology for two reasons. The first was that I found myself happier away from screens and the relentless communication they generate, and instead living intimately with my locale. The second, more important, was the realisation that technology destroys, in more ways than one.

Mark Boyle

The quote above is something that I've heard a lot over the last 20-odd years as computers and the Internet have become more a part of everyday life, and the people who say it tend to think of these tools in a very binary manner. Either we use absolutely everything a piece of glass and plastic has to offer, or we use nothing at all. There doesn't seem to be a middle ground.

In the grand scheme of things, I agree with the idea that people can be happier away from the relentless communication that is enabled by always connected notebooks and smart phones, which is why just about every notification on my computer and phone is disabled. I don't need to have a bunch of apps vying for my attention, nor do I feel the need to be completely up-to-date all the time. Doing so is exhausting and ultimately pointless if you don't make a living writing for gossip sites. For this reason people can get my attention with a direct phone call, an email to a specific address, or a message on Skype. Otherwise, I'll see the message when I see the message. After doing this for social services and RSS feeds some five years ago, my general mood improved and I started being more present and offline while out and about in the world.

As for Mark's "realisation that technology destroys", I fail to see how this is even remotely accurate. Technology couldn't care less whether it's used creatively or destructively, for good or for evil, to help or to hinder. The tools that we create — and technology is a catch-all term for every tool we've created — simply enable humans to accomplish the goals they set out to do. This isn't to say that all technology is inherently beneficial or the best use of resources, but how often does humanity make the most of anything? We are an incredibly wasteful species with the tendency to destroy wonderful creations for our own purposes. Making a sweeping statement like "technology destroys" is no different than saying "men kill". It's a poor rationalization of a far more complex concept.

Reading through Mark's reasoning behind the "technology destroys" statement, I get the feeling that he's more disgusted with consumerism than technology. He's disappointed with the way people interact more with glowing screens than each other. I feel much the same way, as our excessive consumerism and isolationism does have a very clear impact on the world, but I wouldn't dream of turning my back on everything just because I'm upset with how other people use the tools I was once openly mocked for developing. Instead, I can try to lead by example and encourage better technology recycling and offline interaction habits. Unplugging from the world is no different than sticking our heads in the sand, so to speak. It makes absolutely no difference in the grand scheme of other people's lives.

One Needn't Go It Alone

Amish Life

I'm sure we've all wanted to get away from it all at one time or another, perhaps escaping to a tropical atoll in the middle of the Pacific like Tom Hanks in Cast Away. Living completely off the land and eking out a simple life without the endless distractions that make up so much of our day. While this can sound attractive at times, it sets us up for failure in a pretty big way. The world has been tamed in many ways, but it's still a hostile place for those who are under-prepared or over-confident. Going it alone and living off the grid can leave a person or family susceptible to an awful lot. This raises another question I've had for people choosing to reject modern technology and live like our ancestors: why not really live like those who came before us and join an Amish community?

While it doesn't happen very often, people are more than welcome to join the Amish community so long as they commit themselves to following the Ordnung. Wikipedia does a great job of explaining it:

The rules of the church, the Ordnung, must be observed by every member and cover most aspects of day-to-day living, including prohibitions or limitations on the use of power-line electricity, telephones, and automobiles, as well as regulations on clothing. […] As present-day Anabaptists, Amish church members practice nonresistance and will not perform any type of military service. The Amish value rural life, manual labor and humility, all under the auspices of living what they interpret to be God's word.

So, be a true member of the church, be an active member of the community, follow the rules, and be a shining example of humility. In exchange you get to be part of something bigger than a single person could possibly be alone, and you get to contribute towards something worthwhile. While some Amish affiliations will permit limited use of technology, some outright reject the vast majority of it, as the chart below shows.

Wikipedia — Amish Affiliations

Being an island unto oneself seems awfully selfish and more wasteful than being part of a community. The great thing about the Amish is that, even if you choose not to join their religion and live life the way they do, they'll gladly let you join in many of the community events so long as you're civil.

To What End Will One Go?

The last question I usually have for people who want to reject tech is whether they'll reject modern science, too. Modern technology is ultimately the result of modern science. We couldn't have any of our digital tools without a deep understanding of physics, and we couldn't have the plethora of plastics and alloys we use without a solid grasp of a lot of different fields. Rejecting technology may as well be the rejection of modern science and, if that's the case, at what point will one draw the line to say "any science from after this time in human history is not for me"?

At the end of the day, everybody is free to make their choices. So long as nobody is hurt as a consequence, my opinion of these choices is neither here nor there. I do wonder whether extreme decisions are made without a complete understanding of what one is rebelling against, though. Technology can be used for good. Technology can bring communities of people together in unique and wonderful ways. Technology can help extend our lifespan. Technology can ensure our food is clean and safe. Technology can provide for us a toilet that doesn't smell like a porta-potty after a week at Burning Man.

Ultimately, the problem isn't the technology. It's how we decide to use it, and how we decide to perceive others who use it differently.

Bias or Prejudice?

I've been abstaining from writing any blog posts about the recent American election because, to be completely frank, I find it hard to believe that the character model for Biff Tannen in Back to the Future II has been chosen to lead the most economically and militarily powerful nation in the history of the human race. I'm unhappy that some of the people being hired for incredibly powerful jobs have such an open disdain for people who think, act, or look different from them. I'm frustrated that a bunch of poor "winners" are running around, terrorising people who are not Caucasian … or male … or of a certain religious affiliation. I'm disgusted to hear that legitimate lawsuits against the president-elect will be fluffed off because "he's too busy". I'm dismayed that the current president will need to support the new one, who openly mocks people who take the time to learn about the things they're doing. To top it all off, I'm angry that the threats against groups of people — be they Muslims or journalists — are coming from the very top of this next government, and that we've seen this sort of thing on numerous occasions in the past and borne witness to the horrific events that followed.

Yet, in the back of my mind, I'm wondering if all of these overly negative thoughts about the new leader of the American people are unfair.

Whenever I would read about Trump in the newspaper while growing up or as a young adult, the articles would focus on scandal or failures. Whenever I would see him on TV, which I'll admit was not very often, he would come across as a wannabe mafia don with his big mouth, harsh words, and the stereotypically flashy extravagance one would expect from a performer. He did not strike me as a political contender, because he did not strike me as someone I would ever want making decisions that affected me or my family. If anything, four decades of press coverage has made the man out to be an opportunist who'll take advantage of any situation because "there's no such thing as bad publicity".

The same can be said about a number of people he's bringing into the Oval Office with him. People who are leading members of groups that exalt racial supremacy, religious persecution, and other fascist ideas. People who took advantage of horrible situations to make themselves look like heroes. People who use their wealth to shut down opposing voices …

And I wonder if this is the end.

Not the end of civilization, of course. Civilization evolves. It changes and adapts to the needs of the people who forge the societies that constitute the very idea of civilization. But I wonder if this is the end of the Ameri-centric status quo that has existed for so long. Will we really see parallels between the Trump presidency and Hitler's Germany? Or will we merely see that Trump is another Silvio Berlusconi?

Is this just an overreaction due to bias? Or are my feelings seated more deeply than this? Is my bias actually prejudice that has been subconsciously moulded and formed over decades by reading The Guardian, The New York Times, and The National Post? These three papers from three different countries have very different editorial staff with different opinions, different agendas, and different backgrounds. Despite the differences, could they all have a similar bias that has fed into pre-seeded beliefs I've held, and could these biases have manifested into prejudice in such a way that I could feel physically ill just at the thought of a Trump presidency? Confirmation bias is a very real thing, and my preference for three specific news organisations likely came about as a direct result of reading articles that talked about the world through a lens that I was already familiar and comfortable with.

Regardless of who won the presidential election this year, there would be protests in the streets. Countries would align or distance themselves. Trade agreements would succeed or fail. Societal tensions would simmer or boil. CO₂ emissions would rise or fall. Yet I believe — and there's that word we must watch out for — that Hillary Clinton would have been a safer choice. Would she make mistakes? Absolutely. Would she do things I didn't agree with? Yes, of course. Heck, she's done lots of things in the past that I didn't agree with, but that's to be expected for a career politician who has been saddled with incredible responsibilities. Of the handful of people who ran for the most powerful office on the planet, she was perhaps the least likely to do something that would directly affect my friends and family around the world. She knew how to play the game, and she knew how to make the tough decisions. I don't believe Trump or Stein or any of the other contenders for the job have the requisite appreciation for the power one can wield as the President of the United States.

But again, is this just prejudice? I've been wrong before, and I've been wrong just about every time I've made a prediction about Trump's run for office. Maybe I'm wrong again this time, too. Maybe the man will invest heavily to rebuild the country's infrastructure, end wars, strengthen the economy, improve education, reduce the nation's prison population, and end gang-related violence in the poorest of urban communities through positive actions that vastly improve the quality of life for all Americans. Maybe Donald Trump really will make America great again … but I just can't see it. My bias — my prejudice — just won't allow it.

Abundance

Last week I was having a rather heated discussion with a pair of software developers who enjoy working on really big projects. Big, in this case, does not necessarily mean that the programs they make are feature-packed management systems that cater to everybody, but instead software that builds on the work of hundreds or thousands of people around the world. This is the antithesis of how I write software, and often a bone of contention. My software tends to lean heavily towards a minimalistic approach. So much so that I will often write close to 95% of the code that makes the tool work. Many developers see this as a glorious waste of time and make extensive use of frameworks and libraries of code that are available online. By plugging these various elements together, they can often have the core of an application built in a weekend or less. The developers I was debating with last week built a scheduling application in a week; I had done the same but needed two weekends. Their web application weighs in at 4,291 kilobytes in total. Mine is 312 kilobytes. They both accomplish the same top-level goal, but only mine is able to work on browsers from 2007 and smart phones. Which one is "better"? Does it even matter anymore?

A lot of software developers I've spoken to over the last few years tend to lean heavily towards speed of development over speed of execution or efficiency. When their projects become computationally intensive, the most common response is to "throw more power at it" rather than asking themselves how to make the most of the hardware people actually have. Given the incredible surplus of computing infrastructure we have around the world, this attitude certainly follows the same pattern we see with other commodities when presented with a seemingly limitless surplus: use as much as you can get and to heck with the consequences.

We can see time and again how people's perceptions of resources have changed when presented with an oversupply. An abundance of electrical power, food, clean water, education, telecommunications networks, fossil fuels, and human labour has made it possible for us to create the world we have around us, and we waste a large percentage of these resources without a second thought. It's really no surprise that people are treating processing power the very same way. My perceptions of how we should use this resource are on the fringe, much like the ideologies of people who do their best to live green or completely off the grid.

The unbridled use of software has pushed hardware to where it is today and some pretty amazing things have become possible as a result. Entire worlds of visual splendour and imagination can be rendered in real time to act as a background in video games and big-budget movies. Complex problems involving weather prediction can run again and again, allowing agencies to notify communities that might be affected by exceptionally strong storms. Our words can be transcribed as text despite heavy accents and other verbal aberrations. Heck, in the next 25 years we're expecting that the confluence of better hardware and software will put untold millions of people out of work as unskilled manual labour jobs are replaced by incredibly dexterous robots and 3D printing. It's been said for decades, but we really are on the cusp of becoming a post-scarcity world, where just about anything we want can be provided relatively cheaply, when and how we want it. More than this, there is already a great deal of work being done to create software that writes itself. In the next few years it may become commonplace for anybody to pick up their cell phone and ask its digital assistant to create a program that will solve a specific need.

Computer, create a program for the robotic mower to cut a fractal pattern into the lawn and send the drone up to take a time lapse of the work. Oh, and post the completed time lapse to YouTube with some catchy music when it's done.

It's just a matter of time before this sort of situation is commonplace and people begin exploring a whole new set of boundaries for software development for the hardware they have, and I believe this will also be the point at which many people stop being willfully ignorant of technology and hop in to shift some paradigms. The code will likely be inefficient as heck and require a lot more processing power or memory than a custom-crafted solution but, at the end of the day, it won't matter anymore.

When the barriers to entry are stripped away, people are capable of some pretty amazing things …

… and I'll be out of a job, if I'm still working as a software developer when we reach this point.

What comes after this is anyone's guess, though I do hope that we — as a species — see little point in continuing the charade of working 40 hours a week to support a family or desired standard of living. Unemployment rates will undoubtedly grow around the world as more specialised machines begin making products on-demand, reducing a great deal of waste and human error. Warehouses full of finished products will be less necessary, as will shopping malls consisting of big-box retailers.

We'll still need people of various professions, but the days of the unskilled worker are numbered. Part time jobs may pop up every now and again, but they won't be as common as they are today. This is going to very quickly lead to an interesting problem: an abundance of time.

History shows that societies dramatically change with the introduction of abundance, and new forms of art and entertainment become possible as a direct result of that abundance. What might we do with ourselves if presented with a 4-day work week? How about a two-day work week?

This is a topic for a future blog post.

The Rise of ED-209?

On July 7th Micah Xavier Johnson, a 25-year-old Army veteran, decided to target and kill police officers in the city of Dallas. A dozen officers were hit; five died as a result. This was not a last-minute decision to walk into a crowded area with a weapon to cause damage. It was a planned attack that took into account how the police would move and the resources they had available. In the end, the Dallas police opted to use a bomb-disposal robot to deliver an explosive payload to eliminate the attacker. This was the first time the police in the US used a robot to kill a citizen, and I wonder if it will set a precedent.

In the very fictional world of Robocop, Omni Consumer Products created a walking and talking machine that would be put on the front lines in the battle against crime syndicates and their well-armed foot soldiers. Given the level of access the bad guys had to powerful weaponry, and given how desperate cities were to keep people safe, it seemed only logical that the tools of war should be adapted to work on home soil as well. Better that a pair of ED-209s be destroyed during a raid than members of the force. Machines could be replaced, after all.

In a bid to save lives during dangerous situations, does it make sense to send in the machines?

ED-209

With walking, talking, remote-controlled devices, it becomes possible to directly confront snipers and other dangerous individuals. As the machines would (potentially) be less affected by a barrage of bullets, it becomes possible to arm the robots with stun guns and other non-lethal devices to incapacitate suspects. Once the suspects were unconscious, officers could move in to make arrests and recover any hostages.

A pie-in-the-sky, incredibly optimistic proposition if there ever was one. A real world situation would not be as simple as this.

Surrogates Movie Poster

In a more recent movie, we saw something very similar to this. Surrogates — the Bruce Willis movie — posited an interesting idea. Humans would rarely venture out into the world themselves. Instead, "perfect" machines would be sent out and people would control them via virtual reality rigs in their home. In addition to wealthier citizens having this freedom, police and military personnel also made use of this technology. In one telling scene, a soldier is hit while fighting in what looks like a Middle Eastern nation. We then learn that the soldier is a machine, remotely controlled by someone in a military bunker who simply connects to a different robot and returns to the field. Military personnel are no longer in danger when deployed to war zones.

While this raises a lot of questions about the morality of war when one side literally has an endless army of immortals with billions of bullets at their disposal, it also makes it possible to have a police force that is armed with less-lethal weapons when entering into dangerous situations. These machines could even ride in the trunks of cruisers and be deployed on an as-needed basis when a situation is too dangerous for humans.

The technology to make this happen is very close. Bipedal machines roughly the size of an adult have been under development in laboratories since the 90s, and we've seen some incredible progress in balancing mechanisms and the requisite software since 2010. It really is just a matter of time before machines like this become part of military and civilian defence systems.

Will the recent events in Dallas merely accelerate the development and deployment of these sorts of tools? Will they be deployed in a non-lethal fashion?

The answer is probably along the lines of Maybe and No.