Encouraging Technological Fragmentation

On May 15th the US government issued an executive order that could effectively reshape the technology many of us will use in the coming decade. Chinese companies are accused of using their position as the world's factory to secretly modify the electronics that permeate our lives, making it possible for the Chinese government to monitor everything that everyone does, anywhere and at any time. If this were a laughing matter, one might dismiss it as little more than jealousy on the part of America's covert ops industry. On top of the executive order, the US Commerce Department added Huawei and 70 affiliates to its "Entity List", which bans the Chinese telecom giant from buying parts and components from US companies without US government approval. Earlier today Google signaled its intention to comply with the revised rules by suspending some of its business with Huawei. Other companies outside the United States that provide hardware and software to Huawei are also cutting the company off in an effort to stay on good terms with the US government.

This leaves Huawei, the second-largest mobile phone maker on Earth, in a bind. They cannot get all of the parts they need to build products. They cannot offer access to the services that Google provides to Android users outside of mainland China, which will give potential customers a reason not to buy a Huawei product. Their other products, including TVs and traditional computers, will soon face a similar series of problems.

The people in leadership roles within China will not take this lying down. Huawei and other companies cannot have their livelihoods held for ransom every time a foreign government, be it the United States or someone else, decides to issue a decree. The Chinese government could react with a number of measures, but many of these would just hurt their own economic position. Rather than lower themselves to an endless game of action and reaction, it may be time for some of the technological innovations in China to replace those developed elsewhere: a technological split from the West, so to speak.

Zhaoxin is a viable domestic alternative to Intel and AMD for x86-based processors. Kylin is a modern desktop operating system that is certainly up to the task of replacing Windows and macOS if people were so inclined. Huawei has been working on its own fork of Android for quite a while and has even hired some former Nokia people to make it happen. Next-generation RISC-V processor designs are open source, meaning they can be used by anyone regardless of a government order. It wouldn't be easy, but there is no reason why Chinese corporations, with the support of their government, couldn't "fork" current technologies and begin diverging from the products developed primarily in the United States, Europe, and Israel. In the space of a decade, China could be a technological Galapagos, much like Japan was in the 90s. So long as Chinese business leaders are smarter than their Japanese counterparts were, it wouldn't be too much of a stretch to see Chinese technology begin to replace western technology, first in developing countries and later in developed nations.

The parallel development of technologies would probably appear to be a duplication of work at first but, within just a few short years, a noticeable divergence would become apparent. Customers would vote with their wallets. Markets would expand and contract. Companies would adapt or fade from relevance. The reality is far more complex than a 700-word blog post might make it out to be, but a technologically independent China would have a lot of benefits, not only for the people of China but for everyone around the world. A technological race to domination would drive a lot of innovation and require a lot of intelligent people.

A rising tide lifts all boats.

Of course, this could also backfire and result in drastically incompatible systems. I'm optimistic that we would see more good than bad from a technologically independent China, though.

Right to Repair

Early this morning Joe shared a link to a Motherboard article describing how Apple has effectively killed another attempt to pass "right to repair" legislation by suggesting people will injure themselves when working with the intricate components that make up the phones, tablets, and other devices we buy. As someone who could never build a computer from scratch in the 90s without cutting my hands at least once inside the case, I can certainly see the logic of the argument. The Motherboard article clearly calls out the dangers of puncturing a modern battery, and the number of phones with shattered screens that one can observe on public transit in this country is nothing short of amazing, so it's bound to be the same elsewhere. When an inexperienced person replaces the shattered glass on an iPhone or iPad, there will almost certainly be a price to pay in blood.

A few hours after Joe's initial post, Robert followed up with this:

But if one makes the assumption that regular people either can or want to repair their devices, we are nothing short of delusional. Most people only want something that works, they don’t want to fuck around with it. Modern electronics are painfully integrated, components are few and specialized on a tiny PCB. Does the average person even know what they are looking at if they were to open the case of any contemporary device? […] Those of us who so loudly demand the right to repair, which is a broken term in and of itself, need to understand that we are the edge case and not the standard.

Indeed. Edge cases are consistently hard to please. I consider myself to be firmly in this category, hence my preference for certain types of less popular hardware and software. However, Robert goes on to make a recommendation on how a "Right to Repair" mechanism might work to the benefit of manufacturers and customers:

If one were so concerned about regular people cracking open their wares and potentially injuring themselves, there are better solutions. Perhaps a course that people could take, educating on the ways of the electronics and giving spare part access to those who pass a test or something along those lines. […] Think: certification for individuals to perform repairs.

This is an interesting idea. While it will not please everyone, it will please some of the more technically inclined who might want to run a small business fixing people's devices. A high school student with certification and access to fairly priced replacement parts could earn a respectable living and reduce the number of cracked screens in their school, thereby saving fingers from being sliced open before a screen protector can be applied. The same can be said for people in poorer neighbourhoods who might want to help their community get more value from their technology investments. Offering a certification program is no panacea, and it would undoubtedly ruffle a bunch of feathers as Robert said in his original post, but it would make an interesting solution for companies that claim they care about the health and well-being of their customers, as well as the environment. Repairing is better for the planet than replacing.

Personally, I doubt there will be much customer-friendly movement from companies on giving people the ability to repair (or easily upgrade) their products. Systems have become so incredibly complex in both hardware and software that only a small segment of the population would actually stand a chance of repairing a broken device. Take apart a "smart speaker" and see just how easy it is to replace a burned-out capacitor. Most people just want things to work and don't really care to invest the time in understanding the how or why, which is fine. That said, there will almost always be options available to people who want a greater degree of control and freedom over their technology. It may not always look as pretty or be as popular, but options will exist.

Indistinguishable from Magic

When out and about during the day, I like to watch people interact with the technology that surrounds us. So much has changed in such a short amount of time that comparing the tools we enjoyed in the 90s to the devices we take for granted today can be quite amazing. Thinking back even further, there was a time when the average person could repair just about any item they purchased without requiring a PhD in a science-related discipline. Those days are long gone and will likely never return, which means generations of people will grow up not really understanding how the things they use actually work. This isn't necessarily a bad thing, nor is it necessarily a good thing. This is just the way it is.

A Lithographic Sheet of Intel Haswell-series Processors

The integrated circuit is a perfect example of something that the vast majority of people do not understand. Most software developers couldn't tell you how one of these things really works; yet, without them, there wouldn't be any need to develop software. The same could be said about a lot of the food we eat, too. Do most of us really know how the meal we ate went from being alive somewhere on the planet, to dead, to processed, to packaged, to shipped, to delivered, to bought? Probably not. It's too complex and, for most people, irrelevant.

We live in a remarkable time. The vast majority of people over the age of 20 are quite literate in at least one language, and we have access to vast stores of knowledge spanning thousands of years of study. Most of us, though, focus on just a narrow band of subjects and learn them well enough to accomplish the basic tasks asked of us. Some people dig deeper to learn more. They go on to solve some wonderfully complex problems, such as designing the next in-demand CPU … essentially teaching a rock to "think".

The vast majority of us, though, are surrounded by tools that are taken for granted and pretty much indistinguishable from magic.

33RPM

Vinyl records have been a mainstay in the world of music distribution for longer than I've been alive, being one of those mediums that refuses to disappear into the sands of time like wax cylinders, cassette and 8-track tapes, and even the CD. People who collect these antiquated artifacts claim that, despite the medium's obvious frailty, the analog sound quality is superior to the digitally-mastered versions that are commonplace today. Records do not like heat, nor do they respond well to people dancing in close proximity. What records do like, if I can anthropomorphize them just a bit, is tactile interaction. What records offer people today is a set of physical objects to enjoy looking at while listening to the music encoded in the grooves of the plastic disc.

A DJ Sets Up Another Record

Over the last three years a number of my favourite musicians have started putting out special, limited-edition runs of their albums on vinyl. The pattern is generally the same regardless of artist or genre. A new release is announced weeks or months in advance alongside the opportunity to buy a record, fans flood the online stores to reserve their copy for anywhere between $40 and $60 USD, then the product sells out within a week. The artist earns a good bit more per album than they would through iTunes or any of the streaming services, and the people who bought the discs are treated to lovely cover art, interesting inserts, and as much analog sound reproduction as their stereos will permit.

It's a pretty good business model.

This desire people have for nostalgic technologies has me thinking about some of the old software that I miss, such as the Graffiti handwriting system that was used on Palm handhelds for so many years. It was, in my opinion, the ideal way to write information onto a tiny computer. On-screen keyboards are fine for a short bit of writing1, but they have never felt natural. Holding a stylus and writing just feels right. The same can be said for home appliances, too. Everything seems to have a complex computer inside running the show, and setting the microwave to heat something is more like programming than cooking … not that a simple dial activating a countdown timer constituted cooking, either.

Nostalgia certainly sells, and often for a premium to the people who are pining for a simpler time when it seemed we all had a great deal more freedom and naivete about the world. What I'm becoming more interested in knowing is whether the technologies of the past really need to be relegated to history when they are still perfectly usable in 2017 and beyond.


  1. I say this having invested thousands of hours writing hundreds of blog posts on the trains over the years via an on-screen keyboard

Five Years Since the Switch

Five years ago today I made the switch from Windows to OS X and, while I've not stuck with Apple's preferred OS on my hardware for the entire duration, the move has been incredibly illuminating. It's been said on numerous occasions on this site, but the first computer I could call my own was an ancient off-brand 8088 back in 1994. It was on that machine that I learned how to program in Turbo Pascal and Watcom through a pair of books that were instrumental to my understanding of software development. From 1994 through to 2012 the core ideas in those books were used over and over again across thousands of projects, evolving as the field developed new processes and techniques. Core to my understanding, though, was that software should be written for the lowest grade of hardware whenever possible. We can't assume everyone has the latest and greatest computer on their desk, nor should we make people suffer for our own impatience at finding the most efficient means to solve a problem. While I still very much stick to this core idea as much as possible, the platform switch in 2012 brought about a slightly different area of focus when designing digital solutions, and it's one I'm still actively learning about today.

For the first eighteen years, my primary goal was to create great software that would solve the problem at hand as efficiently as possible. The interface that people would see was often an afterthought, though I always tried to make the various elements line up and look good. After using iOS for a bit and seeing just how important interface design can be, I started to take it much more seriously.

Design is a funny word. Some people think design means how it looks. But of course, if you dig deeper, it's really how it works.
— Steve Jobs

While watching parts of Apple's big hardware event a few weeks back, I heard them play an audio snippet of the above quote from the late Steve Jobs. A lot of people have heard this quote, but I wonder how many have really thought about it.

At the day job I've been working on a project for almost two years1 where I've tried very hard to think more about how the software is supposed to work rather than how "tight" the code is or how slick the interface might be. When designing new functions and features, I try to collect as much information as I can from the people who will wind up using that part of the system because, at the end of the day, they are my customers. The software is used by people across the organization with very different sets of goals so, sometimes, the best solution is to write two or more interfaces that draw from the same source of data but display the information differently. Sure, everything would work if I were to just make a single, unified view that people could then selectively ignore. But I want the software to actually work … so I ask questions. Lots of them.
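As a minimal sketch of that "one source of data, several interfaces" idea (every name here is hypothetical, not the actual system from the day job):

```python
from dataclasses import dataclass

@dataclass
class Order:
    """A single record from the shared data source."""
    item: str
    quantity: int
    unit_cost: float
    warehouse: str

def logistics_view(orders: list[Order]) -> list[str]:
    """What a warehouse team might ask for: quantities grouped by location."""
    return [f"{o.warehouse}: {o.quantity} x {o.item}" for o in orders]

def finance_view(orders: list[Order]) -> list[str]:
    """What an accounting team might ask for: line costs, no locations."""
    return [f"{o.item}: ${o.quantity * o.unit_cost:,.2f}" for o in orders]

if __name__ == "__main__":
    data = [Order("widget", 12, 4.50, "Osaka"), Order("gadget", 3, 19.99, "Nagoya")]
    # Two interfaces, one source of truth; neither team wades through the other's noise.
    print("\n".join(logistics_view(data)))
    print("\n".join(finance_view(data)))
```

Each audience gets a view shaped by its own questions, while the underlying data stays in one place.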

Despite working with software for almost a quarter century, I still feel there is a lot left to learn. Every platform has taught me something valuable that is just as relevant today as it was days, weeks, months, or years ago when I leaned on it heavily. DOS taught me the importance of efficiency. Windows taught me the importance of object-oriented coding. PalmOS taught me the importance of resource management. Linux taught me the importance of community building. iOS/OS X taught me the importance of designing tools rather than solutions. The web taught me the importance of picking a standard2 and sticking to it. The next area I expect to work in will involve a great deal of voice interaction, and I look forward to the evolution in thinking that move will bring.


  1. it's really hard to believe this much time has passed already

  2. there's an XKCD joke in here …

Elon Musk is Wrong. We Need Killer Robots.

Elon Musk is one of the world's most influential, wealthy, and intelligent people. He's the leader behind Tesla Motors and SpaceX, two companies that I find incredibly ambitious and worth paying attention to. He's also a proponent of using technology for the betterment of our culture and society, which is something that I can usually stand behind and applaud. Over the last few years he's been banging the drum that Artificial Intelligence needs to be controlled and kept non-sentient in order to ensure our species has a future. Recently he and Google's Mustafa Suleyman joined a group of 116 specialists from across 26 countries in calling on the UN to ban autonomous weapons in much the same way chemical and biological weapons are banned from use. While I agree with the idea in principle, this may not be the most logical solution to the problem of nations sending "armies" of robots to war. Until nations have a means to destroy or disable remote or autonomous vehicles en masse through EMP or similar weapons, it only makes sense that militaries not only possess "killer robots", but continue working on improving the software that operates them.

ED-209 and Dick Jones

The wars fought in my lifetime have been mostly commercial endeavours, with parties battling for resources at the expense of human life amid toothless UN condemnations. One of the biggest issues facing nations and megalomaniacs hellbent on annexing territory is the undeniable cost that comes with supplying the people who are pillaging, occupying, and otherwise conquering space on a map. Armies, air forces, and navies cost money. Lots of it. But if one could instead employ machines to clear out territories, the operational costs of skirmishes go way down. More than this, training is essentially reduced to zero, as everything a machine would need to know before going to battle could be stored in memory within seconds. Gone are the days, weeks, or months of training to learn the art of war. "Loyal" machines could be built by the thousands with each passing day, allowing a hostile force to overwhelm the defences of all but its most powerful adversaries.

A lot of people think North Korea with ICBMs and miniaturized nuclear warheads is a problem. A legion of drones carrying several thousand rounds of ammunition, mini-missiles, and a kamikaze sensibility to use every last bullet before the batteries run down would be just as terrifying. These could be built in secret and deployed under the radar, catching nation states completely unaware until the death toll was in the thousands, and leaving infrastructure in place for the encroaching power to occupy territory without having to rebuild roads, power grids, and telecommunications lines along the way.

Wars of the future will be absolutely terrifying, and human soldiers alone are simply not enough to combat a sight as horrific as 50,000 drones flying like locusts into the heart of a city while firing indiscriminately at anything that moves.

Rather than prohibit killer robots, we should enlist the best people to build them while following Asimov's three laws of robotics:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
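
Purely as a toy illustration of how that strict ordering might be expressed (nothing here reflects how real autonomous systems are built, and every name is invented), each proposed action is vetoed by the highest-ranked law it violates:

```python
def permitted(action: dict) -> bool:
    """Toy check of Asimov's three laws, evaluated strictly in priority order.

    `action` is a hypothetical description of a proposed act, e.g.
    {"harms_human": False, "disobeys_order": False, "self_destructive": True}.
    """
    # First Law: never injure a human, or allow harm through inaction.
    if action.get("harms_human") or action.get("allows_harm_by_inaction"):
        return False
    # Second Law: obey human orders (only reachable if the First Law is satisfied).
    if action.get("disobeys_order"):
        return False
    # Third Law: self-preservation, unless a human order requires the sacrifice.
    if action.get("self_destructive") and not action.get("ordered_by_human"):
        return False
    return True

# A defensive drone may sacrifice itself when ordered to shield civilians …
print(permitted({"self_destructive": True, "ordered_by_human": True}))  # True
# … but no order can make it fire on a person.
print(permitted({"harms_human": True, "ordered_by_human": True}))       # False
```

Even in a sketch this crude, the point of the ordering is visible: no instruction lower in the list can override one above it.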

These "laws" are hardly perfect, but they give us a really good place to start. If a hostile nation were to send drones into battle, either to conquer or as a terror operation, domestic drones would stand the best chance to provide the best line of defence until the military proper, staffed with humans, arrived. Domestic devices could provide cover while populations escaped. Domestic devices could drastically reduce the number of hostile robots targeting the civilians and key target areas. Domestic devices could buy time. UN laws alone are simply insufficient to prevent someone who cares little about the rule of law from exercising might.

People who know me will understand that I don't lightly propose the creation of machines that kill. I'm hardly a pacifist, but I have a strong distaste for mechanical armies fighting our battles, as it cheapens the entire act of war. There is little substantial cost to a government that sends a million machines to fight a battle rather than a million people. For this reason, it only makes sense that nations defend themselves from phalanxes of artificial troops. When better countermeasures such as targetable EMPs (electromagnetic pulses) and other intelligent mechanisms are in place, then nations can look at fully outlawing the use of machines in war. A drone is not the same as a chemical weapon. A drone is not the same as a biological or nuclear weapon. A drone, autonomous or otherwise, is a completely different type of threat, and one that should be met with whatever force is necessary until better defences are available.

At the end of the day it's not AI that we should be outlawing, strictly regulating, or blindly fearing; it's our fellow humans.

Auto Rejection

One of the many benefits of taking job interviews despite being employed is seeing what other companies are working on and how your existing skill set might be challenged in such an environment. This was certainly the case when I responded to a headhunter's email last week regarding a position at a company in town that is working on AR — Augmented Reality — technologies.

AR is something that I've looked at without ever seeing the allure. Asking people to wear a heavy, uncomfortable headset or burn through a full phone battery within an hour, to contend with visual distortion and perspective shifts, and to put up with generally unimpressive graphics, all while embarking on some kind of lame game or unrealistic experience, is not something I'm interested in. Sure, we're still in the early days of the technology and there are teams of brilliant people around the world working on making it all better, but I've just not seen something interesting enough to capture my attention. What interested me about the headhunter's email was that the position I'd be interviewing for would involve building the infrastructure and the API that would interface with the AR tools being built by another team within the company. There would be plenty of opportunity to work with cutting-edge hardware and build brand new software tools. Regardless of how one feels about AR, building new things and pushing one's boundaries is really, really cool.

So with this in mind, I decided that it would be silly to ignore the window of opportunity and agreed to meet with someone from the company in order to see if their goals aligned with some of my own.

The meeting took place at their office, and a cursory glance around the place revealed that most people were using Dell workstations with two or three monitors attached. Headphones were everywhere, so there's likely very little conversation taking place during much of the day. Most chairs looked slept in, and the air carried a whiff of sweat and parking lot odours, which is to be expected in the summertime. The person I met with was the CTO but, when I learned the company had fewer than 20 employees, the title seemed a little inflated. Small companies are great, as every person counts, but the titles are almost meaningless. Roles are more cooperative and dynamic.

Semantics aside, the meeting went rather well. The host arrived ten minutes late and, as we're both roughly the same age, we started by talking about tech in the 90s and what we'd like to accomplish. Later I talked about 10Centuries and its objective to be a non-profit service available for 1000 years, as well as some of what I'm doing at the day job. He talked about wanting to make holodecks, replicators, and transporters from Star Trek. An enjoyable discussion, though a bit rough at times as my Japanese is nowhere near native level.

Eventually we got to the subject that mattered most: what the company needed and whether I would be a good fit within the organization. The role was explained quite clearly, and I outlined some possible solutions to problems that were brought up, such as how to offload some of the AR work from a cell phone to a web server without introducing too much lag or requiring the web service to buy an entire data centre. Things were going well, up until one of the company's current projects was discussed1.

The first project would be an AR "imaginary friend" system. A character with the appeal of Clippy and an anime body of your choosing would essentially be a semi-interactive avatar that simulates a housemate. The system would be targeted at seniors living alone — whom I can't imagine looking through a phone all day long — and NEETs who never leave their apartments but need companionship. My job would be to work with a team to build an API that takes the visual data from cameras in the area, generates a 3D map, stores it on local servers, and then gives the digital pet a floor map on which it is expected to walk and avoid obstructions. Aside from the storage of detailed maps of the inside of people's homes, complete with geolocation data and other highly sensitive information, this doesn't sound too crazy, and it could be an interesting challenge.
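To make the pipeline concrete, here is a rough sketch of what the camera-to-floor-map step might look like. This is my guess at the shape of the problem, not the company's design, and every name, threshold, and data structure below is invented:

```python
from dataclasses import dataclass, field

@dataclass
class FloorMap:
    """A coarse 2D occupancy grid derived from the phone's depth data."""
    width: int
    height: int
    blocked: set = field(default_factory=set)

    def walkable(self, x: int, y: int) -> bool:
        return 0 <= x < self.width and 0 <= y < self.height and (x, y) not in self.blocked

def build_floor_map(depth_points, cell_size: float = 0.25) -> FloorMap:
    """Collapse a 3D point cloud from the cameras onto a 2D grid of 25 cm cells.

    Points below roughly avatar height (1.5 m) become obstacles to route around.
    Doing this server-side means the phone only streams points up and pulls a
    small grid back down, which helps keep the round-trip lag manageable.
    """
    grid = FloorMap(width=40, height=40)
    for x, y, z in depth_points:
        if z < 1.5:  # something solid at avatar height: furniture, walls, people
            grid.blocked.add((int(x / cell_size), int(y / cell_size)))
    return grid

# The phone uploads (x, y, z) points; the avatar then queries walkability:
fmap = build_floor_map([(1.0, 1.0, 0.4), (2.0, 3.0, 2.1)])
print(fmap.walkable(4, 4))   # False: a low obstacle occupies that cell
print(fmap.walkable(8, 12))  # True: open floor
```

Even a sketch this small makes the privacy question obvious: that grid is a persistent, queryable map of someone's home sitting on a server.

I had a question.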

This sounds very involved. The technologies required to make this a reality will not be cheap. How much will this software cost?

One of the biggest problems facing software companies is the lack of income from the people who use their services. Customers do not want to spend money on applications anymore, regardless of how much time and money went into developing them. Everything is expected to be "free".

This project was no different. I was told the app would be available for some Android and most iOS devices for about $3 after a promotional period during which it would be free. After that, people would be encouraged to buy their avatar things from the in-app "store", such as clothes, treats, voices, and other add-ons. Interestingly enough, the avatar itself would ask you to buy these things if it didn't feel wanted.

Emotional blackmail as a service, anyone?

The goal would be to reuse the maps and character interactions from this system later, when the company tries to build a complete VR game with very realistic avatars to interact with. We didn't talk too much about that project, though, and stuck to just this initial idea. The more questions I asked, the more it rubbed me the wrong way. The avatars were primarily gold diggers and secondarily spies, returning incredibly detailed inventories of people's homes to remote servers. What's to stop anyone from "adding value" by delivering ads as verbal suggestions? Agreeing could then trigger a purchase, which could then result in the delivery of that product. Convenient? Sure. A little too convenient, though … no?

I'm all for providing some kind of companionship to people in need. Heck, done right, something like this could act as a confidant to people all over the globe. However, this company seemed to have its heart set on making this AR system into an ad delivery mechanism. When I asked whether the service might have a subscription option for people who didn't want to buy digital goods, the answer was a pretty quick "no", as "subscription services don't enable growth".

As the meeting wrapped up, I was asked if I'd like to come in next week to meet some of the team and ask some questions. Not being one to reject right away, I asked for a day to check my work schedule. Soon after leaving, I shook my head and decided this wasn't a place I'd feel comfortable working at. Just because the technology is possible and the tools are sophisticated enough to accomplish this sort of goal does not mean I want to be part of its creation or propagation. It all seems so … inhuman.

One saving grace is that I don't need this job. I still have full-time employment and am earning a good amount for what I do. Not everyone is fortunate enough to have this sort of luxury. This does raise a question, though: if I were unemployed or still working in the classroom rather than behind a keyboard, would I accept this job? I've turned down others that did not align with my moral beliefs, but I wasn't responsible for a little human back then. I could certainly work at a place I objected to if need be, but for how long?

Of course, given how many organizations are using technology to strip away the last remnants of personal privacy and dignity in the name of "convenience" or "shareholder value", I wonder how long a career in technology I might have …


  1. I was not asked to sign an NDA, so I'm free to talk about the project in detail. That said, I'll keep it vague in the event there are problems going forward.

Eschewing Today

A few days ago The Guardian published a short piece from Mark Boyle, a man who has decided to turn his back on modern technology and live completely off the grid. He's living in a cabin built in a very traditional manner without phone, electricity, or even running water. In his own words, he uses nothing "requiring the copper-mining, oil-rigging, plastics-manufacturing essential to the production of a single toaster or solar photovoltaic system." If Mr. Boyle can live the life he chooses without any of today's modern conveniences, then I hope he finds the happiness he seeks. He is not the first person to have an "I Quit Technology" piece published in the paper, and he certainly won't be the last. What I often wonder, though, is why so many of the people who decide to walk away from modern conveniences think of today's tools in such a negative manner, and why so many do it on their own rather than join a community of like-minded individuals.

I decided to eschew complex technology for two reasons. The first was that I found myself happier away from screens and the relentless communication they generate, and instead living intimately with my locale. The second, more important, was the realisation that technology destroys, in more ways than one.

— Mark Boyle

The quote above is something that I've heard a lot over the last 20-odd years as computers and the Internet have become more a part of everyday life, and the people who say it tend to think of these tools in a very binary manner. Either we use absolutely everything a piece of glass and plastic has to offer, or we use nothing at all. There doesn't seem to be a middle ground.

In the grand scheme of things, I agree with the idea that people can be happier away from the relentless communication enabled by always-connected notebooks and smartphones, which is why just about every notification on my computer and phone is disabled. I don't need a bunch of apps vying for my attention, nor do I feel the need to be completely up to date all the time. Doing so is exhausting and ultimately pointless if you don't make a living writing for gossip sites. For this reason, people can get my attention with a direct phone call, an email to a specific address, or a message on Skype. Otherwise, I'll see the message when I see the message. After doing the same for social services and RSS feeds some five years ago, my general mood improved and I started being more present and offline while out and about in the world.

As for Mark's "realisation that technology destroys", I fail to see how this is even remotely accurate. Technology couldn't care less whether it's used creatively or destructively, for good or for evil, to help or to hinder. The tools that we create — and technology is a catch-all term for every tool we've created — simply enable humans to accomplish the goals they set for themselves. This isn't to say that all technology is inherently beneficial or the best use of resources, but how often does humanity make the most of anything? We are an incredibly wasteful species with a tendency to destroy wonderful creations for our own purposes. Making a sweeping statement like "technology destroys" is no different from saying "men kill". It's a poor rationalization of a far more complex concept.

Reading through Mark's reasoning behind the "technology destroys" statement, I get the feeling that he's more disgusted with consumerism than technology. He's disappointed with the way people interact more with glowing screens than with each other. I feel much the same way, as our excessive consumerism and isolationism do have a very clear impact on the world, but I wouldn't dream of turning my back on everything just because I'm upset with how other people use the tools I was once openly mocked for developing. Instead, I can try to lead by example and encourage better technology recycling and offline interaction habits. Unplugging from the world is no different from sticking our heads in the sand, so to speak. It makes absolutely no difference in the grand scheme of other people's lives.

One Needn't Go It Alone

Amish Life

I'm sure we've all wanted to get away from it all at one time or another, perhaps escaping to a tropical atoll in the middle of the Pacific like Tom Hanks in Cast Away, living completely off the land and eking out a simple life without the endless distractions that make up so much of our day. While this can sound attractive at times, it sets us up for failure in a pretty big way. The world has been tamed in many ways, but it's still a hostile place for those who are under-prepared or over-confident. Going it alone and living off the grid can leave a person or family susceptible to an awful lot. This raises another question I've had for people choosing to reject modern technology and live like our ancestors: why not really live like those who came before us and join an Amish community?

While it doesn't happen very often, people are more than welcome to join the Amish community so long as they commit themselves to following the Ordnung. Wikipedia does a great job of explaining it:

The rules of the church, the Ordnung, must be observed by every member and cover most aspects of day-to-day living, including prohibitions or limitations on the use of power-line electricity, telephones, and automobiles, as well as regulations on clothing. […] As present-day Anabaptists, Amish church members practice nonresistance and will not perform any type of military service. The Amish value rural life, manual labor and humility, all under the auspices of living what they interpret to be God's word.

So, be a true member of the church, be an active member of the community, follow the rules, and be a shining example of humility. In exchange you get to be part of something bigger than a single person could possibly be alone, and you get to contribute towards something worthwhile. While some Amish affiliations will permit limited use of technology, some outright reject the vast majority of it, as the chart below shows.

Wikipedia — Amish Affiliations

Being an island unto oneself seems awfully selfish and more wasteful than being part of a community. The great thing about the Amish is that, even if you choose not to join their religion and live life the way they do, they'll gladly let you join in many of the community events so long as you're civil.

To What Lengths Will One Go?

The last question I usually have for people who want to reject tech is whether they'll reject modern science, too. Modern technology is ultimately the result of modern science. We couldn't have any of our digital tools without a deep understanding of physics, and we couldn't have the plethora of plastics and alloys we use without a solid grasp of a lot of different fields. Rejecting technology may as well be the rejection of modern science and, if that's the case, at what point will one draw the line to say "any science from after this time in human history is not for me"?

At the end of the day, everybody is free to make their choices. So long as nobody is hurt as a consequence, my opinion of these choices is neither here nor there. I do wonder whether extreme decisions are made without a complete understanding of what one is rebelling against, though. Technology can be used for good. Technology can bring communities of people together in unique and wonderful ways. Technology can help extend our lifespan. Technology can ensure our food is clean and safe. Technology can provide for us a toilet that doesn't smell like a porta-potty after a week at Burning Man.

Ultimately, the problem isn't the technology. It's how we decide to use it, and how we decide to perceive others who use it differently.

The Future of Space Exploration

Shenzhou 11 successfully reached orbit a few hours ago, carrying two crew members to the new Tiangong-2 space lab to conduct 30 days of tests. This is the sixth crewed mission China has sent into space, and it's a great step towards the normalization of the country's space ambitions. Considering the direction of other nationally-funded space programs around the world, China is certainly proving to be an ambitious nation with sights set firmly on the future. Regardless of how all of the commercial space projects develop in the coming decades, I really feel that China is setting itself up to be the go-to country for everything space-related before 2050.

The Long March 2F Rocket — Shenzhou 11 — Launches Into Space

Although I've been pretty critical of China's political actions here on planet Earth over the last few years, the strides they've made in space technology have been nothing short of amazing. Standing on the shoulders of giants, they managed to send their first crewed vehicle into space within a few short years of the program's start and have since made steady progress towards supporting a permanently habitable station off the planet. Regardless of how they've gone about acquiring their knowledge, they've proven themselves committed to gaining a foothold in space and, looking at their future goals, a lot of humanity's physical expansion into the solar system may very well happen under a Chinese flag.

Pushing the Boundaries

China has made their intentions very clear. They hope to have a rover on Mars within the next five years, and human settlements on the Moon soon after. A lot of the people I've spoken to about China's ambitions look at them with suspicion, as though China is going to lay claim to vast sections of the solar system1 or act as a gatekeeper, preventing vehicles from other nations from accessing the void beyond Low Earth Orbit2. While it's certainly a good idea to keep an eye on every nation that invests a great deal of time and money into space programs, we needn't eye everyone as a threat. In fact, China's unbridled desire to be the best will ultimately be a good thing for the rest of us. Just look at all of the modern conveniences we have today as a result of NASA's various projects. There's no reason why Chinese companies could not do the same, providing useful tools and products that are used by people all over the planet.

Ultimately, though, the one area where I can see China revolutionizing space technology is propulsion. If any organization is serious about sending people to planets and moons, we simply must have a way to get to these places faster. Even when the planets are aligned correctly, a minimum-energy trip from the Earth to Mars takes the better part of a year. That's just too long a timespan if there is to be any sort of commercial interest. There's a vast asteroid belt sitting between the orbits of Mars and Jupiter that is theorized to hold an immense supply of accessible metals and other raw materials. The Chinese government isn't stupid. These resources — while not currently economically viable for Earth-based consumption — will undoubtedly result in massive profits for the group that learns how to collect and refine them before anyone else … which is where I see China going in the next century.
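
For the curious, the back-of-the-envelope behind that transit figure, assuming an idealized minimum-energy (Hohmann) transfer between roughly circular orbits:

```latex
% Semi-major axis of the transfer ellipse (orbital radii in AU):
a = \frac{r_{\text{Earth}} + r_{\text{Mars}}}{2} = \frac{1.00 + 1.52}{2} = 1.26\ \text{AU}
% Kepler's third law (period in years when distances are in AU):
T = a^{3/2} = 1.26^{1.5} \approx 1.41\ \text{years}
% The one-way trip is half of the transfer orbit:
t = \frac{T}{2} \approx 0.71\ \text{years} \approx 8.5\ \text{months}
```

Real missions land in the six-to-nine-month range depending on the launch window and how much propellant they spend, which is exactly why faster propulsion would change the economics.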

Humans will always need resources, and China is very, very good at extracting them. By sending rovers to planets and moons they gain useful experience aiming for fast-moving rocks in the endless void of space, as well as building machines sturdy enough to survive the inhospitable environments they'll encounter. While space is said to be a vacuum, it's not without its dangers. Radiation, micro-meteorites, larger meteoroids, and things we don't even know about yet can destroy a lot of equipment. Learning how to build better machines will result in better returns very quickly. And then comes the question of speed …

Asteroids in the Solar System

If — and I realize I'm hypothesizing here — China becomes very good at building semi-autonomous rovers that can land on asteroids and extract, refine, and store those resources, they will position themselves as the economic lords of the first generation of space explorers. Need water, oxygen, platinum, silicon, or any other resource in a hurry? Well … China can get it to you in a hurry, so long as you're willing to pay for it. The asteroid belt sitting between Mars and Jupiter is huge. Having factories placed throughout the area would make it possible for resources to be dispatched and received in months rather than years, and having machines go out and pre-collect these resources makes for a sound investment, one that would pay for itself a hundred-fold if not more.

Looking at China's rate of technological advancement and the various projects that are championed at some of the more prominent Chinese universities, it's clear that this is the direction the country's government is leaning. It makes sense, too. Good business sense. While humanity may not be clamouring to colonize the solar system like popular culture encouraged in the 1970s, it's only a matter of time before we do take our first tentative steps off this lush blue planet. But, when we do, China will be there waiting.


  1. This would be legally impossible due to the Outer Space Treaty, to which China has been a signatory for decades.

  2. This would be logistically impossible given the current state of our technology.

Indistinguishable from Magic

Earlier today Matt Gemmell wrote an impassioned article on his website decrying the current state of Apple's software and how it makes very expensive electronics horribly frustrating to use, and even harder to justify buying. The post has been taken down1 but, in the short time that it was available, people from across the planet weighed in with their opinions about how "Apple is doomed", offered technical support, or otherwise openly mocked this person who has clearly invested a great deal of time, money, and effort into the wholly artificial realm we refer to as the Apple Ecosystem. Like Mr. Gemmell, I have also invested a great deal into Apple's hardware and software over the years. I came in just as it seemed that everything Apple touched turned to gold, and I left when my rose-coloured glasses came off, revealing that Apple's world was just as incomplete and prone to error as any other platform a person might choose.

Any Sufficiently Advanced Technology …

One of the many things I like about modern hardware is just how sleek everything has become. Smooth lines with seamless transitions between metal and glass abound. Processors thousands of times faster than anything we might have used while growing up can now be found in a wristwatch. Batteries can go on and on. And the screen resolutions available today are really nothing short of breathtaking. Our consumer electronics are light years ahead of what we could have imagined 15 years ago. Looking at the sorts of complaints people have with the current range of devices, I wonder if this isn't the actual problem we're facing: we've benefitted so much, from so little, in such a short amount of time that we don't realize just how much "more" we're expecting from electronics manufacturers, and these vendors are simply unable to keep up with our laundry list of expectations.

It's Time We Slow Down

When I first entered the world of Apple software, I was pretty fortunate. The iPod Touch, though a marvel of engineering, was a relatively simple device running simple software and performing simple tasks. The first version of OS X I used was 10.6 Snow Leopard, arguably one of the most refined versions of the operating system given that it shipped with zero new features and a slew of fixes, a focus that resulted in standing ovations at Apple's Worldwide Developers Conference all those years ago. Coming from Windows, the Apple world just felt so much more cohesive because updates rolled out at a slower pace than those from Microsoft. With more time between releases, new updates and features could be more thoroughly tested and refined, meaning that the majority of the bugs reported by people would — ideally — be more the edge case than the rule. This is clearly not the case anymore, and not a week goes by in which some semi-popular web personality doesn't lament Apple's declining software quality or lack of cloud infrastructure skill.

One might suggest that people who aren't happy with Apple should move to another ecosystem that is more in line with their values, but this isn't always realistic. One cannot simply swap out Google for Apple or Oracle for Microsoft. These sorts of migrations often take a great deal of time, planning, and money. More than this, people shouldn't be expected to always vote with their wallets. Moving from one platform to another is not a feat that can be completed in a weekend, nor is it an effective way to send a message. At the end of the day, the best way to send a message is to openly communicate, and I say it's time people stop demanding so much pseudo-innovation from their electronics and instead ask companies to slow down and release their hardware and software products when they're ready, and not a minute before.

Traditional computers such as desktops and notebooks do not need to be replaced every year, so companies should focus on building machines to last 5+ years. Cell phones do not need to be replaced every year either, so why push out a device that appears to be little more than an incremental update over last year's model, but with a substantial price tag? Tablets and watches are in the same boat. Heck, just about every electronic device we use on a regular basis could likely move to a new model every three years, with minor revisions during the intervening months to accommodate problems. Many people will undoubtedly disagree, but today's hardware is good enough for the vast majority of what we want to do. Why must we continually push the envelope?

With a slower hardware release cycle, software developers would have more time to focus on the less-tangible aspects of our systems and strive to make improvements. Some new things could certainly be introduced along the way, but the in-between years would really be for polishing and refining the devices while people become more accustomed to using the tools they already have.

The amount of change that we've all seen in the last fifteen years has been nothing short of amazing, and this change has had some phenomenal effects on societies around the world. Slowing down won't diminish anything that's happened thus far, and we stand to gain greater benefits if the functional lifespan of our difficult-to-recycle electronics is extended by a slower release schedule. Will this hurt the bottom line for a number of people who are already wealthy? No doubt. But we cannot expect our tools to continue their evolution at such a breakneck speed. We're seeing the faults and cracks in our systems already, but it's not too late to do something about it.

A three-year hardware release cycle with a longer software cycle would go a long way towards changing our perception of modern electronics from the occasionally frustrating objects they are to devices indistinguishable from magic.


  1. Though intrepid investigators may have some luck in finding a cached version of his article, titled A Declining Trajectory.