As a programmer I learned the basics of optimization: if you're going to need the same data a million times, prepare it once, then reuse it. In other words, if you do a thing once instead of a million times, surprisingly enough it finishes faster overall.
Why is it that geeks who seem to understand this concept perfectly when it comes to machine code just do not get it when it comes to interaction with other human beings?
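The prepare-once pattern is essentially memoization. A minimal sketch in Python, with a made-up function standing in for the expensive preparation:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def prepared(n: int) -> int:
    """Stand-in for an expensive preparation step."""
    return sum(i * i for i in range(n))

# The first call does the work; every later call with the
# same argument just reuses the cached result.
for _ in range(1_000_000):
    prepared(10_000)
```

The loop body runs a million times, but the sum is computed exactly once.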
I have always been a ruthless geek. Mind you, for a full year I ran a Linux distribution as my main desktop system (Mandrake, then Debian). That was around 2002, and I probably admired Linus more than Steve Jobs back then. It seemed natural to me to know which type of RAM each PC used, and I could never have imagined that ten years later I wouldn't remember, or much care, which GPU is in my laptop.
Then, after struggling for a year editing text files here and there to get my X display back, and after having successfully compiled my 802.11b modules as explained in one of the 236 contradictory tutorials around, I had suddenly had my share of the constant maintenance a Linux desktop required. I realized I was getting a bit old to waste my only life fiddling with my system. Maybe it was preventing me from learning more important things.
When my laptop died I just didn't have the motivation anymore to go through the endless search for the perfect system, and my next PC naturally came with Windows® installed. Once I changed the desktop background it was still ugly, but usable. I could write code. I didn't like Windows, but it worked fine. I kept a Debian/Ubuntu partition around for a while, hoping things would get better. Some of them did.
So about ten years have passed. Yet if you do a Google search today, there are still zillions of recent tutorials explaining how to do things on one Linux distribution or another. Most of them are about getting hardware or software to work. The result of these procedures is not some amazing achievement. You're not installing Linux on your alien wristwatch; you just get to use software or hardware that you couldn't otherwise on this particular computer or distribution.
One thing strikes me: out there, thousands of people are following tutorials to solve the exact same problems. Just like I did years ago. Instead of having someone responsible for fixing the root of the issue (also known as a rewrite), a seemingly infinite number of people keep on patching things, hoping that one day everything will be fine. But it just never is. If it were fixed, there wouldn't be so many tutorials. It may have the appearance of knowledge, but if it is outdated a month later, it isn't worth much more than your grocery shopping list.
Linux users are stuck in a for-loop, duplicating the same processing over and over again, because nobody simplified and optimized the process in the first place. This has to be the worst optimization of resources in recent history.
The Linux world is an extreme case. But this syndrome of useless repetition of tasks affects us in daily life even in the most basic iPhone apps.
Let's say we want to build a weather app: a French developer would set Celsius as the default, and an American would set Fahrenheit. Both would be wrong. To save himself the two lines of code that would pick an appropriate default from the locale, the developer creates an unnecessary effort for potentially millions of users.
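Those two lines are roughly this, sketched in Python for brevity rather than an iOS language; the region list is an illustrative assumption (the US and a few others), not an exhaustive one:

```python
# Regions that customarily use Fahrenheit (assumed short list; a real
# app would consult the platform's locale / measurement-system API).
FAHRENHEIT_REGIONS = {"US", "BS", "BZ", "KY", "LR"}

def default_temperature_unit(locale_code: str) -> str:
    """Pick the unit the user most likely expects from a locale like 'en_US'."""
    region = locale_code.replace("-", "_").split("_")[-1].upper()
    return "fahrenheit" if region in FAHRENHEIT_REGIONS else "celsius"

print(default_temperature_unit("en_US"))  # fahrenheit
print(default_temperature_unit("fr_FR"))  # celsius
```

The user who wants the other unit can still flip a setting; everyone else never has to.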
Who cares? It's one checkbox to click, right? It may be three taps in your app, but multiply that by every interaction a user has each day, and you might be one of the drops in the ocean of confusion that prevents people from learning a whole new skill. All because they wasted their precious time doing something they might not have needed to do at all.
If there is one good thing a software designer can do, it is to think ahead and solve people's problems without them ever knowing. Think of yourself as Batman, quietly saving people in the night. They will never know, but the world will be better off. And no, you don't need the costume. You hopeless nerd.
Note that this is not necessarily opposed to choice. If you want to put options in your apps, do it; some people will like them. Just make sure the majority will never need to use them.
I was recently reading Slashdot user comments on Steve Jobs' death. It was painful and yet amusing. Most dismissed him as a "fashion designer" or just some "marketer". Blasphemy, I tell you, blasphemy!
Imagine spending 20 years of your life studying Mandarin, and then along comes a guy with a $99 device that translates your thoughts perfectly into all languages. Wouldn't you somehow hate the man who created it, after you learned every one of those very complicated symbols? And wouldn't you denigrate all the "idiots" who just use the bloody device and are suddenly your equal?
Computer geeks usually despise Steve Jobs not, as they like to tell themselves, because he supposedly took liberties away from them – by offering a set of products they don't buy – but because his life's work was to destroy the complexity they held as their pride. Jobs was one of those who wanted to allow everyone to use computers. He did so because he was more interested in listening to the music he loved than in the AAC compression algorithm, which is more or less the definition of a normal human being. He didn't make computers easy with a snap of his fingers. A lot of people worked to make computing simpler, including Bill Gates. Jobs got more credit because he was simply better at it.
Programmers will look at iOS and be bewildered at the disappearance (user-wise) of the general-purpose file system. The others will be thankful that this thing they never really understood went away, and that they now have a thing called "photos" and another called "music". For those born afterwards, managing your files manually will be akin to using a Terminal today: a programmer's thing.
Some talk with pride of their shiny Android devices, boasting about their numerical specs, and could spend a night discussing which is the best task-killer available, or the best backup solution, or how well they have configured it all. The others have either never heard of task-killers or just wish they didn't exist. They wish backup didn't need to be set up: that it would simply be there already, appearing only when you thought you had lost your data. Batman-style.
This notable participation in the common search for simplicity – not dumbing-down – is why a lot of people feel sad that Jobs is gone. Simplicity in computing is the printing press of modern times. It spreads knowledge to the masses and optimizes the life of all humanity, so that great minds can dedicate themselves to great discoveries instead of worrying about their tools.
Speaking of tools, what is Richard Stallman up to these days? Still browsing the web via e-mail in a text console I guess.