In a previous post, I discussed the factors that allow small teams to create products that can reach millions of users within a few months. In this post, I want to take a deeper look at why consumers are so much more comfortable with technology now than they were 20 years ago, and try to see where that leads. Since customers drive our businesses’ growth, we need to understand what shapes their behavior in order to plan for the future. Wayne Gretzky’s father famously said, “A good hockey player plays where the puck is. A great hockey player plays where the puck is going to be,” and I’m hopeful that we’ll be able to see where the consumer puck is going to be.
To me, the major driver is Moore’s Law. We’ve seen computing power roughly double every 18 months for the past 50 years. This has obviously given us faster computers, but it has also driven costs down exponentially. That combination has been a huge economic driver and is making computers more accessible than ever: our cellphones are more powerful than the computers used to land on the moon. These gains in computation also led to the rise of the modern web, which went from a military/academic project dealing in text to something that distributes pictures and video to anyone who’s interested.
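To put that doubling in perspective, here’s a quick back-of-the-envelope sketch (my own, not from the original post, and it takes the 18-month doubling period and 50-year span above as given assumptions):

```python
# Back-of-the-envelope: cumulative speedup implied by a doubling every 18 months.
# The 18-month period and 50-year span are the post's assumptions, not exact history.

def cumulative_factor(years: float, doubling_period_years: float = 1.5) -> float:
    """Total multiplier after `years` of doubling every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

if __name__ == "__main__":
    factor = cumulative_factor(50)
    print(f"50 years of 18-month doublings ~= a {factor:.2e}x increase")
    # Roughly 2^33, i.e. about a ten-billion-fold increase in raw computation.
```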
More importantly, improvements in computation led to improvements in usability. Even if we had had modern browser standards like CSS3 and HTML5 in the 1990s, our computers would have been too weak to handle them. We would not have any of the modern innovations (AJAX, DOM manipulation), and our web pages would be static, without any rich media content. If we had never gotten past the command line, how many people would have computers in their homes? How many smartphones would exist? I’d argue that usability improvements are what led to the massive consumer adoption of tech products. Of course, computation, cost, and usability are all intertwined, but computation and cost alone would not have produced the consumer adoption we’ve seen.
What does this mean for the future? I see usability becoming even more natural, to the point where we don’t realize we’re using a computer at all. We’re already seeing this emerge with Siri and Google Glass. As long as our computation speeds continue to improve, these technologies will get better and better and recede further into the background. Of course, this all depends on Moore’s Law holding, and many say the pace will slow by 2020. I’m optimistic that we’ll come up with something, but even if we don’t, as long as the energy cost of computing keeps dropping, per Koomey’s Law, we should still see the benefits as we move more and more computation to the ever cheaper cloud. It’s difficult to imagine what would happen if our computation speeds stopped increasing the way they have over the past 50 years.