The year is 2019, the month November. Cars fly through the skies above your head, advertisements beckon from every direction inviting you to colonies on other planets, and a breed of life-like androids called Replicants serves their human masters. You might recognize this as the classic sci-fi movie Blade Runner. I watched it again recently and was struck not by the acting, story, or deeply detailed imagery (all of which are great, by the way). What really caught my attention this time was the year mentioned at the beginning. Here we are in 2014, only five years removed from this imagined future vision of humanity.
In a conversation among family a couple of weeks ago, the topic of “the cloud” came up. The term is ubiquitous, and nearly every tech company extols its virtues, yet what it actually means is ambiguous. The concept of serving files and applications from a centralized server has been around since the earliest mainframes. Connected computing resources accessed via a standard protocol have existed since ARPANET in 1969. As the Web grew, many companies launched Application Service Provider (ASP) hosting services that in some ways resembled the current cloud model. So this raises the question: is “the cloud” really a unique technological innovation, or merely a marketing term?
A few years ago Bruce Gibney of the Founders Fund wrote a manifesto about the future, summed up by the quote, “We wanted flying cars, instead we got 140 characters.” It was an incendiary commentary on the current state of innovation in Silicon Valley, which seems more interested in quick hits and simple apps than in solving “real problems”. It ignited critics on both sides of the debate. Some decried the lack of real risk-taking by investors in cutting-edge technologies and more pressing societal challenges. Others noted that what may appear to be silly apps can later become groundbreaking and transformative technology platforms, as Twitter did.
I would not argue that truly innovative and disruptive technologies do not emerge in today’s environment, because that is obviously not true. Just look at how the Web upended entire business models, and how mobile is creating further disruption across every area of industry. The way we travel, the way we eat, the way we buy and sell goods, and the way we work have all felt the march of technology.
What we often think of as leaps and bounds in progress is more often the result of many smaller, iterative discoveries that, taken together, form a steadier path of progress. Sometimes fortunate accidents lead to revolutionary discoveries, like penicillin. Most of the time, though, what we perceive as great leaps is the consequence of many congruent, modest innovations coming together at the same time.
Take the “cloud” as an example. Yes, it is groundbreaking when observed holistically. However, was it a huge stepwise jump or simply a consequence of many other ideas coming to the forefront? Open source server operating systems, faster broadband access, multi-tenancy, and the model of “renting” software all needed to reach a certain level of maturity before companies could bet the farm on building businesses on top of them. Taken piecemeal, those were important technologies and ideas, but not necessarily groundbreaking by themselves. This is why generally smart folks such as Larry Ellison could be “cloud deniers” for so long.
It may be funny to admit now, but when I first heard of Twitter, I thought it was the dumbest thing I had ever heard. What good could come of posting pithy messages, like what you had for breakfast, to a bunch of random people? Six years later, I can honestly say that Twitter transformed my life, from the people I met to the new ideas I discovered. It also transformed entire countries through revolutions, became a key source and distribution channel for news, and changed the way we think about communication in the modern age.
Picking what is innovative versus what is not is a tough task when living in the moment. Sure, another photo app or subscription-goods company probably is not about to change the world. Then there are apps that simply send “Yo” across the Internet. How often do we laugh at things that seem so pointless? The court of public opinion has become even more reactionary and polarized because of the constant stream of communication at both higher volume and higher amplitude. So instead of considering the context and giving new ideas time to grow and take shape, we attack them like a piñata (or cast them as alien outcasts, like the bizarre GE ad).
On the other hand, what we see as obvious “innovations” at the time often do not pan out as expected. We only have to look at the dust heap of tech duds and product bombs of the past few decades to realize that our ideas of innovation are often as fanciful and misplaced as flying cars. It is not necessarily that the technology is bad so much as it is a stranger in the marketplace. Techies get it, but the broader market is simply not ready.
As Apple rolled out its much-anticipated smartwatch today, I could not help but think of the iPad’s launch versus the Newton’s in the ’90s. The iPad was a massive hit, whereas the Newton was the epitome of a tech dud that nearly sank the company. The jury is still out on the smartwatch. It reminds me of my time at Siebel, where we had what I thought was the coolest product, Siebel Voice: a voice-command interface for interacting with Siebel from a cell phone. The only problem was that beyond the demo, it simply did not work for users. The context, timing, and technology maturity were not there for something so vastly different from the norm. Now, with more powerful smartphones and technologies like Siri and Google Now, such an interface makes total sense.
So what do an ’80s sci-fi movie, the cloud, and the debate about “real tech” have in common? It all boils down to innovation and how we decide what truly is novel and world-changing. Flying cars seem fascinating and whiz-bang compared to a social network, but are we looking at the question from the wrong perspective? In other words, maybe we have the idea of innovation wrong. Instead of fretting that not enough innovation is happening, maybe our smaller steps are actually big leaps in disguise.
Image credit: CC by Tuncay