Beware of geeks bearing gimmicks

Why do we deny the obvious and discount trends?

A few weeks ago, I watched the founders of Humane unveil their new type of wearable, the AI Pin, a screenless squircle that makes a big, bold sci-fi bet on being the device that will supersede the smartphone.

My team were less than impressed … “I don’t get it?” … “why would you want this?” … “it’s literally Siri” … “so there’s no screen?” 

Their reaction reminded me of BlackBerry – a recent film that charts the extraordinary rise and spectacular fall of the eponymous smartphone manufacturer. In one dramatised scene, the BlackBerry team are gathered around watching the iPhone unveiling at Apple’s 2007 keynote. Mike Lazaridis (the company’s co-founder) looks at his BlackBerry and says, “Why would anyone want a phone without a keyboard?”

It didn’t end well for BlackBerry. Apple and the iPhone took ever larger bites out of its 45% hold on the cellphone market until there was virtually nothing left.

This all got me thinking: why do we so often dismiss disruption?

Unsurprisingly, Clayton Christensen, author of “The Innovator’s Dilemma” and father of the term “disruptive technology”, has a theory:

New technologies tend to undershoot what people actually want, and as a result are considered fun or niche by the general population.

Or, as Chris Dixon (Partner at a16z) puts it so succinctly in a brilliant article, “the next big thing will start out looking like a toy.” 

It doesn’t help that initial versions of disruptive technologies may seem unconventional or playfully experimental; they look and feel like novelties, so we treat them as such. Remember the early days of the internet? Cartoonish webpages, coloured scrollbars, autoplaying videos and music, hit counters and animated GIFs …

When websites looked like that, it’s hardly surprising that the economist Paul Krugman made his ill-fated Internet prediction:

“By 2005 or so, it will become clear that the Internet’s impact on the economy has been no greater than the fax machine’s.”

We also tend to forget that many disruptive technologies undergo iterative development. The initial versions may lack the functionality or features that make them viable for mainstream use. The first commercially successful portable computer, the Osborne 1, had no on-board battery and had to be plugged into the wall. The first commercially available handheld mobile phone, the Motorola DynaTAC, weighed over a kilogram and was affectionately known as The Brick.

It’s also the case that disruptive technologies often challenge existing norms and paradigms. They may introduce new ways of thinking or doing things that initially seem unconventional – like getting into a Cruise self-driving car or spending your leisure time in the Decentraland metaverse.

It’s why early technology adopters are typically geeks. They don’t buy products because of proven results or an established track record. They buy because something is new and innovative, and because they want to tinker and try it out. Instead of being wary of the lack of social proof, they are actually motivated by it.

Geek (noun) – a person who is knowledgeable about and obsessively interested in a particular subject, especially one that is technical or of specialist or niche interest.

You just have to look at the PC revolution. It was MITS, a small company that produced electronic kits for hobbyists, that made the first commercially successful personal computer – the Altair 8800 (1974). It wasn’t until later, when IBM launched its PC (1981) and Apple introduced the Lisa (1983), that business computing eventually achieved widespread adoption (thanks largely to killer apps like word processors and spreadsheets). Prices fell, performance improved and the real explosion began!

3D printing is a more recent example. What started as a perfect product for at-home DIY enthusiasts is now being used across industries, from aerospace (e.g. Relativity Space 3D prints rockets) to construction (e.g. Rebuild 3D prints buildings), and even the medical industry (e.g. Xilloc 3D prints human bones!).

No discussion about geeks would be complete without mentioning gamers. In the process of writing this article, I came across The Gamer Disposition by John Seely Brown and Douglas Thomas in the Harvard Business Review. They identify five key attributes that gamers embody – three of which are worth highlighting:

  1. They thrive on change.

    Gamers do not simply manage change; they create it, thrive on it, seek it out.

  2. They see learning as fun.

    Gamers convert new knowledge into action and recognise that current successes are resources for solving future problems.

  3. They marinate on the “edge.”

    Gamers seek out and explore the edges in order to discover new insights or useful information.

With character traits like these, it’s unsurprising that gamers also tend to be early adopters of disruptive technologies, whether that’s using cryptocurrency for in-game purchases, NFTs to create, buy and trade characters, or virtual worlds (like Roblox) to connect, share and socialise.

Speaking of virtual worlds, another technology that has been adopted and popularised by gamers is Virtual Reality (VR). Thanks to increasing demand for immersive gaming (and the metaverse), headsets have gone from contraptions that had to hang from a mechanical arm suspended from the ceiling to the Apple Vision Pro, a $3,499 pair of VR ski goggles.

So, it’s not that we necessarily dismiss disruption – it’s that most of us dismiss fringe behaviour and, more often than not, lack the open-mindedness enjoyed by early adopters (the geeks and the gamers). So, the next time you look at something and think “it’ll never catch on”, reject your prejudice and remember your Virgil … “Beware Geeks Bearing Gimmicks”.

Till next month.