
While wearable technologies may still be in their infancy, there’s no denying the age of wearables is here to stay. As enterprises and consumers alike embrace smart glasses and smart watches, the ecosystem of available wearables is growing at a rapid pace.

As the dawn of wearable technology unfolds, it begs the question—how did we get here?

Since it’s always fun to reminisce about old tech, and because we love a good graphic, we’ve charted the lineage of wearables back to their 1970s ancestors—the quartz watch, the television, the analog radio and the microprocessor. In the process we realized the wearables family tree is complex, much like the family trees of today, where divorce, remarriage, adoption and the like must be factored in.

The broad, interrelated trends of miniaturization and portability, wireless communications, power-efficient computing and advanced display technologies have come together to create the smart devices of today. The patriarch of these devices, the smartphone, is now 15 years old, and it has paved the way for many generations of wearables to come.

In 1975 wearables hung around your neck

The path to miniaturization, digitization and convergence

In 1975, in addition to your undoubtedly groovy clothes, your portable electronic devices probably had a strap and hung around your neck. Your camera, your FM radio and, by the late 70s, your Walkman had gotten smaller, cheaper and more personal—creating the foundation for the consumer electronics industry.

The seeds of the digital revolution were also planted during this era, particularly around the wrist. In 1969 Seiko introduced the first quartz wristwatch; later Seiko models added automatic charging via a rotating pendulum inside the case. Casio’s first electronic wristwatch, the Casiotron, followed in 1974, and its later descendants included the Casio Databank CD-40, which could store data such as telephone numbers and provided a calculator. The digital camera first came alive during this time, too: Kodak developed the first digital camera technology in 1975, but dropped the concept, which then lay essentially dormant for the next 20 years.

By the late 90s, miniaturization and digitization were in full swing. Small digital cameras were mainstream. Garmin launched its first portable GPS device in 1990. Bluetooth headsets became available in 2000. In 2001, the iPod changed audio and consumer electronics forever. Ultimately, these personal electronics moved from standalone products, to peripheral attachments, to fully absorbed components of the smartphone—it was simply easier and cheaper to have it all together in a single, truly mobile, converged smart device.

The beast in your living room

Driving the human-technology interface

In the 70s, it wouldn’t have been unusual to see a big TV console set up in your living room: lots of wood, not so much screen. To address the screen-to-bulk ratio, the cathode ray tube (CRT) technology of the time was rapidly giving way to rear-projection big screens.

The advent of the flat-panel liquid crystal display (LCD) was a watershed advance that changed the trajectory not only of TVs but also of computing and, now, wearables. With its low power draw and small form factor, LCD paved the way for full-color, high-definition displays in devices big and small—from 100”+ TVs down to smartphones. Projection optics, meanwhile, continued to become cheaper and smaller, resulting in the picoprojection technology found in smartphones and in head-mounted displays such as smart glasses.

As LCD became the display of choice in portable devices, capacitive touchscreen technology—originally found on computer touchpads—was also integrated into smart handheld devices, with multi-touch capability commonplace today. This combination of vision and touch technology forms the primary basis for the user interfaces of the devices we use today.

The Computing Engine Cranks Up

The economies of scale that make it all possible

As the wristwatch and portable devices evolved, so too did the modern computer, continually getting smaller and more user-friendly. Quickly moving on from the world of DEC minicomputers and the first PC, mobile computing became a reality in 1986 with the birth of the Toshiba T1100—the world’s first mass-market laptop.

Laptops continued to get smaller and more portable, and the earliest personal digital assistants (PDAs) appeared, including models based on the Newton platform that Apple developed from 1987 to 1998. The PalmPilot also emerged from this bloodline when Palm Computing introduced its first PDA in 1996.

The BlackBerry branched off from the PDA with RIM’s very first device, the Inter@ctive Pager 900, a clamshell device that allowed two-way paging. Its initial success led to many generations of BlackBerrys throughout the ’90s, which evolved into the tablets and smartphones we know today.

Through these stages of evolution, the subsystem and chip companies designed new manufacturing techniques that advanced their products—but more importantly, gave them the ability to stamp out units by the millions. This drove down costs, making it possible to put a little compute power into just about every device imaginable.

The invisible fabric 

The marriage of devices and the Internet

Networking is the glue that holds the various technology branches together. Early on, finicky and expensive analog systems were the norm, and the worlds of voice and data were completely separate.

At the beginning of the 80s, the twin forces of wireless communications and packet-switched digital networking unleashed a wave of change. Once Ethernet and TCP/IP became the dominant networking standards, network capacity exploded, with an Internet of devices sharing a common language.

Along with the growth of mobile computing, wireless networking became standardized and commoditized, extending the reach of the Internet not just to servers and PCs but also to portable computing devices—and eventually, with Bluetooth LE and ZigBee, giving rise to the notion of the Internet of Things.

The Birth of the iPhone

Where it all came together, and the starting point for a break-up

The smartphone is where the trends in computing, communications, displays and portable electronics came together. Fifteen years later, it is the generic system—the baseline that the next generation of wearables is building on.

After a long era of technology convergence into a single consumer device, we are entering a period of new innovation, new options and new devices that branch off the smartphone. Some, like watches and cameras, are disaggregating individual capabilities into optimized form factors. Others, especially smart glasses, are taking the smartphone’s guts and adding new capabilities to handle even more complicated and advanced scenarios.

The Next Big Branch

Wearables and IoT – the marriage of humans and machines

In our lifetimes, we’ve seen a rapid evolution in the wearables family—it has brought together four core branches of technology into a new class of device. When we look back on the evolution of wearables in a few years, I predict we will see a new family in the mix: IoT, or the Internet of Things. A connected environment and connected people go hand in hand; wearables and IoT are a natural marriage of mutually reinforcing technologies. We already see this exploding in the enterprise—industrial machines connected to workers with smart glasses, for example. But this is just the beginning of a transformational new chapter in the wearables family tree.
