The Technology Pendulum

Think about music in the ’80s: synth-heavy, with drum machines permeating the airwaves. When a new technology gets introduced, the pendulum starts swinging from its apex. Some adopt it and abuse it, others see using intentionally lower-fi tech as a form of rebellion, and still others just wait. The technology becomes so prevalent that it creates new genres and cultural scenes. Eventually the pendulum swings back, and the technology settles into even more interesting applications in combination with preexisting instrumentation.

Consider the auto-tuned songs of the 2000s. Certain musicians are synonymous with the effect, and others who had no business using the technology tried their hand at it anyway. Why? Just because it existed.

Testing the limits is something humans do well: we find out what works, what doesn’t, and what the technology is actually for. This is the initial downward swing of the pendulum.

Enter the Touch Screen.

Eventually, things settle into their actual uses, or get combined with even newer technology to integrate seamlessly into human lives. But the pendulum starts at a peak, so far in one direction, that anyone who adopts without question is in for a wild ride for a bit.

Sometimes we are told the newest idea is “intuitive” without it really and truly being intuitive. Our behaviors have to change to adopt the technology, and what is intuitive about that? We push buttons and turn dials because we are physical beings. When those functions moved to a screen, it took over a decade for us to even be ready for the format of a touchscreen, let alone be prepared for the use cases.

The iPhone wasn’t the first touchscreen phone. IBM had created one all the way back in 1992, called the Simon. It wasn’t marketed or sold until 1994, and even then, users weren’t ready for it. And it wasn’t just a touchscreen phone, it was smart: “…it was also able to send e-mails, faxes and messages. It also featured very useful applications like calendar, appointment scheduler, calculator, world clock, electronic notepad, address book etc.”*

The iPhone wasn’t even Apple’s first attempt at a touchscreen. The Newton, from 1993, was a massive failure, mostly because no one was ready to give up buttons in favor of a screen that couldn’t even be used to rudely text in movie theaters, and because it had skipped a few iterations of what a mobile device SHOULD do.

Communication does not change based on platform.

We, as humans, intuitively and instinctively communicate with voice, expressions AND non-verbal gestures. In written formats, we distill language into glyphs and characters to tell stories. When communication gets translated into bits and bytes, content on a screen, we still make efforts to add expressions and gestures to fully encapsulate our human emotions. Thus the advent of the emoticon and the emoji. ^_^

But emoji was just a very, very small step toward providing context for the instant, written word. From a software standpoint, the proliferation of apps like Instagram, Snapchat, Facebook Live, and TikTok has given insight into the more natural way humans encode, share, and tell stories.

With the advent of the iPhone 6s and its Live Photos feature, Apple took the context of a photo and bookended a moment with a few more moments. This was one step closer to creating context from a single image through technology. Several generations later, the acceptance of that technology lines up with the hugely popular proliferation of short-form moving images, like GIFs and Boomerangs, and their use in the aforementioned stories and moments across all social platforms. All of it capitalizes on the idea that humans strive to provide context while telling stories.

Moments before a new iPhone 11 announcement, the Live Photo still exists, but what will sell more devices this year? More. Human. Elements.

Context is one thing, but how do we make this flat screen seem more human?

More ≠ more.

I’ll be the first to say I’m not an Apple fanboy, but even when they are not the first to adopt an idea, they tend to wait a little longer for the pendulum to swing back toward the other direction before releasing a product. This lets other companies and apps test the proof of concept for them.

Light.co had an idea to solve blurry photos: create a depth map that lets you see beyond just the flat image. It seems Apple is looking to apply that same thinking to its storytelling repertoire through the most popular feature of its phones, the camera.

A third lens will take the already capable 3D photos and software-driven camera capabilities to a new level, with ramifications yet to be seen. But as with every tech cycle, buckle up, it’s gonna get weird for a bit.

All things considered: communication, while something we all do instinctually, is visual and creative, and it has to be programmed and built into technology to seem more “intuitive” to human users, because we still want to do things in a way that IS human. That takes time to get right, and it eventually reaches a point where it’s commonplace and makes sense for everyone to adopt.

*https://www.spinfold.com/first-touchscreen-phone/