Can you feel it? How to improve the human–computer interface…

“We live in a world of information now, and there is a difference between accessing Big Data and experiencing it.” – David Eagleman, 2015

What is it like to have the pitiful, impoverished nose of a human? What is it like when you take in that feeble little noseful of air? How can you not know that there’s a cat 100 yards away, or that your neighbour was on this very spot six hours ago? Quote: David Eagleman.

About a year ago I was fortunate enough to combine my love of Data with my love of Motorsport. One of the challenges described to me was how difficult it is to present actionable information back to a driver in a way that he or she can react to without being distracted. Imagine driving around a high-speed corner at over 150mph (240kph), trying to find the edge of the grip available to you, while an array of lights flashes on your steering wheel and your race engineer’s commentary streams into your earpiece.

Those of you who have experienced the bedlam of a trading floor will know the scenario all too well. Your desk might have a bank of eight screens (four across by two high), each showing frames of information, perhaps as many as ten per screen. Your keyboard has coloured keys to help you navigate to the right shortcut, and you may have two phones as well (one for each ear). Above your desk, on the floor support, is another bank of screens streaming one of the financial news broadcasters, and on the back wall a ticker shows movements on whichever indices your desk is trading.

In worlds where the latency from data to information, from information to insight, and from insight to action is at a premium, our only solution seems to be to bombard the senses with sights and sounds and hope the brain can somehow cut through the noise.

Imagine you’re sitting in a restaurant, catching up on the day with your loved one. Your phone is in your pocket and you can hear the little ‘ping’ every time an email or text message arrives. Perhaps you have it on silent, and instead a harmless buzz alerts you to the presence of a new distraction. How many of us are guilty of letting our train of thought wander until the worry of what the message might be overcomes our sense of etiquette and our gaze is diverted to our inbox?

And the solution? Perhaps a smart watch, so that you can check your inbound messages less intrusively and stay alert to the present?

I remember clearly the moment when the lunacy of all this struck me. Throughout the entire evolution of the human–computer interface, we have been preoccupied with the visual and audible transmission of information from the machine. Yet our entire body is a sensory grid. Right now, I can feel the touch of the eight or so keys my fingers are resting on, at the same time as I hear the sound of traffic in the distance. My eyes are gazing at the screen, where a small pop-up has appeared to say I have received a new email. I can feel that my belt is a little too tight (a common occurrence, these days) and I’m aware of a mild ache in my shoulder telling me I’ve been slouched at my desk for too long.

At least one technologist is thinking along the same lines I have been pondering. David Eagleman holds a doctorate in neuroscience and is a New York Times best-selling author. His TED talk, given early in 2015, provides a tantalising glimpse into what might one day be possible when technology expands the human sensory interface. And this is why I found his presentation such an inspiration. If you’ve got 18 minutes spare, I’d highly encourage you to watch it:

Imagine a world where information can be fed to our skin in such a way that our brains can perceive it subconsciously, without it distracting us. Imagine a racing driver feeling the temperature of his tyres through pressure applied to his forearm. Imagine a trader feeling a critical announcement from the Asian markets as a small prod on her back. Imagine being able to train your phone not to beep or vibrate indiscriminately, but only when sentiment analysis deems an inbound message to be above a certain seriousness threshold.
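To make that last idea concrete, here is a minimal sketch of such a filter. Everything in it is hypothetical: the seriousness() function is a crude stand-in for a real sentiment or urgency model, and haptic_intensity() pretends to drive a vibration motor. The point is only the shape of the logic: a score compared against a threshold the wearer can tune.

```python
import re
from dataclasses import dataclass

# Hypothetical alarm vocabulary standing in for a trained urgency model.
ALARM_WORDS = {"urgent", "outage", "halted", "margin", "recall"}

@dataclass
class Message:
    sender: str
    body: str

def seriousness(msg: Message) -> float:
    """Score a message from 0 (ignorable) to 1 (drop everything)."""
    words = set(re.findall(r"[a-z]+", msg.body.lower()))
    return min(1.0, len(words & ALARM_WORDS) / 2.0)

def haptic_intensity(score: float, threshold: float = 0.5) -> float:
    """Map a seriousness score to a buzz strength in [0, 1].

    Below the threshold the device stays silent; above it, the buzz
    scales with how far past the threshold the score lands.
    """
    if score < threshold:
        return 0.0
    return (score - threshold) / (1.0 - threshold)

if __name__ == "__main__":
    inbox = [
        Message("newsletter", "Our weekly roundup of reader photos"),
        Message("desk", "Urgent: trading halted in Asia, margin call pending"),
    ]
    for msg in inbox:
        score = seriousness(msg)
        print(f"{msg.sender}: score={score:.2f}, buzz={haptic_intensity(score):.2f}")
```

In this toy version the newsletter produces no buzz at all while the market alert buzzes at full strength; one could imagine raising the threshold at the restaurant table and lowering it back at the desk.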

How much better will our human relationships be once we’re able to build better interfaces between ourselves and our machines?
