The Voice of Acceleration

I am not good at board games and I have never been in space. But recently both ideas have felt very appealing.

I went to see the premiere of AlphaGo at the London Film Festival, a captivating documentary about a computer program powered by artificial intelligence, which in 2016 won a landmark five-game match of Go against one of the world’s leading players, South Korea’s Lee Sedol.

The movie takes you on an emotional roller-coaster through the games and gets you to empathize with Sedol, feeling as puzzled as he does as he tries to figure out a counterpart that is not a person and has no human features. Unlike other players, AlphaGo has no body language, no facial expressions you can read, no telltale signs like fidgeting or shifting in a chair.

And that’s what got me intrigued… I wondered what I would do if I had to “read” a non-human counterpart in a negotiation. Undoubtedly this is going to happen in the future (… and not a distant one). Experts are already predicting that we will have to market to bots, so why not conduct negotiations with them?

If you are used to negotiating with people, after a number of years you start spotting patterns of behaviour (like the nervous twitch of a lip before an answer that is not entirely true). These are important signposts you can use to steer the conversation towards the result you want.

But what happens when you are sitting in front of a screen talking to a bot? What does a telltale sign that you are losing look like?

I was thinking that a bot will probably have its own style. One that, after you have talked to it for a while, you will be able to recognise and use to your advantage, for example to gain ground in an argument. But where would that bot get its style from? From the humans who trained it? From the data it processes and learns from? From the reactions of the people it gives answers to?

You can tell I have been thinking a lot about the kind of relationship I will be able to develop with machines in the future.

At the Watson Summit London the other week, I attended an inspiring keynote delivered by Canadian astronaut Chris Hadfield, who had some uplifting insights into technology and the impact it is having on our world.

I loved it when he said that, as technology improves, the size of our global tribe grows. Current political developments around the world seem to be proving the opposite. But maybe it is just a phase and we have to hang in there.

Hadfield also believes that humanity is experiencing an acceleration and that “it will never be this slow again”. We are going to need to interact with computers in a much faster way. Another speaker was talking about voice recognition and mentioned that “keyboards were designed to slow us down”.

That felt like a punch in the stomach….

I have a confession to make… I am still very attached to keyboards and typing. I am not very comfortable using voice to interact with my devices.

I learned to type in high school on an old typewriter. I hated it. The keys were big and clunky. They would bruise my fingers (which I had to hide under a ridiculous sheet of paper, as I was not supposed to look at the keyboard while typing…). I remember telling my father in desperation that I did not want to become a secretary, so what the hell was I learning to type for….

Little did I know that typing would become a skill so important to my early career (journalism) and that I would grow so attached to it. My keyboard feels like a physical extension of my thoughts… Sometimes it is enough to position the tips of my fingers on it for inspiration to start flowing. I feel centred and clear about what I want to write.

How can I translate that feeling to the world of voice technology?

I really need to find an answer to this question as I know I will be saying goodbye to keyboards pretty soon.

 

Views my own.