Machines are starting to read our minds – and letting us read each other’s minds

ABC recently published a very nice compilation of perspectives on the year ahead, titled Do you want the good news or the bad news?, giving readers the choice of reading either ‘Exciting’ or ‘Scary’ perspectives.

Their interview with me on mind-machine interfaces was published under the ‘Exciting’ section:

2017 will be the year computers and humans start talking to each other, like really talking, according to futurist Ross Dawson, who also says there are some surprises in store for humanity as artificial intelligence becomes even more intertwined with our lives.

“The accelerating pace of a number of different technologies, in particular artificial intelligence, greatly facilitates how we can interact with machines,” he says.

“The ability to use our voices to control our machines is obviously not new, but the promise is that even within the coming year, developments in the technology’s capabilities will mean we can start to have far more natural, fluid conversations with machines.”

Dawson says we are going to be interacting with AI in a much more natural way by the end of 2017, but that is just the beginning.

“Some of the more exciting developments are around thought interfaces: using our thoughts to control machines.

“A lot of this is underpinned by new approaches to AI. It is partly driven by increasing processing power, but advances in machine learning and deep learning, which are particular approaches to artificial intelligence, have been absolutely extraordinary in their capabilities, including speech recognition and the ability to create natural responses.

“One thing that has already happened is essentially computer-mediated telepathy.

“It’s not necessarily going to be something all of us are doing by the end of 2017, but we will start to see more and more examples, and further development, of the ability to transfer our thoughts to others just by thinking.

“Essentially these systems use sensors around the brain to detect brain activity, and then use something in proximity to the recipient’s brain to evoke that particular thought.

“We are already able to get a sense of some of the images people are picturing in their minds through external sensors, and it is possible that we can start to evoke some of those images in other people’s minds using external stimulation.”
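To make the sense-decode-stimulate loop Dawson describes concrete, here is a minimal sketch in Python. It is an illustration only: EEGSensor, ThoughtClassifier, and Stimulator are hypothetical stand-ins rather than real device APIs, the ‘thought’ is reduced to single bits, and the decoding is a toy threshold in place of the machine-learning models real brain-computer interfaces rely on.

```python
import random


class EEGSensor:
    """Hypothetical stand-in for a non-invasive EEG headset API."""

    def read_window(self):
        # A real headset would return a window of voltage samples;
        # here we fabricate random values for illustration.
        return [random.gauss(0, 1) for _ in range(256)]


class ThoughtClassifier:
    """Hypothetical decoder mapping EEG features to a binary symbol."""

    def decode(self, samples):
        # Real systems train machine-learning classifiers on EEG
        # features; this toy version just thresholds the mean amplitude.
        return 1 if sum(samples) / len(samples) > 0 else 0


class Stimulator:
    """Hypothetical stand-in for a transcranial stimulation device."""

    def evoke(self, bit):
        # A real device would deliver a stimulus to the recipient's
        # brain; here we only report what would be delivered.
        print(f"Delivering stimulus for bit {bit}")


def transmit_thought(sensor, classifier, stimulator, n_bits=8):
    """Send n_bits from sender to receiver, one EEG window per bit."""
    for _ in range(n_bits):
        window = sensor.read_window()      # sense the sender's brain
        bit = classifier.decode(window)    # decode a symbol
        stimulator.evoke(bit)              # evoke it in the receiver
        # (the network link between the two ends is omitted for brevity)


transmit_thought(EEGSensor(), ThoughtClassifier(), Stimulator())
```

In reported experiments of this kind, the receiving end has used transcranial magnetic stimulation to evoke phosphenes (brief flashes of light), so even a working version conveys only a few bits per session, not rich imagery.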

If this conjures visions of Arnold Schwarzenegger in a leather jacket, you might be surprised to learn Dawson envisages a different future for how humans and machines interact.

“The reality is we will be more emotionally engaged with robots. We have already begun to have emotional relationships with robots — a cute little robot pet, for instance — but as we start to have conversations with them, I think many people will be surprised at how they actually have feelings about what they know is ‘just a program’.

“We are walking into a world where we will have significant emotional ties to some of these technologies.

“Many people may think this is a terrible thing, but that is just the nature of who we are. We are emotional beings, and when we start to have interactions we will become emotionally engaged. We’ve already seen significant use of therapeutic robots in aged care and dementia care, and we’ve also seen, for example, robots that clear landmines in Afghanistan being given names by their handlers — they start to feel an identity with these robots, which are going out to defuse mines.

“We certainly bond emotionally with our pets, and I think to a certain extent we bond with our tools — a mechanic can have a favourite tool — but now that our tools are starting to interact with us, it is simply in the nature of our humanity that we will become emotionally engaged, sometimes deeply.”

Image: Bernard Spragg NZ