From bots to bodies: making machine learning more human
We spoke to Dr Jamie Ward from Goldsmiths, University of London, about his work bringing machine learning to the fields of medicine, social neuroscience and the study of autism.
Whether it’s film recommendations from Netflix or automatic face recognition in our photos on Facebook, machine learning is increasingly intersecting with our daily lives. It is estimated that by 2020, AI and machine learning could create nearly five trillion dollars in additional business revenue worldwide. But what social impact could this technology have?
Machine Learning and Artificial Intelligence is one of seven cutting-edge specialisms on offer with the new Computer Science programme, launched by the University of London and member institution Goldsmiths. These so-called ‘cognitive technologies’ are having a significant impact on business, with spending on such systems predicted to reach more than $75 billion by 2022. However, the potential uses of machine learning stretch far beyond the business world, as we discovered when we met Dr Jamie Ward.
The trouble with wearables, like Fitbits and Apple Watches, is that they can actually make you more self-obsessed. You’re focussed on ‘How many steps have I done? How many calories have I burned?’ I wanted to look at how wearables could be used for a wider benefit.
Dr Ward is a lecturer in machine learning. But, by his own admission, he doesn’t like the way we interact with computers. Much of his early career was spent trying to get away from the intrusion of keyboards and screens and find new and innovative ways to make technology more portable, more convenient and more human.
Describing his early research he said: “I was interested in the idea of wearable computing and the idea that if computers are seen as ‘big brother’ then wearable tech should be like a ‘little brother’ that helps us without getting in the way. The trouble with wearables, like Fitbits and Apple Watches, is that they can actually make you more self-obsessed. You’re so focussed on ‘How many steps have I done? How many calories have I burned?’ that you’re looking increasingly inward and not at the world around you. I wanted to look at how wearables could be used for a wider benefit.
“I began working on a device called AMON, which was a wearable heart monitor designed to record things like your pulse and blood oxygen levels and report them back to a hospital to predict if someone is going to have a heart attack.”
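Dr Ward doesn’t spell out how AMON’s prediction worked, but the general pattern he describes – streaming vital signs from a wearable and flagging worrying readings for clinicians – can be sketched with a simple classifier. The example below is a minimal illustration on synthetic pulse and blood-oxygen data, not the AMON pipeline itself; the distributions, thresholds and alert logic are all invented for illustration.

```python
# Illustrative sketch only: AMON's real pipeline is not public, so this uses
# synthetic pulse / SpO2 readings and an off-the-shelf classifier to show the
# general idea of flagging at-risk vital signs for review by a hospital.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "normal" readings: resting pulse ~70 bpm, SpO2 ~97%
normal = np.column_stack([rng.normal(70, 8, 500), rng.normal(97, 1.2, 500)])
# Synthetic "at-risk" readings: elevated pulse, lower oxygen saturation
at_risk = np.column_stack([rng.normal(110, 15, 500), rng.normal(90, 3, 500)])

X = np.vstack([normal, at_risk])
y = np.array([0] * 500 + [1] * 500)  # 1 = flag for clinical review

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

print("held-out accuracy:", model.score(X_test, y_test))
# A wearable could score each new (pulse, SpO2) sample and send an alert
# to the hospital when the risk probability crosses a threshold.
print("risk for (120 bpm, 88% SpO2):", model.predict_proba([[120, 88]])[0, 1])
```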
You don’t have to be really good at coding and creating new machine learning algorithms – you can learn to use algorithms and apply them to whatever interests you. It’s fantastic to be able to use the tech in a truly interdisciplinary way.
After retraining, Dr Ward spent several years working as an actor. He found that his interest in human behaviour could marry well with his earlier work in machine learning, and that the theatre might provide the ideal setting to bring the two together.
“I’ve been working on a project called ‘Deconstructing the Dream’ with Bloomsbury Theatre and a group of social neuroscientists from UCL. During a performance of A Midsummer Night’s Dream, we put sensors on all the actors and some of the audience members and were able to do the world’s first live brain scan of an actor while they acted.
“I’m interested in how we move, what non-verbal signals we give off and what happens to an audience when they’re watching a play. We found that they move in rhythm with the person they agree with most on stage – they’ll nod in unison. We can use all the data we gathered – from motion capture to eye tracking – to create models of human behaviour and train those models to do something really useful.
"That’s what I think our students should be taking from a course like this. You don’t have to be really good at coding and creating new machine learning algorithms – you can learn to use algorithms and apply them to whatever interests you. It’s fantastic to be able to use the tech in a truly interdisciplinary way."
Using machine learning algorithms and wearable tech we could support teachers, carers or therapists to analyse and get a better understanding of young people with autism.
From voice assistants in our homes to sat navs in our cars predicting traffic on our daily commutes, machine learning is already having a significant impact on how we connect with each other and the world around us. It is thought that by 2020, 85% of customer interactions will be handled by chatbots, and in a recent survey 44% of US consumers said they already prefer chatbots to humans for customer relations.
However, Dr Ward is passionate about the potential for this technology to be used in a much more social way.
“One project that’s particularly close to my heart is some work I’ve been doing with Kelly Hunter’s theatre company, Flute Theatre, who work with children and young people with autism. By putting movement sensors on all the actors and the young people and looking at the patterns of movement between everyone, we saw some really interesting things. Not only were the actors synchronising with each other but some of the children were as well. They were connecting in a non-verbal way.
“In particular there was a young boy who was sitting off to one side and didn’t appear to be watching or engaging with the activity at all. But when we looked at the data we found that in fact he was tapping his hand in perfect time with the actors – so he was engaging, but without the tracker we might have missed that completely.
“Using machine learning algorithms we could develop wearables that can support teachers, carers or therapists to analyse and get a better understanding of the young people they work with – particularly those who are non-verbal.”
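To give a flavour of how a wearable might surface the kind of ‘hidden’ engagement Dr Ward describes – the boy tapping in time with the actors – the sketch below compares the dominant frequency of a synthetic wrist-accelerometer signal with the actors’ beat. The sample rate, beat frequency and signal are hypothetical, and this is not the analysis the Flute Theatre project actually used.

```python
# Hypothetical illustration: comparing the dominant frequency of a child's
# wrist-accelerometer signal with the actors' beat, to surface "hidden"
# engagement like the hand-tapping described above. All signal details invented.
import numpy as np

fs = 50.0                                  # assumed sample rate (Hz)
t = np.arange(0, 30, 1 / fs)               # 30 s of data
rng = np.random.default_rng(2)

actors_beat_hz = 2.0                       # actors moving at ~2 beats per second
child = 0.4 * np.sin(2 * np.pi * actors_beat_hz * t) + rng.standard_normal(t.size)

def dominant_freq(signal, fs):
    """Return the strongest non-DC frequency component of a 1-D signal."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
    return freqs[np.argmax(spectrum)]

tapping_hz = dominant_freq(child, fs)
print(f"child's dominant movement frequency: {tapping_hz:.2f} Hz")
if abs(tapping_hz - actors_beat_hz) < 0.2:
    print("tapping matches the actors' beat - possible engagement flag")
```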
Find out how the BSc in Computer Science could help support your passions.