Do machines make the right choices for children?
Hello, I'm Andrea Catherwood, and this is The Media Show from BBC Radio 4, in the radio theatre at Broadcasting House in London, where we're at the annual conference of the BBC Blue Room: that's the team at the Corporation whose job it is to test out the new technology viewers and listeners could be using. This year the theme of the conference is artificial intelligence, and you may well have welcomed AI into your life already.
You might have a smart speaker sitting in your kitchen that helps you cook dinner.
You might let the Netflix algorithm decide entirely what you watch in the evening. But when machines are so intelligent that they can make all these decisions for us, who is actually responsible?
That's a big question for this conference, and The Media Show is going to try and answer it in the context of one particular group: do machines make the right choices for young people? Let me introduce you to our panel. Sitting beside me is Anne Longfield; she is the Children's Commissioner for England. Then we have Dr Nejra van Zalk, a lecturer in psychology at Imperial College London; Hannah, a documentary maker who has made films for VICE and the BBC; and Neil Lawrence, the DeepMind Professor of Machine Learning at the University of Cambridge. Welcome to all of you, and welcome to our audience. Neil, let me start by asking you to explain that job title. You are the Professor of Machine Learning: just tell us a little bit about what machine learning is. Machine learning is the principal technology underpinning the current wave of AI.
And often, when I say that, the person I'm speaking to nods off themselves.
I should go on to explain that it's a data-driven technology for making decisions. When I came out this morning, I should have checked the temperature and decided whether or not to wear a coat. I might want to automate that decision, and one way I could make that decision is by observing whether I go out with a coat on or not, checking the temperature each time, and finding out what the temperature threshold is for me putting a coat on. That would be a simple form of machine learning: using data, and often data from humans, to emulate or reconstruct a decision. Now, 'the DeepMind Professor': that indicates who funds you, and that of course is DeepMind, the artificial intelligence company.
That's now owned by the parent company of Google. Yes, DeepMind gave a gift, a benefaction, to the university, so I am under no direct obligation, apart from gratitude, because they funded my position; I don't work with or for DeepMind. But they're very interested, as many others are, in improving the quality of the environment and ecosystem for machine learning around the country, and they also funded another position. The name is so famous that if I tell people my title is the DeepMind Professor of Machine Learning, they stop listening after I've said 'DeepMind' and assume I work for DeepMind. I don't work for DeepMind.
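Neil's coat example describes a complete, if tiny, machine-learning pipeline: collect observations of past decisions, then fit the temperature threshold that best reproduces them. A minimal sketch in Python, purely illustrative, with invented data and function names (nothing here comes from the programme itself):

```python
# A toy version of the coat/temperature example: from past observations
# of (temperature, wore_coat), learn the threshold below which a coat
# goes on. All data here is invented for illustration.

def learn_coat_threshold(observations):
    """Fit a one-feature decision stump: choose the temperature cut
    point that best separates coat days from no-coat days."""
    temps = sorted(t for t, _ in observations)
    # Candidate thresholds: midpoints between consecutive temperatures.
    candidates = [(a + b) / 2 for a, b in zip(temps, temps[1:])]
    best_threshold, best_correct = None, -1
    for threshold in candidates:
        # Rule: wear a coat when the temperature is below the threshold.
        correct = sum((t < threshold) == wore_coat
                      for t, wore_coat in observations)
        if correct > best_correct:
            best_threshold, best_correct = threshold, correct
    return best_threshold

# Hypothetical history: (temperature in degrees C, wore a coat?)
history = [(3, True), (5, True), (8, True), (12, False),
           (15, False), (18, False), (21, False)]

threshold = learn_coat_threshold(history)
print(f"Rule learned: wear a coat below {threshold} degrees")
```

The learned threshold is simply the cut point that best separates past coat days from no-coat days; with more features (wind, rain) the same idea grows into a decision tree, which is one way data from humans can be used to emulate a decision.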
I have many friends who do work there. Hannah, tell us how AI is actually being used by young people; machines are actually making decisions for them. Netflix decides for you: you've got the percentage weighting saying there's a 96% chance that you're going to like this. Yes, it's like 96% related to the content that you've already been engaging with, so there's that aspect of it.
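The percentage match Hannah describes is, in essence, a similarity score between a new title and your viewing history. A hypothetical sketch of that idea follows; the titles, tags and Jaccard-overlap scoring are invented for illustration, and Netflix's real recommender is far richer and not public:

```python
# A hypothetical sketch of a percentage "match" score: rate a new title
# by how much its tags overlap with titles the viewer already watched.
# Titles, tags and the scoring rule are invented assumptions.

def match_score(candidate_tags, watch_history):
    """Average Jaccard overlap between the candidate's tag set and
    each previously watched title, reported as a percentage."""
    overlaps = [len(candidate_tags & tags) / len(candidate_tags | tags)
                for tags in watch_history]
    return round(100 * sum(overlaps) / len(overlaps))

# Invented viewing history: one tag set per watched title.
history = [{"documentary", "true-crime", "uk"},
           {"documentary", "nature"},
           {"true-crime", "drama", "uk"}]

score = match_score({"documentary", "true-crime"}, history)
print(f"{score}% match")
```

The score rises the more a candidate's tags coincide with what you have already watched, which is exactly why such systems keep recommending more of the same, the feedback loop the panel goes on to discuss.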
I tend to go on social media and use it as much as I can. As a black woman in this kind of world, where I'm concerned with race issues and gender issues, I've curated my Twitter feed, for example, to follow certain people, and I've noticed I only get recommended other people who are similar to those I follow. So for example, I follow Black Girls Book Club, and I've been recommended all these different kinds of people similar to the people I tend to follow. But there's also the negative side of it.
For myself, I'm getting married in March (thank you!), and my Instagram explore page is just weddings or weight loss. It's this idea that women who are getting married need to lose weight, and that's where I think the problem comes in. I'm at the older end of young people, but I can imagine someone a lot younger seeing that and thinking: OK, I need to lose the weight, I need to do this, I need to make sure I look this certain way. So I feel like there are pros and cons: you can either sit back and let it decide for you, or you can go out of your way and curate it so that you only see certain things that you want to see. You grew up, presumably, with this technology. Did these algorithms working for you change the nature of your experience as a young person? A hundred percent, because there are times where I go online and, as I said, I only see the stuff that I want to see, so I end up assuming that the world is that way, and then when you go out and engage with other people in your life,
they're not reflected in what I'm seeing. Maybe I have done that in the past too, just choosing friendship groups that reflected exactly who I am, whereas now you don't have to do that: online you're going to get bombarded with a hundred and one different things that you may not even be interested in. Anne Longfield,
you are the Children's Commissioner for England, and your job is obviously to look out for young people and to raise issues with the government that you think are of concern. Just listening to those positives of growing up in the digital age, how concerned are you that decisions young people are making are being outsourced to algorithms? Kids have grown up with machines around them; it's part of their life. For the older generation,
I think the kind of lightbulb moment is when we realise that, for the older generation, you ask what's the real world and what is virtual, whereas if you're growing up with it, it just is the way life is, and that's totally different. So for children now, from their very first weeks, information is being stored. Toys now have data-gathering capabilities in them, and parents will be posting photos of them online: around 1,300 by the age of 13, on average, and children themselves will have posted around 70,000 pieces of information by the age of 18. Then there are the choices. As adults it's difficult enough to see the difference between what you've chosen to look at and what someone else has chosen for you, but if you're nine, with your first smartphone, and you're starting to be bombarded by things that you've shown interest in at one point in your life, it all starts to narrow down. So I'm concerned that we keep an open eye on that, that we find ways to measure and hold to account the technology companies that are feeding that information and owning those algorithms. But also there are a lot of ways they can be used for positive ends.
They can see when things are going to go wrong, all of those things, but they're very sensitive tools that need careful ownership and judgement, and it's yet to be seen where that goes. I'd like to bring in Dr Nejra van Zalk, who is a lecturer in psychology.
I'm looking to you for the actual evidence: what evidence is there around the impact that this technology, this algorithmic decision-making, is having on young people? The answer, at least as far as the academic peer-reviewed literature goes, is that it is very scattered.
We have an array of short-term, cross-sectional studies that do point to relations between, for example, overuse of social platforms, or online bullying, and psychopathology, so potential increases in symptoms of anxiety. The problem is that we don't have the data necessary to actually understand, from an academic viewpoint, what is happening, because the companies hold that data. So we are having to collect this data ourselves, sort of post hoc, to understand what is happening.
A lot of focus is given, for example, to screen time: parental worry about children spending too much time on their phones. But there was a recent review that actually showed that it's not necessarily the screen time per se, although there is a breaking point at which you ask yourself whether five hours a day is a good idea; it's not necessarily the screen but what children do while using their screens, and that is something we haven't really caught up with in the research.
We ask children what they do, but there are now ways of collecting this information through their phones, without having to ask them, and as academics we don't often have that information.
Do we know whether those recommendations that algorithms give us, the 'if you like this, you'll love this' effect, actually shape our choices? We do know, from the economics literature and the literature looking at how efficient advertising can be, that it can be very good: predictions can be very, very close to what you do. My concern is that the companies are now starting to gather data not just about what you do and what you might like, but also about how you feel. So if I understand that you are not feeling particularly well, I might start targeting you with advertisements for antidepressant medication, or alcohol, or whatever. This is beginning to be a worry for me personally, and it is something we don't know enough about from an academic standpoint.
It's a very good point, and I don't think it's just the way we feel. As Nejra pointed out, what our data does is begin to categorise us into particular subgroups, and, as Anne was pointing out, there is this merging of the real and virtual worlds, so it's having consequences not just for our lives in the virtual world but in the real world too. The other challenge is that it's happening very, very rapidly.
We just don't have the data to understand the scale of this effect, but the computers do have the data, the ones we're interacting with, and they are almost understanding our behaviours better than we understand ourselves. I think that last sentence on its own is terrifying: the idea that computers understand us better than we understand ourselves. It's a paradox, because it seems that we are collecting more and more information about ourselves, but we're understanding almost less: we can't predict elections anymore, we can't predict the responses of our behaviour to these technologies, and it's because these systems can move so rapidly, effectively without intervention, in terms of the changes they can make; we just can't gauge the quality or the effect of those changes on our own lives. It's something that I would call the 'Big Brother, Big Mother' problem.
We want to protect our children, and we actually want to protect ourselves, and one of the challenges we face is that our children are thrown into worlds which have dangers in them, and at the same time we have to expose them to some of those dangers. A very similar thing is happening around our data now. We can use this data to do many, many wonderful things for us, to better predict disease, to help us in various ways, but when we do that, that's the idea of a 'Big Mother': being looked after by the computer. But you can't watch over us without watching us, so the other side of that coin is the Big Brother side. I often think of the story of the Garden of Eden, where you live in paradise but you don't have free will, and in some sense you can see what's going on here: the restriction Hannah was talking about, where the environment that is exposed to us is tailored to who we are. Is that a sacrifice we want to make, and is a child in a position to understand and make that decision when they are very young? It can be very daunting, especially if you don't understand it. So at the beginning,
I think I was quite hesitant. You know, when you're asked to give more info, or to sign up to this and to do that, I was a bit hesitant, but after a while you see how much easier life becomes once you give that information. So now that I've been on Netflix for a while, for example, it knows what I like and what I prefer at certain times of the day: in the evening I may be recommended something else over what's recommended in the morning. That's made my life a lot easier; I don't have to trawl through all this kind of information. So there are pros to it, but it can also be very scary: as I was saying, you're putting up all these pictures, and you don't know who's looking, you don't know who's engaging.
Do you see it, then, as a trade-off that you're prepared to make? Clearly there are parts of this that are very worrying, and yet it makes real life a lot easier and more convenient and in some ways improves your life. It's also something to discuss with a young child or young person in your life: there are pros and cons, like everything else in life, and you really just need to decide for yourself whether you are comfortable with giving this information. For example, apps knowing when I'm on my period: I know it sounds very invasive, but then you get advertising that matches that, and you wonder how they know the time of the month. I don't feel comfortable sharing that information, but someone else might, because they feel they're going to be receiving adverts and films and things that go off the back of it, and in a way it can be improving their lives.
They are making decisions about how much information they give. Are you concerned that younger people are actually not as good at making those decisions as perhaps older people would be, simply because they don't have that life experience? They're quicker, perhaps, to give over information that an older person might feel a bit more circumspect about giving. Yes, it's a very interesting question, and I think it's multi-layered. On privacy, I don't think most people understand, not even adults, to be honest, what it is that we are actually giving away when we say yes to something in order to get a free piece of software or a service. But going back to free will: there is a discussion as to whether it even exists, but let's assume that we are proceeding as if we have free will. Do we actually have it if we are in fact being trained by machine algorithms to make choices without actually exploring uncomfortable choices?
We are being moulded into a certain way of thinking because it's safe, it doesn't hurt, it makes us feel nice and it makes life easier. But what will that lead to? One of the major developmental challenges for a child is in fact to become an autonomous person. Are we removing children from learning autonomy on their own by making their world as easy and as simple as possible, because they haven't made those choices themselves? And you mentioned, Anne, this sort of data profiling; this is happening more and more, and it's becoming an educational standard now. We're trying to understand how we can best support children, but in a way, by understanding more about them, we're also removing this whole point of education being riddled with difficult ideas and strange choices. That is sort of disappearing, and that would be one concern.
So decision-making is actually being impaired, including in the classroom? We do want our children to be helped to develop critical thinking, to analyse situations and respond to them as independent adults, independent as they grow towards adulthood; but if they are being coached and trained into this way of accepting and relying on something to make decisions for them, that's going to become more and more difficult. So many children will say to me that if they had known what they were signing up to they'd have made a different decision; if they had known that they had some choices here, they might have exercised them. So there's something about bringing education into the classroom to help them navigate the digital world, and that's something which is yet to catch up. But the other part of this, of course, is that you're still a parent even when your children are online. Parents have a role here to help their children learn how to be independent and resilient online, and that's something we haven't really seen in the way that we need. Maybe I could turn it around a little bit and suggest that thinking about our children's futures is one way we can better understand ourselves and how we react to this.
I don't know how many parents, like myself, realise that when you tell children they need to get off their phone, they say back to you: well, you need to get off your phone. So it isn't just about thinking about children. I believe that children are our future, but this is our future too, and we need to play an active part in it ourselves. Hannah was mentioning personal choice: are we currently empowered to make that choice, or are we being shown an ecosystem where that choice is not apparent? And sometimes this also applies to children.
I think these computer systems, and no one planned this, have evolved to provide us with a cognitive diet which is the equivalent of high-fructose corn syrup drinks. Just as in our evolution, it's something we become addicted to because it isn't normally available, and these algorithms are very good, because they see our data at scale, at finding the cognitive equivalent.
Now, we need to get out of that and have a more balanced diet, but that implies we should not be saying that one size fits all: whatever the one-size-fits-all diet is, saying 'let's ban beef burgers' instead will lead to consequences. Let's empower people to have the choice over the sort of material they want to consume, and that involves giving them back control of their personal data, because that's what's being fed on. I am excited about some of the legislative changes in the GDPR that give us enhanced data rights.
I just think that there's still a big power asymmetry in our ability to analyse data. Something that comes down and says 'let me explain your data rights to you' is probably not going to work well; what we need is institutional bodies acting on our behalf, representing us with the larger companies, to try and ensure that we have this ecosystem of choice, so that people like Hannah can make decisions about what they want to share and how they're being seen by the machines. And whether these companies actually create algorithms that target young people specifically: is that something you've worried about at all? Unfortunately, people have an inflated belief in what these companies are capable of. I don't think they even understand their own systems. I think the targeting of young children is going on, but it's going on for reasons they don't fully understand in their own systems.
Just as we saw post-Cambridge Analytica: Facebook initially came out and said no, nothing happened, the elections weren't affected in this way, and then, as they dug deeper into their own systems, they found that it had happened, because there are humans on the outside who can exploit these things. That is the value-alignment problem: if the incentives of the company are not aligned with the interests of the population, we get into trouble, and if there are people out there who can take these existing systems and manipulate them in dangerous ways, we also get into trouble. Is that a case for heavier regulation? Regulation can cause more harm than good.
We're in a not very nice position at the moment, but most of these large companies are public-facing: they're companies that we know, high-street names such as Google, Facebook and Amazon, and if they were to do such things it would become known, it would become known very quickly, and there would be a backlash of trust against them. The much larger danger is the ecosystem of companies that sits behind the front end, because those companies are purely in the business of managing data, and therefore there isn't the public scrutiny on their activities that we have on some of these large internet names.
I'm not sure I agree entirely with you. I think one of the problems we have today is the advertising model that Facebook and Google primarily rely on, by which they collect data from people in order to target them with potential pathways to consume more. And in that sense,
I was just recently made aware, and I didn't even know this myself, that as I use Google as an adult and type information into the search engine, it learns about what I want, so the information that I'm getting is essentially not neutral: I'm getting what Google thinks I want to hear. I go back to my early PhD days, when I actually had to go into a library and look up information, and I had to read. That wasn't necessarily great, but it was an exposure to information which is very different from now.
What if I want a search engine that really provides me with information? I'm willing to pay for that, but we don't have it right now; we rely on things being free, but we're also being mined for data, so where it says 'free', you're paying with your information. I think we need to move towards models like Netflix, Spotify and Apple Music, where you provide services to people whereby I pay for something, I know what I get, and I have a bigger say, including in the regulation; whereas with Google and other free platforms, we don't have a say.
I personally think this is something we need to move away from; it could be done better, to protect people, and particularly to protect the young. I'm resigned to the fact that this is the reality, and all of these amazing inventions aren't going to go away. So tell people: this is what happens with your data, this is where it goes, and this is how you can stop it. But there should also be a kind of education about the echo chambers that come from being on particular social sites. So for example, when I'm doing research for a new piece that I'm working on, I might start searching, I don't know, right-wing media, and my YouTube starts to suggest only those kinds of topics. Then, if she's on my laptop and she comes across that, it can be a slippery slope; but it's about trying to teach them: OK, this is what can happen, and this is what you can do about it.
I think education really is the only way. As you said, I don't think it's going to stop; it's only going to get bigger, it isn't going to get any smaller. Are you concerned that this is actually having a mental effect on young people's abilities to make decisions? That's a question I wouldn't be able to give you a decisive answer to.
I mean, if you look at the effect social platforms have had on how we interact with each other, for example, these ways of being have changed forever, and in a way all of us are part of a massive social experiment where we really don't know what the outcome is. Now, as a scientist, as a researcher, when I conduct behavioural interventions of any kind I have to apply for ethics approval, I have to undergo a lot of scrutiny about what I'm doing, and I have to test that intervention over a long period of time. All I have to do as a tech company is register my company and the product, and I don't have to worry about what effect it has on people. I run a lab at Imperial on design psychology, where we're working on mental-health-proofing technologies, which requires time, resources and commitment, and I think we have to hold companies responsible, because surely all of us are their customers.
They should have our best interests at heart. So I'm not against the companies; I'm not sure that we can say that they're harming children, but we can definitely see negative tendencies.
There must be ways that we can work around these and move away from just thinking about profit. Anne Longfield, Children's Commissioner: I'm sure that when you listen to what's been said today it raises concerns with you, and I know it's something you think about a lot, but what do we do? What is the best way to tackle this? One of the first issues is transparency, because yes, we know the names of all the big multi-billion-pound companies that we use online, but they're in no way transparent.
I can ask, in law, for data from any public body and I will get that information. If I ask a technology company, they close the door and ignore me, and they have every right to do so at the moment. So they are public-facing, but they're not public-facing in any transparent way. That has to change, and that will only happen through regulation. Thank you very much indeed for that, and thank you to all our guests today.
I'm afraid that's all we've got time for. It's been a pleasure to be here with you all.
Thank you to Anne Longfield, Children's Commissioner for England; Dr Nejra van Zalk of Imperial College London; filmmaker Hannah Adam; and Neil Lawrence, the DeepMind Professor of Machine Learning at the University of Cambridge. Thank you very much indeed. You've been a fantastic audience, and we'll be back at the same time next week on The Media Show.