Data use: BBC Own It app
BBC Own It was a free app designed by the BBC to support and advise children as they use their phones to chat and explore the online world without adult supervision. It was an experiment that aimed to give children more control over their personal data.
Proposed benefits of BBC Own It:
- children receive personalised wellbeing advice and support without identifiable data leaving their device
Potential harms of BBC Own It:
- interventions in children’s wellbeing take place without parent/carer knowledge;
- anonymous data may be shared with researchers in ways that are not made clear in the data agreement;
- automated assessments of, and recommendations about, children’s wellbeing are made without parent/carer involvement.
Questions to discuss or think about
- What do you think about BBC Own It?
- Does anything surprise you about how data is being used here?
- How do you feel about the role that algorithms play in BBC Own It?
- How do you feel about who has control over how data is used in BBC Own It?
- How do you feel about how data is shared in BBC Own It?
- Look at the proposed benefits and potential harms of BBC Own It. Which matters more to you: the proposed benefits or the potential harms?
- Do you think BBC Own It is fair? Why/why not?
- How much do you feel like you understand BBC Own It?
What people think about BBC Own It
Some participants were concerned about unequal access to Own It:
“The benefit of the BBC’s Own It in telling me what to do if I’m bullied is only available to people who have phones. So kids who don’t have phones are disadvantaged.”
Teddy, white British, heterosexual man, aged 65+
Others felt that it could help to overcome inequalities:
“A lot of people I know wouldn’t normally have access to that kind of resource. [Pakistani parents sitting at home in the UK] wouldn’t know where to reach out to, because they’ve not been educated in this country, for example, or just don’t know. […] So, for me that’s the fair one, if I was to look at it from that lens.”
Tahira, Pakistani, heterosexual woman, aged 45-54
Some participants were concerned that Own It data could be used in unintended ways:
“What if the research done though, produced by the university, by lovely people like [our LWD researcher], was used by people for the very opposite of what it was originally designed for, to target specific children?”
Louisa, white British, heterosexual woman, aged 35-44
Others were concerned that the algorithmic processing within Own It was not guaranteed to be error-free:
“Coming back to the thing we said about the algorithm and the fact that it’s not human controlled and it could make mistakes. And you could have a child who’s perfectly happy who suddenly gets an alert about bullying, and then freaks out and thinks that something’s wrong and they’ve done something wrong.”
Jill, white British, heterosexual woman, aged 45-54