Technology to keep looked-after children safe online

Online exploitation of children in care has become a major issue, reports Camilla Pemberton. But technology itself can provide part of the answer

Project: Keeping looked-after children safe online (part of Norfolk’s Computers for Pupils scheme).

Costs: Between £2 and £5 per looked-after child per year.

Numbers: About 400 laptops have been given out to looked-after children in Norfolk so far. The council intends to increase this number.

“What school do you go to, babe? I can pick you up at the gates. You’re sexy. x,” the message said. It was quickly followed by more, each popping up on Carly’s computer screen with increasing urgency: “When babe?”; “Tomorrow???”; “Let me know!!!”; “Keep it a secret!!!”

These short, instant messages were sent to Carly*, a 13-year-old girl who, like many teenagers, enjoyed spending her evenings chatting to friends on social networking sites and internet chatrooms. But this chatroom user was not a friend. When Carly left school the following day, expecting to meet a 14-year-old boy called Joe, she was shocked to find a man in his late twenties waiting for her. She told her foster mother, who banned her from using the internet.

Although Carly was not harmed, the case highlights a growing problem for corporate parents: how to make sure that looked-after children are using the internet safely, without significant restrictions. Carly’s foster mother had been trying to keep her daughter safe, but her refusal to allow Carly to use the internet caused arguments and led the teenager to be increasingly secretive.

This was the dilemma facing Norfolk Council when, in 2008, the authority was given a grant to purchase laptops for looked-after children. “The internet is an important educational tool and we wanted to give our looked-after children regular access to it,” says Sharon Jay, Norfolk’s access to technology project manager. “But the internet has also given abusers and bullies more opportunities to target and pursue their victims. It’s created another sphere where children can be put at risk,” she says.

Recognising a need to monitor the way young people were using their laptops, “without massive restrictions”, Jay and her team developed a partnership with Securus, an online security firm which specialises in e-safety tools that protect children.

At the end of 2008, online filters which monitor internet usage were installed on every laptop given to Norfolk’s looked-after children. The software has been specially designed, Jay says, to recognise an “enormous” range of words, images and phrases which could be cause for concern – from sexually explicit material, to harassment, racial discrimination and sexual grooming.

“The filters are very thorough,” she says. “They pick up phrases like ‘keep this secret’ or ‘don’t tell anyone’, which could be written by abusers, as well as derogatory terms, swear words and oppressive or abusive language. They also pick up ‘text speak’ – where speech is abbreviated, and numbers are used in place of words – and skin tone, which detects images with an inappropriate amount of flesh.”

An in-built scoring system grades trigger images, words and phrases on a scale of severity, sending alerts to staff when something appears concerning. A snapshot of the young person’s screen, along with the date, time and their laptop’s unique IP address – essential in case a young person tries to deny an incident, or has lent their laptop to someone else – is then emailed to Jay’s team. This enables them to contextualise the incident before alerting social workers and carers.
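The article does not publish how Securus implements its scoring, but the idea it describes – weighting trigger phrases by severity and alerting staff above a threshold – can be sketched in a few lines. Everything below is illustrative: the phrase list, weights and threshold are invented for this sketch, not the product’s actual rules.

```python
# Toy severity-scoring filter, sketching the mechanism the article describes.
# All phrases, weights and the threshold are invented for illustration only.

TRIGGER_PHRASES = {
    "keep this secret": 8,      # possible grooming language
    "don't tell anyone": 8,
    "what school do you go to": 6,
    "pick you up": 4,
}

ALERT_THRESHOLD = 8  # hypothetical cut-off above which staff are notified


def score_message(text: str) -> int:
    """Sum the severity weights of any trigger phrases found in the text."""
    lowered = text.lower()
    return sum(weight for phrase, weight in TRIGGER_PHRASES.items()
               if phrase in lowered)


def needs_alert(text: str) -> bool:
    """True if the message's total severity score meets the alert threshold."""
    return score_message(text) >= ALERT_THRESHOLD
```

A real system would be far more sophisticated – handling ‘text speak’, image analysis and context – but the same basic shape (match, weight, threshold, alert) underlies the workflow the article goes on to describe.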

“It’s important for us to review everything because in some cases we can eliminate risk straight away. For instance, we’ve had times where holiday photographs have been picked up by the skin-tone filters because they featured people in swimwear. When we are worried we contact the young person’s social worker, carer or even the police, who can then intervene directly,” she says.

Some of the flagged incidents may still turn out to be innocent, explains Linda Madden, a senior residential worker based at Easthills children’s home in Norfolk. “We’ve had occasions where it’s become clear that something which appeared oppressive or derogatory to staff had actually been said in jest and had not caused any offence. Sometimes it’s a cultural thing, or just the different ways young people speak to each other these days. We work issues through on a case-by-case basis.”

Incidents are managed sensitively, with great care taken to ensure that young people understand the conditions under which they are given a laptop. Each child at Easthills must sign a contract to show that they understand what is considered to be dangerous or inappropriate behaviour, either by them or internet users they come into contact with. The contract also sets out what will happen if they break the rules or if it becomes clear that they could be at risk.

“We explain that the software will only pick things up if they are doing something that they shouldn’t be, or if someone else is and it’s putting them at risk,” says Madden. “It is on this basis that they choose to take the laptop. If they break the rules, the home operates a three-strike warning policy. We can remove their laptop permanently if we feel they are unable to behave appropriately or keep themselves safe. This has happened a few times and the young person has always understood why and said it was fair.”

Whether the young person has deliberately misused their laptop, or has inadvertently been targeted by someone else, staff always approach them to discuss the matter and identify any potential safeguarding issues. “We’ve had occasions where this software has picked up issues we hadn’t known about, such as abuse in a child’s background, an older boyfriend or bullying at school. We discuss everything and can make sure that they get the help they need,” she says.

Madden says the most frequent problem concerns bullying on social networking sites. “Often when we think we have resolved an argument between kids in the house we’ll find they’ve continued it on Facebook, even while they were sitting on the sofa next to each other in full view of staff. Social networking sites have changed the way that young people interact and have created new ways for them to hurt and embarrass each other. It’s important to keep on top of this.”

Is she concerned that the software could be seen as prying? “No. This software gives us the security and peace of mind to allow our looked-after children the freedom they want to browse the internet unsupervised. They don’t have us breathing down their necks and we don’t have to funnel our resources into constant monitoring. But we know we’ll be notified of any problems if we need to be.”

Alison Thomas, cabinet member for children’s services, says: “Technology does come with risks and, like any parent, we need to make sure the children in our care know how to use technology safely and are protected from those who may use it for malicious or criminal reasons.

“The security we have in place…has already helped to protect several young people from situations that could have posed a significant risk and has therefore proved its worth.”

* Not her real name

This article is published in the 13 October 2011 edition of Community Care under the headline “Protecting children in care from the internet’s wild side” 
