At the moment there doesn’t seem to be a day that passes without some story about “harm” to young people as a result of online technologies (in the past week I have seen articles in the mainstream press about Facebook, Snapchat and gaming). And always coupled with these articles are calls for the providers of technology to “do more” to ensure children can go online in a safe manner.
As I write this there are two significant developments coming for policy makers too. Firstly, the part of the Serious Crime Bill that makes sexual communication with a child an offence is finally being enacted into law. Secondly, the House of Lords Select Committee on Communications has published a report calling for digital literacy to be the “fourth pillar” of a child’s education alongside reading, writing and mathematics. More specifically, it states that “no child should leave school without a well-rounded understanding of the digital world”.
I have worked in this area for the best part of 15 years, holding many classes, workshops, assemblies and conversations with children and young people in the South West about their use of digital technology and how it affects them emotionally. In this time I have seen plenty of calls from politicians similar to those from the House of Lords Communications Committee, and I have seen, and been asked to comment upon, countless press articles about how we keep children and young people “safe” online and who is responsible for doing so.
Yet very little, at the grassroots level, seems to change. While the platforms young people use differ (then Bebo, MySpace and MSN; now Instagram, Snapchat and Facebook), the behaviours do not. While legislation continues to struggle to respond to the ever-changing challenge of emerging technologies, schools are told to “do more” in terms of education without the resources, training or space in a timetable to do so, and senior leaders are judged more on academic performance than on the social education of their pupils.
Many politicians call for technology providers to “do more”, as if the providers are the only people who can prevent harm from occurring. However, technology providers can do little aside from offering technical “solutions”. In our rush to keep children safe online we are failing to respect their rights – while filters may prevent access to inappropriate content, they will also block access to sites related to sex education, gender issues and sexual health. Monitoring internet access might provide us with a breakdown of what our children are looking at online, but at what cost to their privacy? And are we really reassured that the only ways our children access online content are via systems we can monitor? As I have learned on many occasions from my conversations with young people, they know about these tools, and they know how to get around them! Perhaps not because they wish to get up to no good, but because they don’t want everyone knowing their business. And is that so unacceptable?