
Changes to curriculum: Report recommendations for Online Safety Education

Discussion in 'Education news' started by HouseOfCommons, Aug 14, 2019.

  1. HouseOfCommons (New commenter)

    Following changes to the curriculum in 2020, schools will be expected to teach about data and privacy, the internet and wellbeing, and respectful online relationships.

    The Parliamentary Office of Science and Technology (POST) has published its analysis (a POSTnote), which gives an overview of how children use the internet and of the opportunities and risks it presents.

    You can read the full report using the link below:

  2. HouseOfCommons (New commenter)

    For further context, the report provides an overview of current online safety teaching in schools and elsewhere, and of how this will be affected by the changes to the curriculum. It also looks at the role of content filtering and age-verification technologies in improving online safety.

    You can read the key points from the report below:
    • Children use the internet for a wide variety of activities: 59% of 7–16-year-olds report that they go online to watch videos, 56% listen to music, 54% play games, 47% complete homework, 47% message friends or family and 40% use social networking sites.
    • Children’s internet use increases as they get older. As children age, they are also increasingly likely to access the internet from private locations or spaces outside the home. Children under the age of 12 primarily access the internet using a tablet, whereas those aged 12–15 tend to use a mobile phone.
    • The internet presents children with a range of opportunities and risks. Online opportunities include undertaking creative activities, socialising, developing skills, and engaging civically. Risks faced by children online can be grouped into four categories: ‘content’, such as pornography or violent content; ‘contact’ by those who seek to victimise or radicalise children; ‘conduct’, such as cyberbullying or sharing sexual images; and ‘commercial’, which includes risks such as exposure to advertising or online gambling.
    • There is limited evidence on the long-term effects of these risks and opportunities. There is evidence that access to the internet improves educational outcomes. Research on risks has mainly studied associations between risks and adverse outcomes, rather than looking at causation.
    • The ‘Keeping Children Safe in Education’ statutory guidance requires all schools to teach online safety as part of a “broad and balanced curriculum”. In most schools this is delivered through PSHE education, computing classes or through events such as assemblies. Additionally, local authority-maintained schools must teach aspects of online safety (such as safe, secure and responsible use of technology, and how to report concerns about online risks) as part of the computing curriculum.
    • From 2020, all schools in England will be required to cover aspects of online safety as part of newly compulsory subjects: relationships education in all primary schools, relationships and sex education in all secondary schools, and health education in all state-funded primary and secondary schools.
    • Many groups argue that tech companies have a responsibility to design safe online spaces in which children can take full advantage of the internet. An age-appropriate design code is currently under consultation, which sets standards on the design of online services which are likely to be accessed by children. The Government’s Online Harms White Paper also proposes a duty of care for internet companies, and the establishment of an online harms regulator.
    • Technologies that may be used to protect children online include filtering and age-verification. Under the Digital Economy Act 2017, pornography websites will be required to use ‘robust’ age-verification to ensure their users are 18 or over. This will be enforced by the British Board of Film Classification.
