Future of EdTech Q3 2023

How digital literacy can help us navigate AI and our online behaviours

Learning to code with confidence: Male teacher helping children master programming skills (iStock / Getty Images Plus / jacoblund)

Andy Place

Inspector for Computing, Online Safety and Education Technology, Havering Education Services

Digital literacy can be defined as ‘the skills and knowledge required to be an effective, safe and discerning user of a range of computing systems.’


According to the National Centre for Computing Education’s (NCCE) ‘Digital Literacy Within the Computing Curriculum’ (2021), there are five key areas that need to be developed to prepare learners for life in the digital world.

Five essential components of digital literacy

  1. Practical skills: This is about decision-making and choosing the right technology, software and apps for the task at hand.
  2. Finding information: Using the correct search terms, refining searches and identifying suitable sites.
  3. Critical thinking: Questioning what you find. Is it reliable? Do other sources verify this information? Is it biased? Is it fact or opinion? Who wrote the information, and when was it written?
  4. Online safety skills: This is about understanding the risks and knowing how to stay safe online. What should and shouldn’t we share? How do we report concerns? How do we protect our identity and consider our digital footprint?
  5. Communication and online behaviour: Developing our ability to communicate and collaborate safely, understanding what is legal and illegal (for example, around copyright), and knowing how to behave online and respect others.

What are the external factors that make this challenging?

  • Online phishing scams and cybersecurity attacks are becoming more sophisticated, making fake notifications and emails harder to spot. Younger children tend to be targeted through video streaming sites and gaming platforms, where scams offer them free ‘rewards.’
  • Influencers often convey only one point of view, and computer software can manipulate the way celebrities look, giving them seemingly perfect bodies. Clickbait encourages us to read articles that often contain disinformation and misinformation, so being able to recognise this is becoming more pertinent.
  • The dramatic rise in the use of artificial intelligence (AI) over the past 18 months is making it more important to question what we see. Deepfakes and voice-manipulation technology are making it extremely hard for us to recognise what is real and what is computer-generated.

Possible consequences of AI and how to use it effectively

Social media and streaming platforms recommend future content based on our viewing habits, automatically serving up more of the same. Are we, therefore, getting a balanced viewpoint? AI software can also generate images and complete online assignments, so how easy is it to differentiate between a student’s own work and AI-generated work?

However, it shouldn’t be about combating AI but working in tandem with it. It is here to stay. Our young people are the future and will be using it for jobs that may not even exist yet. Let’s embrace AI in education, but ensure digital literacy skills are at the forefront so that we always question what AI is telling us.
