Author: Mohammed Estaiteyeh | Assistant Professor of Digital Pedagogies and Technology Literacies, Faculty of Education, Brock University

One recent study indicates that 78 per cent of Canadian students have used generative AI to help with assignments or study tasks. In China, authorities have even shut down AI apps during nationwide exams to prevent cheating.

The support structures and policies to guide students’ and educators’ responsible use of AI are often insufficient in Canadian schools. In one recent study, Canada ranked 44th out of 47 countries in AI training and literacy, and 28th among 30 advanced economies. Despite growing reliance on these technologies at home and in classrooms, Canada lacks a unified AI literacy strategy in K-12 education.

Without co-ordinated action, this gap threatens to widen existing inequalities and leave both learners and educators vulnerable. Canadian schools need a national AI literacy strategy that provides a framework for teaching students about AI tools and how to use them responsibly.

AI literacy is defined as:

“An individual’s ability to clearly explain how AI technologies work and impact society, as well as to use them in an ethical and responsible manner and to effectively communicate and collaborate with them in any setting.”

  • veee

    Step 1: AI is not your friend

  • @[email protected]

    How about we tackle the environmental issues that AI is causing before talking about using AI “responsibly”?

    Any use of AI in its current form is not responsible or ethical. It’s an environmental disaster and threatens to harm the collective intelligence of every citizen.

    AI is another way to lower the intelligence of the general population and keep them under corporate control.

  • @[email protected]

    People who are pro-AI seem weird and dystopian, and people who are anti-AI seem logical and reasonable, but my employer requires us to use AI, and I’ve even been forced to work on multiple AI projects recently. It seems unavoidable, unfortunately. Honestly, Copilot has given me some of the most useful autocompletions I’ve ever had, especially for tedious things like logging, and I’ve had good luck with ChatGPT helping with other tedious work too, like writing scaffolding and queries. Considering all of that, I’m torn: I’m afraid of the consequences and fallout of AI, but I also find AI/LLMs useful in my day-to-day job, and I’m required to use them anyway.

  • Em Adespoton

    If AI literacy requires an individual to be able to clearly explain how AI technologies work… then even data scientists are AI illiterate.

  • @[email protected]

    For a moment there I thought this was an Onion article; AI propagandists never cease to amaze me!

    • Otter Raft (OP)

      Learning about how AI works and what it is and isn’t good at is a good thing? It will most likely make kids use generative AI less and be more careful about what they use it for.

  • rhvg

    AI is an American hoax to win over China.

    Too bad Canadians fell for it.

    • Otter Raft (OP)

      “AI is an American hoax to win over China.”

      What do you mean?

      • rhvg

        The AI race with China is artificially created to justify private and public money funding AI and turning it into an industry.