
Why You Need to Talk About AI

Image: GPT-4/DALL-E

It’s time to have "the talk." You’ve been avoiding it. It’s a little awkward because you’re not really sure how to go about it. Students are getting anxious for lack of clear information, and they don’t know whom to go to for advice. Some teachers really want to dive right in, but others don’t think it has any place in school. Parents are asking questions about how to navigate this important moment in their children’s education…

It’s time to talk with kids about AI.

Education has three purposes: to prepare students for the workforce, to prepare students for participation in civil and democratic society, and to prepare students for a fulfilling life. AI is already propelling significant changes in each of these areas, and the impact is so great that failing to talk with students about the role of AI in their (and our) lives amounts to professional negligence. As educators and citizens, we failed to prepare young people for social media, so kids and technology companies figured it out on their own, with disastrous results for mental health worldwide. We can’t fail like this again.

Here are three reasons why you need to have conversations with kids about AI, and then several suggestions for how to go about it:

1. AI and Workforce Preparation

This is the most obvious of the three. The workplace is changing. Study after study shows that AI’s capabilities are having a significant impact on who does what kind of work and with what tools. Low-performing employees are seeing huge productivity gains with AI, and high-performing employees are seeing improvement, too. Employees using AI are happier at work. In the schools I’ve been working with, the overwhelming majority of faculty, even if they are opposed to using AI in their classrooms, recognize how important AI literacy will be for their students in the future. Failing to provide foundational AI competency for students fails to prepare them for the workforce.

2. AI and Civic Engagement

Less obvious, but perhaps more important, is that the political and legal landscapes are shifting in the age of AI. On the surface, AI poses risks. Already, it is being misused to deceive citizens in the electoral process. AI-generated robocalls are deepfaking the voices of political candidates to spread misinformation. Political campaigns have already used deepfaked images depicting false representations of other candidates’ actions. Foreign governments have created AI-generated false news reports to influence public perceptions of government. Students, as citizens, must have strong media literacy skills informed by knowledge of AI to be discerning consumers of political media.

Source: New York Times

One level deeper, however, is that legislation about the use and abuse of AI is only just emerging and will grow increasingly complex. If being a responsible citizen means voting with an understanding of different positions on important issues, then part of our role as educators is to prepare students to critically examine different stances. For example: should AI technology be open source, accessible to anyone like library books, or should AI development be confined to highly regulated corporations, just as high-grade military weapons are? Also: who should be responsible for a crime committed using AI, the individual perpetrator, the company that made the AI, or both? While we may not need to study all these questions with students, we nonetheless have a responsibility to help students understand how AI works so they can reach informed conclusions on the political positions that emerge and vote accordingly, as they would on any other issue.

One step even deeper: as artificial intelligence further suffuses society’s inner workings (employers already use it to filter job applications, corporations use it to inform decisions in boardrooms, government offices use it to streamline workflows), what protections need to be in place to keep the biases embedded in AI from resulting in unfair treatment of citizens? Citizens, and the students who will become them, need to be able to make discerning decisions not only at the ballot box but also on juries and in other public contexts about how laws governing AI should be written and interpreted.

Failing to prepare students to critically engage in these civic contexts fails to prepare them for life as good citizens in a digitally saturated world.

3. AI and Personal Fulfillment

Perhaps most important of all — depending on how you see the purpose of school — is that we want to prepare students to live the most personally fulfilling lives possible.  One doesn’t need social media to have a robust social life, and one doesn’t need AI to have a happy and fulfilling personal life.  But intentional use of social media can immeasurably enrich one’s social life, and intentional use of AI can immeasurably enrich one’s personal life. Fulfillment looks different to different people. 

For many readers, the prospect of AI serving an important role in one’s personal fulfillment may seem laughable, but this is not the trend in today’s society. Today’s emerging AI technologies are not only creativity tools and productivity workhorses, but also philosophical and spiritual guides, personal thought partners, and conversation buddies. With AI, anyone can make extraordinary art, develop complex plans for starting a business, plan a trip, try a new recipe, and more. Recent AI tools have provided valuable counseling services, even reducing suicidal ideation. In today’s age, whether through enabling new forms of creativity, providing individualized counseling services, or simply helping achieve a goal or complete a plan, generative AI has become a personal assistant for people everywhere.

Failing to prepare students to engage with AI personally may not harm students in their personal endeavors, but it may keep them from opportunities to pursue their dreams.

How (When?) do we talk with kids about AI?

So when and how do we have these conversations? Cramming a new unit into the curriculum isn’t sustainable. Instead, integrating discussions into what we already teach both adds relevance and ensures learning is recursive. Opportunities for this kind of integration abound. The interdisciplinary nature of artificial intelligence invites discussion in humanities classes about the social and ethical risks of AI models, in STEM classes about the mathematical and technical construction of the software, in any class about how to use the technology for writing, and even in extracurricular activities for understanding the role of artificial intelligence in journalism, entrepreneurship, government, and more.

Included below is a starting point: discussion prompts for humanities classes. 




