Teaching Artificial Intelligence Literacy: ‘AI Is for Everyone’
Artificial intelligence (AI) is everywhere, but most people have limited understanding of how the technologies work. Researchers are working to bridge that divide.
The Northwestern University Center for Human-Computer Interaction + Design (HCI+D) hosted a virtual panel on April 7 titled “Thought Leaders on AI Education.” The discussion focused on the curricula available to students and teachers at all grade levels and on how adults can also learn about AI systems and tools.
Duri Long, assistant professor of communication studies at the Northwestern School of Communication, moderated the event. Her research focuses on AI literacy and human-AI interaction. Long co-authored a 2020 research paper defining AI literacy, published at the Association for Computing Machinery (ACM) Conference on Human Factors in Computing Systems (CHI).
“Widespread AI literacy has the potential to equip the public with skills needed to collaborate and create with rapidly changing AI technologies,” Long said.
Studies show many people don’t realize when they encounter AI. In December, the Pew Research Center surveyed 11,004 American adults, giving them six examples of common technologies that use AI, such as fitness watches and chatbots. Overall, only 30% of those surveyed knew AI was involved in all six examples.
Researchers recognize that AI education starts early. Professors at several universities have developed programs that teach grade school students about AI. Last year, faculty and educators from the Massachusetts Institute of Technology (MIT) Responsible AI for Social Empowerment and Education (RAISE) initiative started Day of AI, a free K-12 curriculum that includes courses on timely topics like ChatGPT.
“AI is for everyone,” said Cynthia Breazeal, MIT professor of media arts and sciences. “We really want to empower students with the knowledge of skills of how to responsibly use AI in their daily lives. It should be designed responsibly. And of course, kids are our future.”
Another program is AI4K12, an instructor-led initiative that develops national guidelines for AI education in K-12 schools. Illinois is among 15 states that have completed the AI4K12 plan. The guidelines are organized around five big ideas: perception, representation and reasoning, learning, natural interaction, and societal impact.
Christina Gardner-McCune is co-chair of AI4K12’s steering committee. An associate professor in the University of Florida’s Department of Computer and Information Science and Engineering, she researches the integration of computing across middle and high school curricula.
“You really want to make sure students understand that AI can impact the world in both positive and negative ways,” Gardner-McCune said.
AI4K12 is developing a curriculum in Georgia that allows students to explore case studies of AI-related societal issues. For example, students watch videos of people sleeping behind the wheel of self-driving Tesla vehicles.
“They actually get to explore,” Gardner-McCune said. “What are the issues that arise? Who's responsible: the company that made the car or the driver? How autonomous is it? Does it really need a human in it?
“The more they jump into these ethical issues, the more likely they're able to recognize when these issues arise in their everyday lives,” Gardner-McCune added.
Researchers are also working with adults to improve their AI literacy. Several projects at the Carnegie Mellon University (CMU) Co-Augmentation, Learning, and AI (CoALA) Lab involve going into workplaces to study how people work with AI-augmented tools. Ken Holstein, associate professor at CMU’s Human-Computer Interaction Institute, directs the CoALA Lab and also serves on the governance advisory committee for the Northwestern University Center for Advancing Safety of Machine Intelligence (CASMI).
Holstein said his group’s research has found there is often little to no training aimed at helping workers learn how to use AI tools effectively and responsibly. Also, he said AI-based tools are often not designed to solve the right problems.
“To give an example, in K-12, teachers have unique knowledge of their students and their personalities, emotional states, and home situations that typically AI tools do not,” Holstein said. “We found that in practice, oftentimes frontline workers, such as teachers, will attribute knowledge to AI systems that these systems don't actually have.
“These misunderstandings bring dangers, such as the risk of overreliance on AI recommendations,” Holstein continued.
This problem can be mitigated by designing AI systems with explanations that laypeople can understand. For example, Holstein’s team developed a comic strip-based method to show unhoused people how the local government was using an AI system to decide whom to prioritize for housing.
“We found that when properly empowered to do so, our participants were able to provide specific, critical feedback on the AI system’s design,” Holstein said.
Breazeal has found that adults typically learn online, and that they enjoy learning in many of the same ways kids do.
Overall, interest has been high among people seeking to learn more about AI: more than 3,000 teachers from 90 countries are using Day of AI with their students. Breazeal’s next step is to reach out to those teachers for feedback on the program.
“They’re the key to making this work for themselves and their students,” Breazeal said.