Even With Some Cons, AI Should Still Be Used to Empower Students in the Classroom
The push to empower students in the digital age has transformed how education is both delivered and received, with artificial intelligence (AI) at the forefront of that shift. AI in the classroom presents an exciting opportunity, but it's not just about using it as a tool; it's equally about understanding how it works, where it excels, and where it falls short. An informed integration of AI can support students in many ways, from simplifying difficult information to combating misinformation.
However, some experts contend that there are times when its absence is preferable. While AI can adapt to unique learning needs and empower students, its potential downsides can't be overlooked. The intersection of cultural relevance and AI could be the key to truly inclusive educational technology.
One expert on this subject is Dr. Karla Badillo-Urquiola, a Clare Boothe Luce Assistant Professor of Computer Science and Engineering at the University of Notre Dame. She has more than a decade of experience in computer science research, and her work offers insight into AI's influence on young students' minds and where the technology can best be leveraged. She said that students using AI to decipher challenging texts is a positive use of the tool, but that it shouldn't even be an option when imagination is the goal. To empower students, she clarified, responsible use of AI is key.
Badillo-Urquiola's Thoughts on Using AI to Empower Students
“My name is Dr. Karla Badillo-Urquiola, and I'm a Clare Boothe Luce Assistant Professor of Computer Science and Engineering at the University of Notre Dame. While I believe AI can and should be used as a tool within classrooms, more importantly, I believe educators should be teaching their students how these technologies work, as well as when and how they should be used. These technologies shouldn't take over the classroom, but they shouldn't be banned either. There are appropriate moments to integrate the technology. For example, students have used ChatGPT to translate difficult texts for comprehension, and teachers have used it in lessons about misinformation or copyright. Yet there are also moments when students should be given purposeful and meaningful opportunities to think and engage with learning without the presence of AI. For instance, when we want students to tap into creative spaces, such as creative writing or language-oriented tasks, these may be moments where we don't want AI to play a huge role or even be present.
“AI is a powerful tool that can adapt to different learners and even serve as a tutoring mechanism for students. AI can help students feel empowered because they can leverage these technologies throughout difficult coursework to reinforce concepts or even use it to provide themselves feedback. However, just like many other powerful technologies, AI also has inherent biases that can cause harm. My research lab has demonstrated a need for integrating cultural values, beliefs, and practices to increase acceptance and usefulness of AI systems across the world. AI systems need to be culturally relevant and beneficial to the communities they serve. And as AI continues to evolve, especially in the educational setting, we must provide teens with the tools necessary to protect themselves while interacting with these machines. In my research lab, we work directly with teens and other community partners to understand how the design and use of technologies impact the lives of people, organizations, and communities.
“In our most recent work, teens shared their frustrations with how ineffective current technology protections are at keeping them safe from online risks. With advancements in AI technologies, these risks become even more complex, so teens want interventions that address them at the root of the problem: shifting the responsibility of risk prevention away from the victim, who typically relies on strategies like blocking to ignore a malicious message or piece of content, and toward solutions that nudge the perpetrator to rethink their harmful actions. Teens want these risks handled before the risks ever reach or affect them. And by focusing education on how these AI technologies work and should be used, I think we can help children, teens, and adults learn how to use these technologies for good and stay safe online.”