This article was written by Shayndel Jones and published by WIBW on June 8, 2023.
LAWRENCE, Kan. (WIBW) – A University of Kansas professor contributed to a study reviewing AI’s potential in special education, calling for patience and consideration of its potential uses before the technology is banned.
The University of Kansas said a group of educators that includes a KU professor has published a position paper reviewing AI’s potential in special education. Most importantly, AI should be considered a tool that can potentially benefit students with disabilities, according to KU professor of special education James Basham. Tools such as ChatGPT can quickly turn out writing, and, naturally, some students have used them to avoid schoolwork. According to KU, banning AI tools is not the answer.
Basham commented on the study.
“It’s really been over the last decade or so that we’ve seen AI and machine learning move from just what you might call geek culture to the bigger world. We’ve been studying it, but ChatGPT made it a little more real by making it available to the public. While we think the writing process is complex, AI can do it, quickly and fairly well,” Basham said.
“When you think about people with disabilities in education, you often think about writing. We get referrals all the time for students who can’t or struggle to express themselves in writing. And AI can help with that. So we need to think about what questions we need to ask or issues to think about,” Basham concluded.
In the position paper, KU indicated the authors provided a brief history of Artificial Intelligence (AI) and how it developed to its current state. They then considered ethical questions regarding its use in education and special education and how policy should address the technology’s use. Foremost, schools should not reflexively ban the technology, the authors wrote. Meanwhile, educators, researchers and others need to think about what they want students to learn and how the technology can aid that process. Additionally, teacher educators who are producing future generations of educators need to work with their students to consider how they can effectively address the topic.
According to KU, information literacy is among the main ethical considerations, the authors wrote. Students need to learn how and where to find valid information, how to discern true information from false, and how to think critically and assess topics to avoid misinformation. Educators should also avoid the trap of evaluating skills such as writing too narrowly.
“If we’re only having students do things in one certain way, the AI can probably do that,” Basham said. “But if we’re bringing in multiple concepts and modalities, then it’s a much different conversation. We need to think about who we are as a society and what we teach, especially when we think about students with disabilities, because they are often judged on just one aspect.”
The article, published in the Journal of Special Education Technology, was co-written with Matthew Marino, Eleazar Vasquez and Lisa Dieker, all of the University of Central Florida, and Jose Blackorby of WestEd.
KU noted the authors also urged those in education to consider whether AI is a “cognitive prosthesis” or something more. Just as a student with physical impairments might use speech-to-text to translate their thoughts more efficiently into writing, or a student with a hearing impairment can use a phone app to reduce ambient noise in the classroom, a student with cognitive disabilities could use AI technology to improve their writing.
While technology can help students improve writing and other skills, educators need to consider consent, the authors wrote. All students should be taught what information any AI collects, how it is stored and how it is shared. Parents have a role to play here as well, the authors wrote, in considering whether a school that uses AI is right for their child, whether it complies with an Individualized Education Plan, and whether it can be personalized while remaining respectful of diverse student backgrounds and values.
The authors also noted that AI already exists in schools, as students use laptops, tablets, smartphones and other technologies unavailable to previous generations, yet those tools are not banned from classrooms outright. Similarly, while technologies such as ChatGPT could be used to cheat or reduce student workload, they could also be an effective resource for students with disabilities. Before any such judgments are made, researchers and policymakers should continue to ask questions and ensure that people who represent students with disabilities are at the table, the authors wrote.
“Technology is a societal experiment,” Basham said. “We can use it effectively or ineffectively. But the education system needs to get in front of it and figure out how to use this particular technology to further human betterment. What we need is not to be afraid of change but to think about critical thinking and problem-solving so we are teaching students to do that whether with AI or without. We need to reflect not on today on how it will change our lives, but what it means for the future.”
Copyright 2023 WIBW. All rights reserved.