When the Americans with Disabilities Act (ADA) went into effect 32 years ago, there was optimism that technology could close the educational gap for students with disabilities and other special needs. The ADA goes far beyond visible disability, promising life-changing protection for neurodiversity.
As neurodivergent people, we know how educational technology can change lives—how word processors, spell checkers, and self-paced learning allow our brains to thrive in ways traditional schooling can’t. But we’re also seeing how emerging technologies threaten to do the opposite, making schools a harsher, less accessible environment.
Today, schools across the country are increasingly turning to technology tools that harm students with invisible disabilities. Crude risk assessment tools mistake neurological differences for threats of harm to ourselves and others. Social media monitoring flags posts about mental health and punishes students, who may then be required to undergo psychological evaluation as part of their personalized learning assessment.
Remote and computerized proctoring programs with biometric monitoring capabilities became a mainstay during the COVID-19 pandemic. These programs flag students for cheating when they look away from the screen or perform other “suspicious” actions. This is a real danger for disabled people. The voices and facial expressions of students with disabilities may differ from the “normal” baseline against which the software compares them—mislabeling their emotions and singling them out for disciplinary action.
In many cases, remote proctoring programs don’t even accommodate disability—denying test takers bathroom breaks, time away from their screens, scratch paper, and dictation software. This exacerbates disabilities, causes stress, and forces test takers to rush through the most important exams of their lives.
This monitoring pushes neurodivergent students into the shadows, stopping them from sharing their feelings, devaluing their mental health, and reducing their willingness to seek help.
“These algorithms crudely decide who’s ‘normal’ and who’s not, punishing students simply because their brains work differently…”
Seeking cognitive assessments and talking openly about mental health should be encouraged as healthy behaviors, not punished. Like many people with learning disabilities, we remember going from therapist to therapist, and from assessment to assessment, desperately trying to find the correct diagnosis. We remember the sting and shame when teachers singled us out for our spelling, our reading, or our inability to sit still.
We are not alone.
More than 20 percent of Americans have a mental illness, and approximately 10 percent have a learning disability. For nearly all of us, neurodifferences are nothing to worry about—but school surveillance technology treats our differences as a threat. Like the shame we felt when teachers singled us out, monitoring technology that targets neurodivergence hurts students.
The algorithms schools use are not magic crystal balls; they are bias in a box. They crudely decide who is “normal” and who is not, punishing students simply because their brains work differently. And the injustice doesn’t end there.
Worse yet, there has been an explosion of biometric policing technology, and the same tools police use in public are making their way into classrooms.
For example, emotion recognition and attention detection software monitors students’ biometric information (such as motor and vocal tics) and compares it against behavioral trend lines considered “normal” or desirable in order to track students’ mood and attention.
Some edtech software already includes this technology. In 2017, a French school introduced the Nestor edtech platform, which is equipped with attention monitoring, into its classrooms. And in April, Zoom considered adding emotion-recognition artificial intelligence (AI) to its platform, which educators rely on heavily for distance learning.
We are no strangers to the harmful effects of remote and computer proctoring.
We rushed through exams because our proctoring software didn’t allow bathroom breaks. We increasingly worry that biometric monitoring software will flag our uncoordinated eye movements, auditory processing habits, fidgeting, and uncontrollable tics as “cheating.” We sat through exams that ran far too long—up to 10 hours a day, two days in a row. And we have chosen not to seek accommodations for important exams, or not to take them at all, because the disability accommodation process is too onerous. In some cases, this has limited our educational choices and reduced our job opportunities.
Thirty-two years later, the full promise of the ADA remains unfulfilled. To make matters worse, civil rights protections appear to be falling only further behind.
Looking to the next few decades, lawmakers and regulators cannot simply rest on their laurels. Those in power have no excuse to ignore these threats, and those who design technology have no excuse to ignore how their tools can harm people with disabilities. We need protections for the age of algorithms—a new set of ADA-style safeguards to protect students from the changing barriers of public life.