Discrimination
EdTech allows for intentional and unintentional discrimination.
Recruitment. Certain EdTech platforms share student data with colleges and employers, who are able to target students for recruitment. Investigations have revealed that some recruiters use data pertaining to immutable characteristics—like a child’s race, gender, or socioeconomic status—as a preliminary filter. Rather than use these filters to improve opportunities for disadvantaged groups, recruiters have instead used them to discriminate against those groups.
Surveillance. Some EdTech companies exist for the express purpose of monitoring student behavior. In a March 2022 report, Constant Surveillance: Implications of Around the Clock Student Activity Monitoring, United States Senators Elizabeth Warren and Ed Markey found that such monitoring may have discriminatory effects:
- Student activity monitoring software may be misused for disciplinary purposes and may result in increased contact with law enforcement, including outside of school hours.
- Companies have not taken any steps to determine whether student activity monitoring software disproportionately threatens students from marginalized groups.
- Schools, parents, and communities are not being informed of the use, and potential misuse, of students’ data.
- Student activity monitoring software is likely infringing students’ civil rights, including Title VI of the Civil Rights Act of 1964, which prohibits discrimination based on race, color, and national origin, and Title IX of the Education Amendments of 1972, which prohibits sex discrimination in educational institutions, including discrimination based on sexual orientation and gender identity.
Artificial intelligence. That the advent of AI brings with it the panoply of human prejudices and biases is well documented, and schools’ adoption of AI is no exception. Deploying AI to assess students and make decisions about them will inevitably produce discriminatory outcomes if the algorithms fail to account for students’ nuanced experiences, trapping low-income and minority students in low-achievement tracks where they face worse instruction and reduced expectations.
Further reading:
Takeaways from Our Investigation into Wisconsin’s Racially Inequitable Dropout Algorithm. The Markup (April 2023).
Why Schools Need to Talk About Racial Bias in AI-Powered Technologies. Education Week (April 2022).
The Big Business of Tracking and Profiling Students. The Markup (January 2022).
Predictive policing strategies for children face pushback. NBC News (June 2021).
AI is coming to schools, and if we’re not careful, so will its biases. Brookings Institution (September 2019).