Image generated with assistance from DALL-E.

Generative AI’s impact on students of color and diverse students

Let me be transparent here. Much of this Relevance Report entry uses generative AI — specifically ChatGPT.
 
Generative AI is here to stay, and its impact on our daily lives is everywhere. As the use of generative AI grows, so will the debates, pro and con, within institutions of higher education. As many of us already know, AI is all about change, which is not always easy for highly structured institutions where even the slightest modification to curricula is viewed with angst.
 
During a discussion on generative AI hosted by VOICES for AAPIs, a national organization dedicated to supporting Asian Americans and Pacific Islanders in the fields of communications and marketing, panelist Dr. Gain Park, an associate professor in the Department of Journalism and Media Studies, talked about how students and faculty are feeling about AI. According to Dr. Park, students and faculty are both excited about and scared of AI because it is growing so rapidly.
 
For higher education, this rapid change underscores the importance of keeping up with AI and, where possible, staying ahead of it. However, generative AI may pose even greater challenges for historically underserved students, especially college students of color. As we delve further into AI, it is essential to discuss and evaluate the impact it will have on underrepresented communities.
 
Here are some of the challenges and opportunities that generative AI presents to students of color and the broader education ecosystem. While this list is not comprehensive, it is intended to broaden conversations about AI as it relates to students of color and other diverse communities.
 
Bias and Fairness: Generative AI models aggregate information from large datasets that contain biases. As a result, they can generate content that perpetuates negative stereotypes and tropes or fails to adequately represent the genuine voices and experiences of students of color and other diverse population segments. These individuals may encounter AI-generated materials that minimize their perspectives and experiences. Worse, AI-sourced information could be historically inaccurate and rooted in past racist beliefs and views. If students of color and other diverse individuals cannot take part in building and shaping AI models, they may remain marginalized and unable to use generative AI tools to excel in their academic development.
 
Access Disparities: Students need equal access to technology and the internet. Students of color, particularly those from low-income backgrounds, may face barriers to accessing generative AI tools and the educational opportunities they offer, and the digital divide only exacerbates existing inequalities in education. Chris Cathcart, a public relations lecturer at California State University, Northridge, and a VOICES panelist, shared these same concerns using one word: diversity. Cathcart said generative AI tools must be available to everyone, not just to people of means. If access to AI tools and the internet is limited, marginalized communities will be impeded from benefiting from these opportunities.
 
Privacy Concerns: Generative AI can collect and process vast amounts of data, including personal information such as race, gender, net worth, and family history. Students of color and diverse students may be at greater risk of privacy breaches, especially if their personal information is mishandled or misused. Concerns and fears over data security can hinder their engagement with AI-driven educational platforms.
 
Undocumented students are also put at risk by AI-enhanced tools. AI could be used to screen college applicants, revealing their legal status and exposing details about a prospective student that are private to that individual and their family. And while companies are now using AI tools to address and reduce employment bias, those same tools could inadvertently create barriers for diverse applicants seeking to enter the workforce.
 
Personalized Learning: Generative AI can create bespoke educational content tailored to each student's unique needs. Customized learning is particularly beneficial for students who require lessons and assignments that take into consideration their diverse learning needs and preferences. Immigrants and refugees with limited experience attending U.S. colleges could receive AI-enhanced lesson plans that incorporate more visual aids, bilingual text, and a timeline that facilitates learning at a comfortable pace for both the student and the professor. AI tools can also be deployed to detect learning challenges for students who have difficulties with written and oral presentations; once these difficulties are identified, AI can customize a process to help students overcome them. Yet despite these opportunities, Ms. Christie Ly, an adjunct professor at USC with more than 20 years of strategic communications experience, says students cannot rely on generative AI to get by in class; instead, they need to fuel what they are learning with their own human touch.
 
Language Support: For students of color and non-native English speakers, AI-powered translation and language-learning tools can provide invaluable support, making educational content more accessible and understandable. For example, generative AI could create bilingual lesson plans for an English-as-a-second-language student, matched to that student's skill level and paced for optimal learning. AI can also help bridge gaps in personalized education: professors with many non-English-speaking immigrants and refugees in their classes can create culturally relevant curricula that speak to students in a language and tone that accelerate their comprehension and learning.
 
Diversity in Curriculum: AI can assist educators in diversifying classroom materials, incorporating a wider range of voices and perspectives. This helps students of color and other diverse students feel more represented and engaged in their studies. AI can also be used to identify terms in curricula that are demeaning, hurtful, or inappropriate, creating a safer and more inclusive environment for learning.
 
Accessibility Features: Generative AI can create accessible content, benefiting students with disabilities, including those from diverse communities. Such content promotes inclusivity and belonging and ensures that students living with disabilities have new opportunities to attain knowledge on par with peers without apparent or non-apparent disabilities. For example, several AI-enhanced text-to-speech programs can read aloud text from books, articles, and periodicals, helping students on the autism spectrum absorb and visualize content in ways that facilitate learning and understanding. These same tools can be customized with different voices, sounds, delivery speeds, and tones to tailor the learning experience.
 
Reducing Bias: As AI technology evolves, efforts are being made to reduce bias in AI-generated content. Students of color and diverse students stand to benefit from a more inclusive and unbiased educational experience. However, without diverse human input, generative AI models will not learn about the experiences of those marginalized by racism, classism, and other biases. Finding the right AI prompts and inputs is essential to reducing and eventually eliminating unconscious bias in generative AI models and tools.

Conclusion

Generative AI presents challenges and opportunities for students of color and other diverse students. To responsibly utilize its potential to advance education, it is crucial to address ongoing issues of bias, unequal accessibility, and justifiable fears over privacy. By doing so, we can create a more equitable educational landscape where all students, regardless of their background and circumstances, can learn, grow, and thrive.
 
As generative AI continues to evolve, it is our responsibility to stay active in its development and use and to ensure that it remains accessible to everyone who wishes to use its tools to advance their knowledge. Furthermore, it is our responsibility to ensure that access to generative AI is managed wisely and responsibly, so that it is a tool for empowerment and not a source that widens disparities. As Mr. Cathcart shared in the remaining few minutes of the VOICES panel discussion, “AI is a tool, not a toy.” 
 
The future of education, with generative AI as an active partner, holds promise, but it is up to us to ensure that promise is fulfilled inclusively and equitably.

Bill Imada is founder, chairman, and chief connectivity officer of IW Group, a minority-owned and -operated advertising, marketing, and communications agency focused on the growing multicultural markets. Bill is also a trainer and mentor and serves on advisory councils for Cal State Northridge, the University of Florida, and Western Connecticut State University. He is a member of the USC Center for PR board of advisers.

This article was co-authored by ChatGPT.