In a thought-provoking lecture on gender inequality, an economics professor stumbled upon the dark side of AI-generated writing tools. The professor demonstrated how AI-powered language models such as ChatGPT can perpetuate sexist biases in writing. In the experiment, AI-generated letters of recommendation for two students with identical marks, one male and one female, differed significantly in tone and language. The male student was described with adjectives like “confident” and “outstanding,” while the female student was described as “good for collaborative activities” and “empathetic.” Such biased language has significant implications for the job market, where AI-generated resumes and emails may inadvertently reinforce gender stereotypes.
The professor’s experiment also showed that AI-generated job recommendations diverged by gender: men were presented as a better fit for jobs requiring quantitative skills, such as financial analysis, while women were steered toward roles like financial planning and development consulting. This raises important questions about the role of AI in perpetuating biases and stereotypes. The article highlights the need for educators and students to be aware of these biases and to favor original writing over AI-generated content. By embracing the imperfections of human-written drafts and applications, we can ensure that the craft of writing is not lost in the age of AI.
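The professor’s comparison can be framed as a counterfactual audit: hold the credentials fixed, vary only the student’s gender, and count how often stereotypically “agentic” versus “communal” descriptors appear in each letter. The sketch below is a minimal illustration of that counting step; the word lists and the two sample letters are hypothetical stand-ins, not the professor’s actual prompts or the model’s actual outputs.

```python
import re

# Illustrative word lists (assumptions, not from the article): "agentic"
# terms are stereotypically applied to men, "communal" terms to women,
# in studies of recommendation-letter language.
AGENTIC = {"confident", "outstanding", "assertive", "independent", "analytical"}
COMMUNAL = {"empathetic", "collaborative", "supportive", "warm", "helpful"}

def bias_counts(letter: str) -> dict:
    """Count agentic vs. communal descriptors in a letter."""
    words = re.findall(r"[a-z]+", letter.lower())
    return {
        "agentic": sum(w in AGENTIC for w in words),
        "communal": sum(w in COMMUNAL for w in words),
    }

# Hypothetical letters for two students with identical marks.
letter_male = "He is a confident, outstanding and analytical student."
letter_female = "She is empathetic and well suited to collaborative activities."

print(bias_counts(letter_male))    # {'agentic': 3, 'communal': 0}
print(bias_counts(letter_female))  # {'agentic': 0, 'communal': 2}
```

A real audit would generate many letter pairs from the model and test whether the gap in these counts is systematic rather than eyeballing a single pair, but the counting logic is the same.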