Document Type
Dissertation
Disciplines
1.2 COMPUTER AND INFORMATION SCIENCE, Computer Sciences
Abstract
Gender bias in Natural Language Processing (NLP) models is a non-trivial problem that can perpetuate and amplify existing societal biases. This thesis investigates gender bias in occupation classification and explores the effectiveness of different debiasing methods in reducing the impact of bias in language models' representations. The study employs a data-driven, empirical methodology centred on experimentation and analysis of results. It uses five distinct semantic representations and models of varying complexity to classify individuals' occupations from their biographies.
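As a concrete illustration of the task described in the abstract, the sketch below trains a minimal biography-to-occupation classifier. It is not the thesis code: TF-IDF and logistic regression are assumed stand-ins chosen for readability, and the abstract does not name the five semantic representations or models actually compared.

    # Minimal sketch of occupation classification from biography text.
    # Illustrative only: TF-IDF + logistic regression are assumptions,
    # not the representations or models evaluated in the thesis.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy (biography, occupation) pairs standing in for a real biographies dataset.
    bios = [
        "She has spent ten years treating patients in a busy city hospital.",
        "He argues cases in court and advises clients on contracts.",
        "She designs and maintains large-scale distributed software systems.",
        "He teaches undergraduate mathematics and supervises student research.",
    ]
    occupations = ["physician", "attorney", "software_engineer", "professor"]

    # Represent each biography as a TF-IDF vector, then fit a linear classifier.
    model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    model.fit(bios, occupations)

    print(model.predict(["They write code for cloud services and review pull requests."]))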
Recommended Citation
O'Carroll, J.M. (2023). Exploring Gender Bias in Semantic Representations for Occupational Classification in NLP: Techniques and Mitigation Strategies [Master's dissertation, Technological University Dublin].
Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Publication Details
A dissertation submitted in partial fulfilment of the requirements of Technological University Dublin for the degree of M.Sc. in Computer Science (Data Science), March 2023.